To export audit logs and usage reports to Google BigQuery, you need to set up a BigQuery Export configuration in the Google Admin console.
About BigQuery and Reports API data
There are a few differences between the data available in the BigQuery dataset and the data retrieved from the Reports API. The BigQuery dataset contains only unfiltered data. You can filter that data with SQL, but not all Reports API parameters have a BigQuery equivalent.
By contrast, you can filter Reports API data by including parameters in the API request.
Example: A domain has two organizational units (OUs), OU-A and OU-B. With both the Reports API and BigQuery, you can access all events for the entire domain (OU-A and OU-B). However:
- With the Reports API, you can retrieve the OU-A events by using the orgUnitID parameter in the API request.
- With SQL and BigQuery, there is no way to filter events by OU, because no column corresponds to the orgUnitID parameter.
- The BigQuery data includes historical data, which can also be retrieved from the Reports API.
- If you turn off exporting Google Workspace data to BigQuery, no new data is added to the BigQuery Export. However, existing data remains available in other sources, such as the Reports API.
- Not all service report data is available in BigQuery Export. For a list of supported services, go to What services does BigQuery Export support? below.
- For examples of queries, go to Example queries for reporting logs in BigQuery.
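Because OU-based filtering isn't available, any filtering of exported data has to happen in SQL over the columns that do exist. The sketch below illustrates that idea using SQLite as a stand-in engine; the table and column names (activity, email, event_name) are simplified placeholders, not the actual BigQuery Export schema.

```python
import sqlite3

# Illustrative only: the real BigQuery activity table has its own schema.
# These columns are stand-ins to show filtering with SQL instead of an
# API request parameter.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE activity (email TEXT, event_name TEXT)")
conn.executemany(
    "INSERT INTO activity VALUES (?, ?)",
    [
        ("alice@example.com", "login_success"),
        ("bob@example.com", "login_failure"),
        ("alice@example.com", "logout"),
    ],
)

# Filter on a column that exists in the table; there is no OU column,
# so an OU filter like the Reports API's orgUnitID is not possible.
rows = conn.execute(
    "SELECT email FROM activity WHERE event_name = 'login_success'"
).fetchall()
print(rows)  # [('alice@example.com',)]
```

The same WHERE-clause approach applies in BigQuery's SQL dialect, limited to the columns the export actually provides.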
How data is propagated and retained
- Policy changes can take about an hour to propagate. After that, daily tables are created automatically in your dataset (dates are based on Pacific Time).
- Data is saved following guidelines for other logs and reports. For details, go to Data retention and lag times.
- Data tables aren’t deleted automatically. To delete an active project, go to Delete a BigQuery Export configuration.
- BigQuery Export collects Google Workspace data from the previous day’s events, so each daily export contains data from the day before the export date.
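The points above can be sketched as a small date calculation: given the time an export runs, the data it covers is the previous calendar day in Pacific Time. This is a hedged sketch of that rule only; the `partition_date` helper is hypothetical and not part of any BigQuery API.

```python
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo

def partition_date(export_time_utc: datetime) -> str:
    """Return the date (YYYYMMDD) of the events a daily export covers.

    Assumption for this sketch, per the retention notes above: each
    export covers the previous calendar day in Pacific Time.
    """
    pacific = export_time_utc.astimezone(ZoneInfo("America/Los_Angeles"))
    return (pacific.date() - timedelta(days=1)).strftime("%Y%m%d")

# An export running at 15:00 UTC on March 2 (07:00 Pacific) covers March 1.
print(partition_date(datetime(2024, 3, 2, 15, 0, tzinfo=ZoneInfo("UTC"))))
# 20240301
```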
Set up BigQuery Export configuration
Before you begin
To set up a BigQuery Export configuration, you first need to set up a BigQuery project in the Cloud console. When creating the project, you add the firstname.lastname@example.org account as an editor, which is required to write logs and update the schema, and is needed for step 5 below.
For instructions, go to Set up a BigQuery project for reporting logs.
1. From the Admin console Home page, go to Reports.
2. On the left, scroll down and click BigQuery Export.
3. Point to the BigQuery Export card and click Edit.
4. To activate BigQuery logs, check the Enable Google Workspace data export to Google BigQuery box. The logs are available within 48 hours after the setting is turned on.
5. Under BigQuery project ID, select the project where you want to store the logs. Choose a project with write access. If you don’t see the project, set it up in BigQuery. For details, go to Quickstart using the Cloud Console.
6. Under New dataset within project, enter the name of the dataset to use for storing the logs in the project. Dataset names must be unique within each project. For details, go to Creating datasets.
7. Click Save.
Note: If you can’t save the configuration, go to the Cloud console, delete the new dataset, and then save the configuration again in the Admin console.
When the export is triggered, the dataset is created the next day. In addition to project owners, editors, and viewers, the email@example.com service account is added as an editor. The service account is required to write logs and update the schema.
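The dataset name you enter above has to satisfy BigQuery's naming rules. As a minimal sketch, the check below validates a candidate dataset ID against the rule documented in BigQuery's "Creating datasets" page (letters, numbers, and underscores, up to 1,024 characters); the `is_valid_dataset_id` helper is illustrative, not part of any Google API.

```python
import re

# Client-side sketch of the BigQuery dataset ID rule: letters, numbers,
# and underscores only, 1 to 1,024 characters.
_DATASET_ID = re.compile(r"[A-Za-z0-9_]{1,1024}")

def is_valid_dataset_id(name: str) -> bool:
    """Return True if `name` is a syntactically valid BigQuery dataset ID."""
    return bool(_DATASET_ID.fullmatch(name))

print(is_valid_dataset_id("workspace_logs"))  # True
print(is_valid_dataset_id("workspace-logs"))  # False: hyphens not allowed
```

A check like this can catch a bad name before you click Save, but the Admin console and BigQuery still enforce the rule (and per-project uniqueness) server-side.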
How do I set a data expiration for my exports?
Can I change a BigQuery project ID?
What services does BigQuery Export support?
- Google Meet
- Apps Script
- Chrome OS
Note: We plan to support more audit logs, including Google Chat and Search.