Set up BigQuery logs in the Admin console

Supported editions for this feature: Enterprise; Enterprise for Education.

To export logs and usage reports to Google BigQuery, you need to set up a BigQuery project in the Google Admin console. The BigQuery Export dataset includes historical data, which can also be retrieved from the Reports API. However, there are a few differences between the data available in the BigQuery Export dataset and the data retrieved from the Reports API (see About BigQuery logs below).

If you turn off the export of data to BigQuery, existing data is kept, but no new data is made available in BigQuery. Existing data is still available from other sources, such as the Reports API.

Note: Not all service report data is available in BigQuery Export. For a list of supported services and examples of queries, go to Example queries for reporting logs in BigQuery.

How data is propagated and retained

  • Policies can take up to an hour to propagate. After that, daily tables are automatically created in your dataset (based on Pacific Time).
  • Data is saved following guidelines for other logs and reports. For details, go to Data retention and lag times.
  • Data tables don’t automatically get deleted. To delete an active project, go to Delete a BigQuery Export configuration.
  • BigQuery Export collects Google Workspace data from the previous day's events, so the resulting tables show data from the previous day up to the export date. For a sample query against a daily table, see the sketch after this list.
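
Once daily tables appear in your dataset, you can query a single day of events. The following is a minimal sketch in BigQuery SQL; my-project and workspace_logs are placeholder names for your own project and dataset, and the activity table with its email column is assumed to follow the published BigQuery Export schema:

    -- Minimal sketch: count one day of activity events per user.
    -- "my-project" and "workspace_logs" are placeholders for your own
    -- project ID and dataset name; the activity table and its columns
    -- are assumed to match the BigQuery Export schema.
    SELECT
      email,
      COUNT(*) AS event_count
    FROM `my-project.workspace_logs.activity`
    WHERE DATE(_PARTITIONTIME) = '2024-01-15'  -- one daily table
    GROUP BY email
    ORDER BY event_count DESC;

For more substantial examples, go to Example queries for reporting logs in BigQuery.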

Set up BigQuery in the Admin console

  1. Sign in to your Google Admin console.

    Sign in using an account with super administrator privileges (does not end in @gmail.com).

  2. From the Admin console Home page, go to Reports.
  3. On the left, scroll down and click BigQuery Export.
  4. To enable BigQuery logs, check the Enable Google Workspace data export to Google BigQuery box. The logs will be available within 48 hours after the setting is turned on.
  5. Under BigQuery project ID, select the project where you want to store the logs. Choose a project that you have write access to. If you don’t see the project, you need to set it up in BigQuery. For details, go to Quickstart using the Cloud Console.
  6. Under New dataset within project, enter the name of the dataset to use for storing the logs in the project. Dataset names must be unique for each project. For details, go to Creating datasets.
  7. Click Save.
    If you can’t save the project, try deleting the new dataset in the BigQuery console and then saving again here.

When the export is triggered, the dataset is created the next day. In addition to project owners, editors, and viewers, the gapps-reports@system.gserviceaccount.com service account is added as an editor. The service account is required to write the logs and update the schema.
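
To confirm that the export is writing data, you can list the tables it has created in the dataset. A minimal sketch in BigQuery SQL, again assuming the placeholder project and dataset names my-project and workspace_logs:

    -- Minimal sketch: list the tables the export has created so far.
    -- "my-project" and "workspace_logs" are placeholder names.
    SELECT table_name, creation_time
    FROM `my-project.workspace_logs.INFORMATION_SCHEMA.TABLES`
    ORDER BY creation_time;

If the dataset exists but no tables appear after the first 48 hours, verify that the service account above still has editor access, since it is what writes the logs.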

About BigQuery logs

The BigQuery Export only includes the unfiltered dataset, whereas you can filter the data retrieved from the Reports API by including various parameters in the API request. You can still filter the data in the BigQuery Export using SQL, but not all Reports API parameters are supported there.

For example, suppose a domain has two organizational units, OU-A and OU-B. You can access all of the events for the entire domain (including OU-A and OU-B) through both the Reports API and the BigQuery Export. You can retrieve only the events for OU-A through the Reports API by including the orgUnitID parameter in the request, but there is no way to filter events by organizational unit using SQL in the BigQuery Export, because the exported tables have no column corresponding to the orgUnitID parameter.
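
By contrast, here is a minimal sketch of the kind of filtering SQL does support against the export, restricting results to specific users through the email column of the assumed activity schema (the project, dataset, and addresses are placeholders):

    -- Minimal sketch: filter exported events by user, since there is
    -- no orgUnitID column to filter by organizational unit.
    -- "my-project", "workspace_logs", and the addresses are placeholders.
    SELECT
      TIMESTAMP_MICROS(time_usec) AS event_time,
      email,
      event_name
    FROM `my-project.workspace_logs.activity`
    WHERE email IN ('alice@example.com', 'bob@example.com')
      AND DATE(_PARTITIONTIME) = '2024-01-15';

If you need per-organizational-unit reporting, retrieve the events from the Reports API with the orgUnitID parameter instead.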
