Set up service log exports to BigQuery

Supported editions for this feature: Frontline Standard; Enterprise Standard and Enterprise Plus; Education Standard and Education Plus; Enterprise Essentials Plus.

To export audit logs and usage reports to Google BigQuery, you need to set up a BigQuery Export configuration in the Google Admin console.

About BigQuery and Reports API data

There are differences between the data available in the BigQuery dataset and the data retrieved from the Reports API. BigQuery contains only the unfiltered dataset. You can still filter the data using SQL, but not all Reports API parameters have a corresponding column.

You can filter the Reports API data by including parameters in the API request.

Example: There are two organizational units (OUs) in a domain, A and B. Using either the Reports API or BigQuery, you can access all events for the entire domain (A and B). However:

  • With the Reports API, you can retrieve only the A events by using the orgUnitID parameter in the API request.
  • With SQL and BigQuery, there is no way to filter events by organizational unit because no column corresponds to the orgUnitID parameter.
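The contrast above can be sketched in Python. This is a hedged illustration that assumes the google-api-python-client Admin SDK Reports service (credentials and service setup not shown); the project, dataset, and SQL column names are placeholders, and only the orgUnitID parameter comes from this article.

```python
# Illustrative sketch only: the Reports API can filter by OU server-side,
# while BigQuery SQL cannot, because no column maps to orgUnitID.

def reports_api_request_params(application_name, org_unit_id=None):
    """Build kwargs for service.activities().list() on the Admin SDK
    Reports API (the service object itself is assumed, not shown)."""
    params = {"userKey": "all", "applicationName": application_name}
    if org_unit_id is not None:
        # Supported by the Reports API: restricts results to one OU.
        params["orgUnitID"] = org_unit_id
    return params

# Closest BigQuery equivalent: an unfiltered scan. Project, dataset, and
# column names below are placeholders, not the real export schema.
BIGQUERY_SQL = """
SELECT email, time_usec
FROM `my_project.my_workspace_dataset.activity`
"""
# There is no WHERE clause that selects only OU A's events: no such
# column exists in the exported table.
```

With these parameters, a call like service.activities().list(**reports_api_request_params("drive", "id:03abc")).execute() would return only that OU's Drive events, while the SQL scan always covers the whole domain.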

Important:

  • The BigQuery data includes historical data, which you can also retrieve from the Reports API.
  • If you turn off exporting Google Workspace data to BigQuery, no new data is included in the BigQuery Export. However, existing data is available in other sources, such as the Reports API.
  • Not all service report data is available in BigQuery Export. For a list of supported services, go to What services does BigQuery Export support? below.
  • For examples of queries, go to Example queries for reporting logs in BigQuery.

How data is propagated and retained

  • Policies can take an hour to propagate. After that, daily tables are automatically created in your dataset (Pacific Time).
  • Data is saved following guidelines for other logs and reports. For details, go to Data retention and lag times.
  • Data tables don’t automatically get deleted. To delete an active project, go to Delete a BigQuery Export configuration.
  • BigQuery Export collects Google Workspace data from the previous day's events, so each daily table contains data from the day before the export date.
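The timing above can be sketched as a small helper. The Pacific-Time daily tables and previous-day coverage are from this article; the YYYYMMDD date format is an assumption for illustration, not a documented naming scheme.

```python
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo  # Python 3.9+

def daily_table_date(export_time_utc: datetime) -> str:
    """Return the calendar day (YYYYMMDD, Pacific Time) whose events an
    export running at export_time_utc covers: the previous day."""
    pacific = export_time_utc.astimezone(ZoneInfo("America/Los_Angeles"))
    return (pacific.date() - timedelta(days=1)).strftime("%Y%m%d")
```

For example, an export at 12:00 UTC on 2024-06-15 (05:00 Pacific) covers events from 2024-06-14.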

Set up BigQuery Export configuration

You first need to set up a BigQuery project in the Google Cloud console. When you create the project:

  • Add a Google Workspace administrator account as the project editor.
  • Add the gapps-reports@system.gserviceaccount.com account as an editor. This is necessary to write logs, update the schema, and to complete step 5 below.

For instructions, go to Set up a BigQuery project for reporting logs.

  1. Sign in to your Google Admin console.

    Sign in using an account with super administrator privileges (does not end in @gmail.com).

  2. In the Admin console, go to Menu > Reporting > BigQuery Export.
  3. Point to the BigQuery Export card and click Edit.
  4. To activate BigQuery logs, check the Enable Google Workspace data export to Google BigQuery box.
  5. Under BigQuery project ID, select the project where you want to store the logs. Choose a project with write access. If you don’t see the project, you need to set it up in BigQuery. For details, go to Quickstart using the Google Cloud console.
  6. Under New dataset within project, enter the name of the dataset to use for storing the logs in the project. Dataset names must be unique for each project. For details, go to Creating datasets.
  7. Click Save.
    Note: If you can’t save the project, go to the Google Cloud console, delete the new dataset, then save it again in the Admin console.

When the export is triggered, the dataset is created the next day. In addition to project owners, editors, and viewers, the gapps-reports@system.gserviceaccount.com service account is added as editor. The service account is required to write logs and update the schema.

Change the BigQuery Export dataset and backfill to the new dataset 

You can change BigQuery Export settings to start exporting data to a different dataset than the current one. You can also move existing data from the previous dataset to the newly selected one (180 days for audit and 450 days for usage). For example, you can change your BigQuery export dataset location from US to EU. Backfill lets you transfer data already stored in the US region to the EU region. If you choose backfill, BigQuery creates a new dataset in the specified location. 

Considerations

  • The existing data is exported to a new dataset if you select a new location, change project ID, or change dataset ID. 
  • The exported data is also retained in the previous dataset. 
  • Any new data, along with the backfilled data, is stored in the new dataset.
  • The existing data (180 days for audit and 450 days for usage) is exported to the new dataset, but not deleted from the existing location, so you’ll have BigQuery Export datasets in 2 locations.
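The backfill windows above (180 days for audit data, 450 days for usage data) can be expressed as a small helper; the function name and structure are illustrative only.

```python
from datetime import date, timedelta

# Backfill windows from this article: 180 days of audit (activity) data,
# 450 days of usage data.
BACKFILL_DAYS = {"activity": 180, "usage": 450}

def earliest_backfilled_date(kind: str, today: date) -> date:
    """Earliest calendar day a backfill started today would cover."""
    return today - timedelta(days=BACKFILL_DAYS[kind])
```

For example, a backfill started on 2024-06-30 reaches back to 2024-01-02 for audit data and 2023-04-07 for usage data.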

To backfill existing data to the new location:

  1. Sign in to your Google Admin console.

    Sign in using an account with super administrator privileges (does not end in @gmail.com).

  2. In the Admin console, go to Menu > Reporting > BigQuery Export.
  3. Enter the new dataset name.
  4. If you would like to backfill existing data to the new location, check the Backfill existing BigQuery data from the last 180 days to new dataset box.
  5. Click Confirm > Save.

Audit log export requirements

Audit logs are exported through the insertAll API, which requires billing to be enabled for your BigQuery export project. If billing is not enabled, your project is in sandbox mode, and audit logs aren’t exported to your dataset. For more details, go to Limitations.

Note: Usage report exports are still enabled for sandbox-mode projects.

Lag times

In most cases, after you turn on data export to Google BigQuery, activity log events are available within 10 minutes. On initial configuration, usage log events have a delay of 48 hours; afterwards, the usual lag is 1–3 days. For details, go to Data retention and lag times.

FAQ

How do I set a data expiration for my exports?

By default, the expiration for data exports is set to 60 days. Therefore, any BigQuery data export that you perform is deleted from Google Cloud after 60 days.
To change the expiration time, go to Updating default table expiration times.
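As a hedged sketch of what changing the default table expiration involves, the snippet below uses the google-cloud-bigquery client library; the dataset ID is a placeholder, and the update call requires valid credentials with access to the dataset.

```python
def days_to_ms(days: int) -> int:
    """BigQuery stores default table expiration in milliseconds."""
    return days * 24 * 60 * 60 * 1000

def set_default_expiration(dataset_id: str, days: int) -> None:
    # Requires google-cloud-bigquery and credentials with dataset access.
    from google.cloud import bigquery

    client = bigquery.Client()
    dataset = client.get_dataset(dataset_id)  # e.g. "my_project.my_dataset"
    dataset.default_table_expiration_ms = days_to_ms(days)
    client.update_dataset(dataset, ["default_table_expiration_ms"])
```

The 60-day default corresponds to days_to_ms(60), i.e. 5,184,000,000 ms. Note that a dataset's default expiration applies to newly created tables; raising it does not extend tables that already exist.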

Can I change a BigQuery project ID? 

Yes, you can change the project ID for a BigQuery Export configuration at any time. Changes take effect the next day, when data is copied to the new BigQuery project.
Important: We don’t recommend changing the BigQuery project, because previous data is not copied into the new table. To access previous data, access the previous project. 

What services does BigQuery Export support?

The following log data is supported:
  • Accounts
  • Admin
  • Calendar
  • Chrome
  • Classroom
  • Currents
  • DataStudio
  • Devices
  • Drive
  • Gmail
  • Google Chat
  • Google Meet
  • Groups
  • Login
  • Rules
  • SAML
  • Token
The following usage reports are supported:
  • Accounts
  • App Maker
  • Apps Script
  • Calendar
  • Chrome OS
  • Classroom
  • Currents
  • Devices
  • Docs
  • Drive
  • Gmail
  • Google Search
  • Meet
  • Sites
  • Voice

Note: We plan to support more audit logs, including Search.

Is there a cost to export audit logs to BigQuery?

Yes, there is a cost to export audit logs to BigQuery, because Google uses the insertAll API to stream logs in real time. For more information, go to Data ingestion pricing.
 
There is no cost to export usage reports, such as Devices or Meet reports.
