Set up Gmail logs in BigQuery

Supported editions for this feature: Enterprise; Education Standard and Education Plus.

Gmail logs store a record for each stage of a message's path through the Gmail delivery process. To analyze how messages flow through delivery, assign Gmail logs to a dataset in a BigQuery project. After the logs are assigned, you can review reports.

Note: Email logs created before you set up Email Logs in BigQuery can't be exported to BigQuery.

Assign Gmail logs to a BigQuery dataset

  1. Sign in to your Google Admin console.

Sign in using your administrator account (does not end in @gmail.com).

  2. From the Admin console home page, go to Apps > Google Workspace > Settings for Gmail > Setup > Email Logs in BigQuery.

  3. Click Configure.
  4. In the Add setting window, enter a description.
  5. Select the BigQuery project you want to use for Gmail logs. You must have write access to the project.
  6. Enter a dataset name where Gmail logs are stored. 
  7. Click Add setting to return to the settings page, then click Save.

    Note: If an error occurs, try clicking Add setting again. You might need to go to the BigQuery console and remove the previously created dataset.

  8. After saving settings, go back to your BigQuery project. A dataset with this information is now in the project: 
    • The standard roles: project owners, project editors, and project viewers
    • Four service accounts that are designated dataset editors:
      • Two accounts that write the logs
      • One account that automatically restores the template table if it's accidentally removed
      • One account that updates the schema in the future

      Note: Do not remove these service accounts or change their roles. These are required accounts.
  9. To verify these service accounts are added, point to the new dataset and click the Down arrow next to the dataset name.
  10. Click Share dataset. Daily email logs are now exported to BigQuery. It can take up to 24 hours for your changes to take effect.
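As a sanity check after step 9, you can also inspect the dataset's access entries from the command line with `bq show --format=prettyjson project:dataset`; the four service accounts should appear as WRITER entries. A minimal sketch of that check in Python, using a hypothetical excerpt of the JSON output (the service-account address shown is a placeholder, not a real account):

```python
import json

# Hypothetical excerpt of `bq show --format=prettyjson my-project:gmail_logs`.
# The real output lists the four Gmail log service accounts as WRITER entries.
dataset_json = """
{
  "access": [
    {"role": "OWNER", "specialGroup": "projectOwners"},
    {"role": "WRITER", "specialGroup": "projectWriters"},
    {"role": "READER", "specialGroup": "projectReaders"},
    {"role": "WRITER", "userByEmail": "example-sa@system.gserviceaccount.com"}
  ]
}
"""

def writer_service_accounts(dataset):
    """Return the per-user WRITER entries (the designated dataset editors)."""
    return [
        entry["userByEmail"]
        for entry in dataset["access"]
        if entry.get("role") == "WRITER" and "userByEmail" in entry
    ]

print(writer_service_accounts(json.loads(dataset_json)))
```

If any of the required service accounts is missing from this list, the export will not work, which is why the note in step 8 warns against removing them.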

daily_ table

After you turn on email logs in BigQuery, a new table named daily_ is added to the dataset. This table is a template that provides the schema for the daily tables. After you create the daily_ template, daily tables are automatically created in your dataset. The logs are then available for use. 

What you should know about the daily_ table:

  • It's always empty and never expires. 
  • Don't remove, modify, rename, or add data to this table.
  • It's a date-partitioned table. Actual data is written to a table named daily_YYYYMMDD, based on the GMT time when an event occurs.
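Because the daily tables are named by the GMT date of the event, you can derive the table an event lands in directly from its timestamp. A small sketch (the `daily_YYYYMMDD` format comes from the article; the function itself is illustrative):

```python
from datetime import datetime, timezone

def daily_table_name(event_time):
    """Return the daily_YYYYMMDD table an event is written to, based on GMT (UTC)."""
    return event_time.astimezone(timezone.utc).strftime("daily_%Y%m%d")

# An event late on March 15 GMT is written to that day's table:
print(daily_table_name(datetime(2024, 3, 15, 23, 30, tzinfo=timezone.utc)))
# daily_20240315
```

Note that for events logged in a local time zone, the table is still chosen by the GMT date, so an evening event can land in the "next day's" table.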

Gmail log queries

Example queries

Try some example queries for Gmail logs in BigQuery. The examples are common use cases for Gmail logs.
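For instance, a common use case is counting log events per day across a date range. A hedged sketch of such a query, built in Python as a standard SQL string — the project and dataset names are placeholders, and `_TABLE_SUFFIX` is BigQuery's filter for wildcard tables like `daily_*`:

```python
def messages_per_day_query(project, dataset, start, end):
    """Build a standard SQL query counting log events per daily_ table.

    start/end are YYYYMMDD strings matching the daily table suffixes.
    """
    return f"""
SELECT _TABLE_SUFFIX AS day, COUNT(*) AS events
FROM `{project}.{dataset}.daily_*`
WHERE _TABLE_SUFFIX BETWEEN '{start}' AND '{end}'
GROUP BY day
ORDER BY day
""".strip()

print(messages_per_day_query("my-project", "gmail_logs", "20240301", "20240307"))
```

Filtering on `_TABLE_SUFFIX` limits the query to the selected daily tables, which keeps scanned bytes (and cost) down compared with querying every table in the dataset.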

Custom queries

Compose your own custom queries using the schema for Gmail logs in BigQuery.

SQL dialects for queries

BigQuery supports two SQL dialects for queries: standard SQL and legacy SQL.
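The two dialects (standard SQL and legacy SQL) differ most visibly in how they reference tables. A minimal illustration with placeholder project and dataset names:

```python
# Standard SQL quotes the full table path with backticks and dots:
standard_sql = "SELECT COUNT(*) FROM `my-project.gmail_logs.daily_20240315`"

# Legacy SQL uses square brackets, with a colon after the project ID:
legacy_sql = "SELECT COUNT(*) FROM [my-project:gmail_logs.daily_20240315]"

print(standard_sql)
print(legacy_sql)
```

Standard SQL is generally the better choice for new queries; features such as wildcard tables with `_TABLE_SUFFIX` are a standard SQL construct.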

Data might be truncated for some fields

BigQuery has a maximum row size limit of 1 MB. For this reason, some fields are truncated to keep each log shorter than 1 MB minus 1 KB, so that it can be inserted successfully into BigQuery. The 1 KB is intentionally left as a buffer.

The following fields might be truncated if the log is too long, or if the number of triggered rules (triggered_rule_info) in the log is too large:


For more information, see Schema for Gmail logs in BigQuery.
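The budget described above (the 1 MB row limit minus a 1 KB buffer) can be sketched as follows. The function and constant names are illustrative, not the exporter's actual code:

```python
MAX_ROW_BYTES = 1024 * 1024  # BigQuery's maximum row size (1 MB)
BUFFER_BYTES = 1024          # intentionally left as a safety margin (1 KB)

def truncate_to_fit(field, other_bytes=0):
    """Trim one oversized field so the encoded row stays within the budget.

    other_bytes is the size of the rest of the row, already encoded.
    """
    budget = MAX_ROW_BYTES - BUFFER_BYTES - other_bytes
    data = field.encode("utf-8")
    if len(data) <= budget:
        return field
    # Drop any partial multi-byte character left at the cut point.
    return data[:budget].decode("utf-8", errors="ignore")

print(len(truncate_to_fit("x" * (2 * 1024 * 1024)).encode("utf-8")))
# 1047552  (1 MB - 1 KB)
```

Decoding with `errors="ignore"` after the byte-level cut avoids emitting an invalid, half-truncated UTF-8 character in the stored log.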

Related information

Streaming inserts

Sandbox expiration

The expiration time for these BigQuery sandbox objects is 60 days:

  • Tables
  • Partitions in partitioned tables
  • Views

You can change the default table expiration time for tables.

If a table expires or is removed, it can be restored within 2 days.
