Set up Gmail logs in BigQuery

This feature is only available with G Suite Enterprise and G Suite Enterprise for Education. Only users with one of these licenses can use this feature. You can assign licenses to individual users, to everyone in an organizational unit, or to everyone in your G Suite account. Learn more about how licensing works.

Before you start

Before you set up Gmail logs with BigQuery, complete these steps in Google Cloud:

  1. Find an active BigQuery project to use, or create a BigQuery project.

    Note: If you don't have a BigQuery account, set up a free trial to try BigQuery.

  2. Go to the IAM & Admin page for your project. 
  3. Add a project editor for your project. The project editor sets up the BigQuery project in the G Suite Admin console.
    • At the top of the page, click ADD.
    • In New members, enter the project editor's user ID.
    • In Select a Role, select Project, then Editor.
    • Optionally, click ADD ANOTHER ROLE to add the same person as the project owner: Select Project, then Owner.
    • Click Save.

Note: VPC service controls must be turned off in the BigQuery project for Gmail to export data to BigQuery. Off is the default setting. 

Assign Gmail logs to a BigQuery dataset

Gmail logs store a record for each stage of a message's journey through the Gmail delivery process.

To analyze how mail flows through the delivery process, assign Gmail logs to a dataset in a BigQuery project. After the Gmail logs are assigned, you can review reports.

Note: Email logs created before you set up Email Logs in BigQuery can't be exported to BigQuery.

  1. Sign in to your Google Admin console.

    Sign in using your administrator account (does not end in @gmail.com).

  2. From the Admin console Home page, go to Apps > G Suite > Gmail > Advanced settings.

    Tip: To see Advanced settings, scroll to the bottom of the Gmail page.

  3. Under General Settings > Setup, go to Email Logs in BigQuery and click Configure.
  4. In the Add setting window, enter a description.
  5. Select the BigQuery project you want to use for Gmail logs. You must have write access to the project.
  6. Enter a name for the dataset where Gmail logs will be stored.
  7. Click Add setting to return to the settings page, then click Save.

    Note: If an error occurs, try clicking Add setting again. You might need to go to the BigQuery console and remove the previously created dataset.

  8. After saving settings, go back to your BigQuery project. A dataset with this information is now in the project: 
    • The standard roles: project owners, project editors, and project viewers
    • Four service accounts that are designated dataset editors: 
      gmail-for-work-logs-writer1@system.gserviceaccount.com: Writes the logs. 
      gmail-for-work-logs-writer2@system.gserviceaccount.com: Writes the logs.
      gmail-for-work-logs-recoverer@system.gserviceaccount.com: Automatically restores the template table if it's accidentally removed. 
      gmail-for-work-logs-schema-updater@system.gserviceaccount.com: Updates the schema in the future.

      Note: Do not remove these service accounts or change their roles. These are required accounts.
  9. To verify that these service accounts are added, point to the new dataset and click the Down arrow next to the dataset name.
  10. Click Share dataset. Daily email logs are now exported to BigQuery. It can take up to an hour for your changes to take effect.
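The service-account check in steps 8 and 9 can also be scripted. The sketch below is only an illustration, not an official tool: it hard-codes the four required accounts listed above and reports any that are missing from a dataset's access list, which you would fetch yourself (for example with the BigQuery client library).

```python
# Minimal sketch: check a dataset's access list for the four required
# Gmail log service accounts. Fetching the real access list is left
# out so the example stays self-contained.

EXPECTED_ACCOUNTS = {
    "gmail-for-work-logs-writer1@system.gserviceaccount.com",
    "gmail-for-work-logs-writer2@system.gserviceaccount.com",
    "gmail-for-work-logs-recoverer@system.gserviceaccount.com",
    "gmail-for-work-logs-schema-updater@system.gserviceaccount.com",
}

def missing_accounts(access_emails):
    """Return the required service accounts absent from an access list."""
    return EXPECTED_ACCOUNTS - set(access_emails)
```

If `missing_accounts` returns an empty set, all four required accounts are present in the dataset's access list.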

daily_ table

After you turn on email logs in BigQuery, a new table named daily_ is added to the dataset. This table is a template that provides the schema for the daily tables. After you create the daily_ template, daily tables are automatically created in your dataset. The logs are then available for use. 

What you should know about the daily_ table:

  • It's always empty and never expires. 
  • Don't remove, modify, rename, or add data to this table.
  • Use GMT time to determine the date change line. 
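Because the date change line is drawn in GMT, a log record lands in the daily table for its GMT date, not your local date. A minimal sketch of that mapping (the `daily_YYYYMMDD` suffix convention is an assumption based on how the daily tables appear in the dataset):

```python
from datetime import datetime, timedelta, timezone

def daily_table_name(ts: datetime) -> str:
    """Return the daily table a timestamp falls into, using GMT/UTC."""
    return "daily_" + ts.astimezone(timezone.utc).strftime("%Y%m%d")

# A message logged at 23:30 New York time (UTC-5) on Jan 1 is already
# Jan 2 in GMT, so it falls into the Jan 2 daily table.
new_york = timezone(timedelta(hours=-5))
jan1_evening = datetime(2020, 1, 1, 23, 30, tzinfo=new_york)
```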

Gmail log queries

Example queries

Try some example queries for Gmail logs in BigQuery. The examples are common use cases for Gmail logs.
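A common pattern in such queries is to scan a range of daily tables with a table wildcard. The sketch below only builds a query string for you to run in the BigQuery console or client; the project and dataset names are placeholders, and the `daily_*` wildcard with `_TABLE_SUFFIX` filtering assumes the standard SQL dialect:

```python
def daily_count_query(project: str, dataset: str, start: str, end: str) -> str:
    """Build a standard SQL query that counts log rows per day between
    two YYYYMMDD table suffixes (inclusive), using a daily_* wildcard."""
    return (
        "SELECT _TABLE_SUFFIX AS day, COUNT(*) AS events\n"
        f"FROM `{project}.{dataset}.daily_*`\n"
        f"WHERE _TABLE_SUFFIX BETWEEN '{start}' AND '{end}'\n"
        "GROUP BY day\n"
        "ORDER BY day"
    )
```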

Custom queries

Compose your own, custom queries using the schema for Gmail logs in BigQuery.

SQL dialects for queries

BigQuery supports two SQL dialects for queries: standard SQL and legacy SQL.

Sandbox expiration

The expiration time for these BigQuery sandbox objects is 60 days:

  • Tables
  • Partitions in partitioned tables
  • Views

You can change the default table expiration time for tables.

If a table expires or is removed, it can be restored within 2 days.
