Set up BigQuery Export

 


Step 1: Create a Google-APIs-Console project and enable BigQuery

 

  1. Log in to the Google APIs Console.
  2. Create a Google APIs Console project or select an existing project.
  3. Navigate to the APIs table.

    Open the Navigation menu in the top-left corner, click APIs & Services, then click Library.
  4. Activate BigQuery.

    Under Google Cloud APIs, click BigQuery API. On the following page, click Enable.
  5. Verify that you've added a service account to your Cloud project.

    Verify that firebase-measurement@system.gserviceaccount.com has been added as a member of the project and granted the basic role of Editor. Add the account if necessary.
  6. If prompted, review and agree to the Terms of Service.
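
The service-account check in step 5 can be sketched in code. This is a minimal illustration only: it assumes an IAM policy in the JSON shape returned by `gcloud projects get-iam-policy` (a list of role-to-members bindings); the function name and the sample policy are made up for the example.

```python
# Assumed shape of the policy dict: the JSON that
# `gcloud projects get-iam-policy PROJECT --format=json` returns.
ANALYTICS_SA = "serviceAccount:firebase-measurement@system.gserviceaccount.com"

def has_editor_role(policy: dict) -> bool:
    """Return True if the Analytics service account holds roles/editor."""
    for binding in policy.get("bindings", []):
        if binding.get("role") == "roles/editor" and ANALYTICS_SA in binding.get("members", []):
            return True
    return False

# Example policy with the account correctly added:
policy = {
    "bindings": [
        {"role": "roles/editor", "members": [ANALYTICS_SA, "user:admin@example.com"]},
    ]
}
print(has_editor_role(policy))  # True
```

If this returns False for your project's policy, add the account as described in step 5 before continuing.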

Step 2: Prepare your project for BigQuery Export

You can export Google Analytics data to the BigQuery sandbox free of charge (Sandbox limits apply).

Learn more about upgrading from the Sandbox and BigQuery pricing.

Step 3: Link BigQuery to a Google Analytics 4 property

After you complete the first two steps, you can enable BigQuery Export from Analytics Admin.

  1. Sign in to Google Analytics. Use an email address that has Owner access to the BigQuery project and the Editor role for the Analytics property that includes the data stream you want to link.
  2. Click Admin, and navigate to the Analytics property that you want to link.
  3. In the PROPERTY column, click BigQuery Linking.
  4. Click Link.
  5. Click Choose a BigQuery project to display a list of projects for which you have at least read permission.

    If you have linked Analytics and Firebase (or plan to), consider exporting to the same Cloud project, which will facilitate easier joins with other Firebase data.
  6. Select a project from the list, then click Confirm.
  7. Select a location for the data. (If your project already has a dataset for the Analytics property, you can't configure this option.)
    If you choose the wrong region and need to change it after you've created the link, you have to either move the dataset in Cloud and then create a new link, or delete the link and the dataset and start over. With either method, you'll have a gap in your data: streaming and daily exports will not process between deletion of the existing link and creation of the new one.
  8. Click Next.
  9. Select the data streams whose data you want to export.
    Select Include advertising identifiers for mobile app streams if you want to include advertising identifiers.
  10. Select a Daily (once a day) export, a Streaming (continuous) export, or both.
  11. Click Next.
  12. Review your settings, then click Submit.
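
Once the link is created, Analytics exports into a dataset named after the property (analytics_ followed by the property ID) inside the project you chose. A small sketch of that naming convention; the project and property IDs here are made up for illustration:

```python
def export_dataset(project_id: str, property_id: int) -> str:
    """Fully qualified dataset created by the link: <project>.analytics_<property ID>."""
    return f"{project_id}.analytics_{property_id}"

# Hypothetical project and property IDs:
print(export_dataset("my-cloud-project", 123456789))
# my-cloud-project.analytics_123456789
```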

Pricing and billing

BigQuery charges for usage with two pricing components: storage and query processing. You can review the pricing table and learn about the differences between interactive and batch queries.
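
As a rough illustration of those two components, here is a back-of-the-envelope sketch. The rates and the free query tier below are assumptions for illustration only; check the current BigQuery pricing table for real numbers.

```python
# Illustrative rates only -- NOT current list prices.
STORAGE_USD_PER_GB_MONTH = 0.02   # assumed active-storage rate
QUERY_USD_PER_TB = 5.00           # assumed on-demand query rate

def monthly_cost(stored_gb: float, queried_tb: float, free_query_tb: float = 1.0) -> float:
    """Estimate a month's BigQuery bill for storage plus on-demand queries."""
    billable_tb = max(0.0, queried_tb - free_query_tb)  # assumed free monthly query tier
    return stored_gb * STORAGE_USD_PER_GB_MONTH + billable_tb * QUERY_USD_PER_TB

# 50 GB stored, 2 TB queried in a month:
print(monthly_cost(stored_gb=50, queried_tb=2))  # 6.0
```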

You need to have a valid form of payment on file in Cloud in order for the export to proceed. If the export is interrupted due to an invalid payment method, we are not able to re-export data for that time.

You can also export Analytics data to the BigQuery sandbox free of charge, but keep in mind that Sandbox limits apply.

When you start seeing data

Once the link is complete, data should start flowing to your BigQuery project within 24 hours. If you enable daily export, one file is exported each day containing the previous day's data (generally during the early afternoon in the time zone you set for reporting).
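
Each day's export lands in a date-suffixed table inside the analytics dataset: events_YYYYMMDD for the daily export and events_intraday_YYYYMMDD for the streaming export. A short sketch of the naming scheme:

```python
from datetime import date, timedelta

def daily_table(for_date: date) -> str:
    """Daily export table holding that day's data: events_YYYYMMDD."""
    return "events_" + for_date.strftime("%Y%m%d")

def intraday_table(for_date: date) -> str:
    """Streaming (intraday) export table: events_intraday_YYYYMMDD."""
    return "events_intraday_" + for_date.strftime("%Y%m%d")

# The file exported today contains the previous day's data:
today = date(2024, 1, 15)  # fixed date for illustration
print(daily_table(today - timedelta(days=1)))  # events_20240114
```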

Reasons for export failures

Failure: No user with Editor role
Cause: You didn't add a service account with the Editor role to your Cloud project, so Analytics cannot create tables.
Result: Export fails.

Failure: Robot account is deleted after installation
Cause: A user on the Cloud account removed the robot service account installed by Google Analytics, so Analytics is no longer able to create tables.
Result: All exports stop.

Failure: Organization policy conflicts with BigQuery Export
Cause: A user on the Cloud project created an organization policy that prevents Analytics from exporting data. The policy could prevent the creation of BigQuery tables, block writing to tables, or disallow the region of data storage.
Result: The table is either not created, or is created and then deleted within about 30 minutes.

Failure: User changes billing settings
Cause: A user on the Cloud project switches BigQuery from free to paid. While this normally works, failures can occur, for example, if the project is already over the 10 GB sandbox limit.
Result: Tables don't populate.

Failure: Cloud project over quota
Cause: Cloud has finite resources for most projects. You can exceed the BigQuery storage quota and then be prevented from writing more data. Note that this quota is small (10 GB) for free projects.
Result: Tables don't populate.

Failure: User changes property time zone
Cause: The export takes a 24-hour snapshot of a property based on the property time zone. If the time zone changes, the export window can shorten or lengthen on that day (e.g., 3 hours shorter if the time zone is changed from US Eastern Standard Time to US Pacific Time). Either way, the day's event count looks unusual.
Result: One day of unusual event count; general user confusion.
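
Several of the failures above show up as missing daily tables. A small sketch for spotting gaps, assuming you have already fetched the table names from the dataset (for example with `bq ls`); the function name is illustrative:

```python
from datetime import date, timedelta

def missing_export_days(table_names: list[str], start: date, end: date) -> list[str]:
    """Return dates in [start, end] that have no events_YYYYMMDD daily table."""
    have = {
        name[len("events_"):]
        for name in table_names
        if name.startswith("events_") and not name.startswith("events_intraday_")
    }
    missing = []
    d = start
    while d <= end:
        stamp = d.strftime("%Y%m%d")
        if stamp not in have:
            missing.append(stamp)
        d += timedelta(days=1)
    return missing

tables = ["events_20240101", "events_20240103"]
print(missing_export_days(tables, date(2024, 1, 1), date(2024, 1, 3)))
# ['20240102']
```

A run of missing days that starts on a known date is a useful first clue when matching symptoms against the failure causes above.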

Support

For BigQuery issues, such as billing, contact Google Cloud Support.

BigQuery Export

 

For information about the export and access to a sample data set, read the BigQuery Export documentation.

BI-vendor integration with BigQuery

This list is not exhaustive and may be updated as new integrations become available.