
[GA4] Set up BigQuery Export


Step 1: Create a Google-APIs-Console project and enable BigQuery

Note: You must be an Editor or above to create a Google-APIs-Console project and enable BigQuery.
  1. Log in to the Google Cloud Console.
  2. Create a new Google Cloud Console project or select an existing project.
  3. Navigate to the APIs table.

    Open the Navigation menu in the top-left corner, click APIs & Services, then click Library.
  4. Activate BigQuery.

    Under Google Cloud APIs, click BigQuery API. On the following page, click Enable.
  5. If prompted, review and agree to the Terms of Service.

Step 2: Prepare your project for BigQuery Export

You can export Google Analytics data to the BigQuery sandbox free of charge (sandbox limits apply).

Learn more about upgrading from the sandbox and BigQuery pricing.

Step 3: Link a Google Analytics 4 property to BigQuery

After you complete the first two steps, you can enable BigQuery Export from Analytics Admin.

BigQuery Export is subject to the same collection and configuration limits as Google Analytics. If you need higher limits, you can upgrade your property to 360.

  1. In Admin, under Product Links, click BigQuery Links.
    Note: The previous link opens to the last Analytics property you accessed. You can change the property using the property selector.
    • You must be an Editor or above at the property level to link an Analytics property to BigQuery.
    • You must also use an email address that has OWNER access to the BigQuery project (view Permissions below for detailed access requirements).
  2. Click Link.
  3. Click Choose a BigQuery project to display a list of projects to which you have access.

    If you have linked Analytics and Firebase (or plan to), consider exporting to the same Cloud project, which will facilitate easier joins with other Firebase data.
  4. Select a project from the list, then click Confirm.
  5. Select a location for the data. (If your project already has a dataset for the Analytics property, you can't configure this option.)
  6. Click Next.
  7. Select Configure data streams and events to choose which data streams to include in the export and which events to exclude from it. To exclude events, either click Add to select from a list of existing events, or click Specify event by name to choose existing events by name or to enter event names that have not yet been collected on the property.
  8. Click Done.
  9. Select Include advertising identifiers for mobile app streams if you want to include advertising identifiers.
  10. Select Daily (once a day) export, Streaming (continuous) export, or both. For Analytics 360 properties, you may also select Fresh Daily.
  11. Click Next.
  12. Review your settings, then click Submit.

Permissions

To create a BigQuery link, you need project getIamPolicy/setIamPolicy rights and Services get/enable rights. OWNER is a superset of these permissions.

The minimum permissions you need are:

  • resourcemanager.projects.get
    • To get the project
  • resourcemanager.projects.getIamPolicy
    • To get a list of permissions
  • resourcemanager.projects.setIamPolicy
    • To check whether the user has permission to create the link on this project
  • serviceusage.services.enable
    • To enable the BigQuery API
  • serviceusage.services.get
    • To check if the BigQuery API is enabled
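The permission list above can be checked mechanically. A minimal sketch, assuming you have already fetched the caller's granted permissions (for example via the Resource Manager `projects.testIamPermissions` API):

```python
# Sketch: check whether a caller's granted permissions cover the minimum
# set needed to create a BigQuery link. The permission names come from the
# list above; in practice, `granted` would come from the Resource Manager
# projects.testIamPermissions API.

REQUIRED_LINK_PERMISSIONS = {
    "resourcemanager.projects.get",
    "resourcemanager.projects.getIamPolicy",
    "resourcemanager.projects.setIamPolicy",
    "serviceusage.services.enable",
    "serviceusage.services.get",
}

def missing_link_permissions(granted):
    """Return the required permissions the caller does not hold."""
    return REQUIRED_LINK_PERMISSIONS - set(granted)

# Example: an account holding only the two read-style permissions.
granted = ["resourcemanager.projects.get", "serviceusage.services.get"]
print(sorted(missing_link_permissions(granted)))
```

If the returned set is non-empty, the link creation will fail until those permissions (or the OWNER role, which includes them) are granted.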

Verify the service account

When you link Analytics and BigQuery, that process creates the following service account:

firebase-measurement@system.gserviceaccount.com

Verify that the account has been added as a member of the project, and given the role of BigQuery User (roles/bigquery.user).

If you previously set up BigQuery Export with the Editor role granted to your service account on the Google Cloud project, you can reduce that role to BigQuery User. To change the role for the service account, you need to unlink and then relink Analytics to your BigQuery project: first unlink Analytics and BigQuery and remove the service account that has the Editor role, then relink Analytics and BigQuery per the instructions above to create a new service account with the correct permission for the project.

After relinking, ensure that the service account has the BigQuery Data Owner (roles/bigquery.dataOwner) role on the existing export dataset. You can verify this by viewing the access policy of the dataset.
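The verification above can also be done programmatically. A minimal sketch, assuming you have fetched the project's IAM policy as JSON (for example with `gcloud projects get-iam-policy PROJECT --format=json`); the policy dict below is a hypothetical example shaped like that output:

```python
# Sketch: given an IAM policy dict (same shape as the JSON printed by
# `gcloud projects get-iam-policy`), verify that the Analytics service
# account holds the BigQuery User role. The policy value is hypothetical.

SERVICE_ACCOUNT = "serviceAccount:firebase-measurement@system.gserviceaccount.com"

def has_role(policy, member, role):
    """Return True if `member` appears in the binding for `role`."""
    for binding in policy.get("bindings", []):
        if binding.get("role") == role and member in binding.get("members", []):
            return True
    return False

policy = {
    "bindings": [
        {"role": "roles/bigquery.user", "members": [SERVICE_ACCOUNT]},
    ]
}
print(has_role(policy, SERVICE_ACCOUNT, "roles/bigquery.user"))  # True
```

The same check with `roles/bigquery.dataOwner` against the dataset's access policy covers the export-dataset requirement.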

Change regions

If you choose the wrong region and need to change it after you've created the link:

  1. Delete the link to BigQuery (see below).
  2. Back up the data to another dataset in BigQuery (move or copy).
  3. Delete the original dataset. Take note of the name: you'll need it in the next step.
  4. Create a new dataset with the same name as the dataset you just deleted, and select the location for the data.
  5. Share the new dataset with firebase-measurement@system.gserviceaccount.com and give the service account the BigQuery Data Owner role.
  6. Copy the backup data into the new dataset.
  7. Repeat the procedure above to create a new link to BigQuery.

After changing the location, you'll have a gap in your data: streaming and daily exports of data will not process between deletion of the existing link and creation of the new link.

Delete a link to BigQuery

  1. In Admin, under Product Links, click BigQuery Links.
    Note: The previous link opens to the last Analytics property you accessed. You can change the property using the property selector.
    • You must be an Editor or above at the property level to delete a link to BigQuery.
  2. Click the row for the link.
  3. In the top right, click More > Delete.

BigQuery Export limits

Standard GA4 properties have a BigQuery Export limit of 1 million events for Daily (batch) exports. There is no limit on the number of events for Streaming export. If your property consistently exceeds the export limit, the daily BigQuery export will be paused and previous days’ exports will not be reprocessed.

For Analytics 360 properties, the Fresh Daily export contains all of the data fields and columns present in the daily export, including observed user attribution and Ad Impression data. Learn more about the Fresh Daily export.

Property editors and administrators will receive an email notification each time a property they manage exceeds the daily limit. That notification will indicate when the export will be paused if action is not taken. Additionally, if a standard property significantly exceeds the one-million-event daily limit, Analytics may pause daily exports immediately. If you receive a notification, use the data-filtering options (data-stream export and event exclusion) to decrease the volume of events exported each day and ensure the daily export continues to operate.
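One way to decide which events to exclude is to drop the highest-volume events until the property is back under the limit. A minimal sketch of that greedy policy; the event names and counts are hypothetical:

```python
# Sketch: pick events to exclude so a standard property's daily volume
# falls back under the one-million-event daily export limit. Excluding the
# largest events first is one simple policy; counts here are hypothetical.

DAILY_EXPORT_LIMIT = 1_000_000

def events_to_exclude(daily_counts, limit=DAILY_EXPORT_LIMIT):
    """Return event names to exclude, largest first, until total <= limit."""
    total = sum(daily_counts.values())
    excluded = []
    for name, count in sorted(daily_counts.items(), key=lambda kv: -kv[1]):
        if total <= limit:
            break
        excluded.append(name)
        total -= count
    return excluded

counts = {"page_view": 700_000, "scroll": 400_000, "click": 150_000}
print(events_to_exclude(counts))  # ['page_view']
```

In practice you would weigh analytical value as well as volume, but the arithmetic of getting under the limit is the same.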

Learn more about the higher limits available with 360 properties.

Data filtering

You can exclude specific data streams and events from your export, either to limit the size of your export or to make sure you're exporting just the events you want in BigQuery.

Exclude data streams and events during linking process

During the linking process, when you select the data streams you want to export, you also have the option to select events to exclude from export. See Step 9 in the linking process.

Add or remove data streams or events after you've configured linking

You can add or remove data streams and add events to or remove events from the exclusion list after you've configured the BigQuery link.

  1. In Admin, under Product Links, click BigQuery Links.
    Note: The previous link opens to the last Analytics property you accessed. You can change the property using the property selector.
    • You must be an Editor or above at the property level to add or remove data streams or events.
    • You must also use an email address that has OWNER access to the BigQuery project.
  2. Click the row for the project whose link you want to modify.
  3. Under Data streams and events, click View data streams and events.
  4. Under Data streams to export, you can select additional data streams to export or remove existing data streams from the list.
  5. On the Events to exclude list, click Add to select from a list of existing events or click Specify event by name to choose existing events by name or to specify event names that have yet to be collected on the property.
  6. To remove an event from the list, click the minus sign at the end of that row.

Pricing and billing

BigQuery charges for usage with two pricing components: storage and query processing. You can review the pricing table and learn about the differences between interactive and batch queries.
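As a back-of-envelope illustration of how the two pricing components combine, a minimal sketch with placeholder unit prices (these are not actual Google Cloud rates; check the pricing table for current figures):

```python
# Sketch: rough monthly cost from BigQuery's two pricing components,
# storage and query processing. The unit prices passed in below are
# placeholders, not current Google Cloud prices.

def monthly_cost(storage_gb, query_tb, price_per_gb_month, price_per_tb_scanned):
    """Combine storage and on-demand query costs for one month."""
    return storage_gb * price_per_gb_month + query_tb * price_per_tb_scanned

# Hypothetical workload: 50 GB stored, 2 TB scanned, placeholder prices.
print(monthly_cost(50, 2, 0.02, 5.00))
```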

You need to have a valid form of payment on file in Cloud in order for the export to proceed. If the export is interrupted due to an invalid payment method, we are not able to re-export data for that time.

You can also export Analytics data to the BigQuery sandbox free of charge but keep in mind that sandbox limits apply.

When you start seeing data

Once the link is complete, data should start flowing to your BigQuery project within 24 hours. If you enable daily export, one file is exported each day containing the previous day's data (generally during the early afternoon in the time zone you set for reporting).
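The daily export writes one table per day named events_YYYYMMDD (the streaming export writes events_intraday_YYYYMMDD). A small sketch of the table name you would expect to appear for the previous day's data:

```python
from datetime import date, timedelta

# Sketch: compute the daily export table name (events_YYYYMMDD) expected
# for the previous day, given "today" in the property's reporting time zone.

def expected_daily_table(today):
    previous_day = today - timedelta(days=1)
    return "events_" + previous_day.strftime("%Y%m%d")

print(expected_daily_table(date(2024, 5, 2)))  # events_20240501
```

If that table has not appeared well after the usual early-afternoon window, check the link status and the service account's roles.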

Reasons for linking failures

Creating the link to BigQuery can fail for either of the following reasons:

  • Your organization policy prohibits export to the United States. If you've chosen the United States as the location of your data, choose a different location.
  • Your organization policy prohibits service accounts from the domain you want to export data from. In this case, you would need to modify your organization policy.

Reasons for export failures

Failure: No service account
Cause: There is no service account on your Cloud project with the role of Active user, so Analytics cannot create tables.
Result: Export fails.

Failure: Robot account is deleted after installation
Cause: A user on the Cloud account removed the robot service account installed by Google Analytics, so Analytics is no longer able to create tables.
Result: All exports stop.

Failure: Organization policy conflicts with BigQuery Export
Cause: A user on the Cloud project created an organization policy that prevents Analytics from exporting data. The policy could prevent the creation of BigQuery tables or writing to tables, or could object to the region of data storage.
Result: The table is either not created, or is created and then rapidly (~30 minutes) deleted.

Failure: User changes billing settings
Cause: A user on the Cloud project switches from free to paid for BigQuery. While this would normally work, failures can occur, for example, if the project is already over the 10 GB sandbox limit.
Result: The export can start failing; tables don't populate.

Failure: Cloud project over quota
Cause: Cloud has finite resources for most projects. You can exceed the BigQuery storage quota and then be prevented from writing more data. Note that this quota is small (10 GB) for free projects.
Result: Tables don't populate.

Failure: User changes property time zone
Cause: The export takes a 24-hour snapshot of a property based on the property time zone. If the time zone changes, the export window can shorten or lengthen on a particular day (e.g., 3 hours shorter if the time zone is changed from US Eastern Standard Time to US Pacific Time). In either case, the user will see an unusual event count.
Result: One day of unusual event count; general user confusion.
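A rough sketch of the time-zone effect in the last row, assuming the export window on the changeover day simply gains or loses the difference between the two zones' standard-time UTC offsets (US Eastern = UTC-5, US Pacific = UTC-8 in the example):

```python
# Sketch: length of the 24-hour export window on the day the property
# time zone changes, assuming the window shifts by the difference in
# UTC offsets. Offsets are hours relative to UTC (standard time).

def window_hours_on_change(old_utc_offset, new_utc_offset):
    """Export window length (hours) on the changeover day."""
    return 24 + (new_utc_offset - old_utc_offset)

# Eastern (UTC-5) -> Pacific (UTC-8): 3 hours shorter, per the example.
print(window_hours_on_change(-5, -8))  # 21
```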

Support

For BigQuery issues, like billing, contact Google Cloud Support.

BigQuery Export

For information about the export and access to a sample data set, read the BigQuery Export documentation.

BI-vendor integration with BigQuery

This list is not exhaustive, and may be updated as different integrations become available.
