
Start a new bulk data export

Best practices

  • Be aware of data growth. Data accumulates in your project indefinitely unless you set an expiration time for it. Set appropriate partition expiration times to manage your storage costs (see the sketch after this list).
  • Data is subject to Google Cloud storage and query costs, but there is a free usage level.
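To keep an eye on how much exported data has accumulated, you can query BigQuery's per-dataset __TABLES__ metadata from the google-cloud-bigquery Python client. This is a minimal sketch, assuming placeholder values: my-project for the project ID and searchconsole for the dataset name.

```python
from google.cloud import bigquery

PROJECT_ID = "my-project"   # placeholder: your Cloud project ID
DATASET = "searchconsole"   # placeholder: your export dataset name

client = bigquery.Client(project=PROJECT_ID)

# __TABLES__ lists per-table metadata, including storage size and row count.
sql = f"""
SELECT
  table_id,
  ROUND(size_bytes / POW(10, 9), 2) AS size_gb,
  row_count
FROM `{PROJECT_ID}.{DATASET}.__TABLES__`
ORDER BY size_bytes DESC
"""

for row in client.query(sql).result():
    print(f"{row.table_id}: {row.size_gb} GB, {row.row_count} rows")
```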

Video: Intro to Search Console bulk data export - Google Search Console Training

Configure and start an ongoing bulk data export

Prerequisites

  • You must set up a Google Cloud project with billing and enable BigQuery, as described in the setup page. There is a free usage level, but storage and query usage above the free quota will be charged.

In Google Cloud Console

  1. Open the Google Cloud Console.
  2. Switch to the Google Cloud project you'd like to export data to.
  3. Enable BigQuery in your project:
    1. Navigate in the sidebar to APIs & Services > Enabled APIs & Services.
    2. If BigQuery is not enabled, click + ENABLE APIS AND SERVICES and enable the BigQuery API and the BigQuery Storage API.
  4. Grant Search Console permission to export data to your project (a scripted sketch of steps 3 and 4 follows this list):
    1. Navigate in the sidebar to IAM & Admin. The page should say Permissions for project <your_project>.
    2. Click + GRANT ACCESS to open a side panel that says Add principals.
    3. In New Principals, paste the following service account name:
      • search-console-data-export@system.gserviceaccount.com
    4. Grant it two roles: BigQuery Job User (bigquery.jobUser in the command-line interface) and BigQuery Data Editor (bigquery.dataEditor in the command-line interface).
    5. Click Save.
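If you prefer to script steps 3 and 4 rather than clicking through the Cloud Console, the sketch below enables the two BigQuery APIs via the Service Usage API and then appends the two role bindings for the Search Console service account to the project's IAM policy. It is an illustration only, assuming your own credentials (for example, Application Default Credentials) are allowed to administer the project; my-project is a placeholder project ID.

```python
from google.cloud import resourcemanager_v3
from googleapiclient import discovery

PROJECT_ID = "my-project"  # placeholder: your Cloud project ID
SERVICE_ACCOUNT = (
    "serviceAccount:search-console-data-export@system.gserviceaccount.com"
)
ROLES = ("roles/bigquery.jobUser", "roles/bigquery.dataEditor")

# Step 3: enable the BigQuery APIs. Each call returns a long-running
# operation, which is not polled here for brevity.
serviceusage = discovery.build("serviceusage", "v1")
for api in ("bigquery.googleapis.com", "bigquerystorage.googleapis.com"):
    serviceusage.services().enable(
        name=f"projects/{PROJECT_ID}/services/{api}", body={}
    ).execute()

# Step 4: add the Search Console service account to the project's IAM policy
# with the Job User and Data Editor roles.
projects = resourcemanager_v3.ProjectsClient()
resource = f"projects/{PROJECT_ID}"
policy = projects.get_iam_policy(request={"resource": resource})
for role in ROLES:
    binding = policy.bindings.add()
    binding.role = role
    binding.members.append(SERVICE_ACCOUNT)
projects.set_iam_policy(request={"resource": resource, "policy": policy})
```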

In Search Console

  1. Set up your Google Cloud project as described above.
  2. Go to Settings > Bulk data export for your property.
  3. Copy the project ID (not the project number) for your Google Cloud Console project into the Cloud project ID field. (The project ID is visible in the project settings page.)
  4. Choose a dataset name. By default the name is searchconsole, but if you want to export multiple properties to one project ID, you'll need to give each Search Console property its own dataset name. The dataset name always starts with the string searchconsole, even when you customize it.
  5. Select a location for your dataset from the list. Search Console will create your dataset in this location with the first export. Note that you can't easily change this location later once your exports have begun.
  6. Click Continue to confirm your choices and initiate the regularly scheduled exports. If there is an immediately detectable problem (such as an access issue), you should be informed relatively quickly. Otherwise, Search Console should begin the export process within a day.
  7. The first export happens up to 48 hours after your successful configuration in Search Console and includes data for the day of the export. If Search Console encounters a non-persistent error, it retries the export the next day as scheduled.
  8. Once the tables are created, you may set a partition expiration (see the sketch after this list), but do not alter the schema, for example by adding a column. If you change the schema, the export will fail.
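As step 8 notes, a partition expiration is a table option and can be set without touching the schema. The sketch below applies one through a BigQuery DDL statement issued from the Python client; the project ID, dataset name, table names, and the 485-day window are illustrative placeholders, so check which tables actually appear in your dataset before running it.

```python
from google.cloud import bigquery

PROJECT_ID = "my-project"   # placeholder: your Cloud project ID
DATASET = "searchconsole"   # placeholder: your export dataset name
EXPIRATION_DAYS = 485       # illustrative: keep roughly 16 months of daily partitions

client = bigquery.Client(project=PROJECT_ID)

# Setting partition_expiration_days changes a table option only; the
# schema (columns and types) is left exactly as Search Console created it.
for table in ("searchdata_site_impression", "searchdata_url_impression"):
    ddl = f"""
    ALTER TABLE `{PROJECT_ID}.{DATASET}.{table}`
    SET OPTIONS (partition_expiration_days = {EXPIRATION_DAYS})
    """
    client.query(ddl).result()
```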

If you want to see historical data from before your initial setup, use the Search Console API or the Search Console reports.
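As a sketch of the API route, the Search Analytics query method returns clicks, impressions, and other metrics for a date range you choose. The property URL, dates, dimensions, and row limit below are placeholders, and the call assumes credentials that are already authorized for the property.

```python
from googleapiclient import discovery

# Assumes Application Default Credentials (or other credentials) that are
# authorized for the Search Console property.
service = discovery.build("searchconsole", "v1")

response = service.searchanalytics().query(
    siteUrl="https://www.example.com/",   # placeholder property URL
    body={
        "startDate": "2023-01-01",        # illustrative date range
        "endDate": "2023-03-31",
        "dimensions": ["date", "query"],
        "rowLimit": 1000,
    },
).execute()

for row in response.get("rows", []):
    print(row["keys"], row["clicks"], row["impressions"])
```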

