About custom experiments

Custom experiments let you propose and test changes to your Search and Display campaigns. You can measure your results and understand the impact of your changes before you apply them to a campaign.

This article explains how custom experiments work. When you’re ready, read how to set up a custom experiment.

Note: If you use the Google Ads API, there will be no changes to the workflow. You will continue to use drafts and experiments.
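
If you manage campaigns through the Google Ads API, the drafts-and-experiments workflow mentioned in the note above is handled in code rather than in the Experiments page. The sketch below is a minimal, illustrative example using the Python client library; the customer ID, campaign ID, configuration path and draft name are placeholders, and the draft services shown are only available in API versions that still support campaign drafts.

    from google.ads.googleads.client import GoogleAdsClient

    # Placeholder values: replace with your own customer and base campaign IDs.
    CUSTOMER_ID = "1234567890"
    BASE_CAMPAIGN_ID = "9876543210"

    # Credentials are read from a google-ads.yaml configuration file.
    client = GoogleAdsClient.load_from_storage("google-ads.yaml")

    campaign_service = client.get_service("CampaignService")
    draft_service = client.get_service("CampaignDraftService")

    # A campaign draft is a copy of the base campaign that you can edit
    # and later promote to an experiment.
    operation = client.get_type("CampaignDraftOperation")
    draft = operation.create
    draft.base_campaign = campaign_service.campaign_path(
        CUSTOMER_ID, BASE_CAMPAIGN_ID
    )
    draft.name = "Bid change draft"

    response = draft_service.mutate_campaign_drafts(
        customer_id=CUSTOMER_ID, operations=[operation]
    )
    print(f"Created campaign draft: {response.results[0].resource_name}")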

How custom experiments work

Custom experiments let you create an experiment campaign to test how your changes perform against your original campaign.

  1. Set up a custom experiment. When you set up your experiment, you can specify how long you’d like it to run and how much of your original campaign’s traffic (and budget) you’d like it to use.
  2. Monitor your custom experiments. As your experiment runs, you can monitor and compare its performance against your original campaign (if you use the API, see the reporting sketch after this list). If you’d like, you can change the dates of your experiment to end it early.
  3. Apply your custom experiment. If your experiment performs better than your original campaign, you can apply your experiment to the original campaign (manually or automatically). You also have the option to convert your experiment into a new campaign that uses the same dates and budget as the original, and to pause your original campaign.
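
If you monitor your experiment through the API instead of the Experiments page, a report query along these lines can put the original and experiment campaigns side by side. This is a hedged sketch, not a prescribed procedure: the campaign IDs are placeholders, and the query uses standard Google Ads Query Language fields.

    from google.ads.googleads.client import GoogleAdsClient

    CUSTOMER_ID = "1234567890"
    # Placeholder IDs for the original campaign and its experiment campaign.
    ORIGINAL_CAMPAIGN_ID = 111111
    EXPERIMENT_CAMPAIGN_ID = 222222

    client = GoogleAdsClient.load_from_storage("google-ads.yaml")
    googleads_service = client.get_service("GoogleAdsService")

    query = f"""
        SELECT
          campaign.id,
          campaign.name,
          metrics.clicks,
          metrics.conversions,
          metrics.cost_micros
        FROM campaign
        WHERE campaign.id IN ({ORIGINAL_CAMPAIGN_ID}, {EXPERIMENT_CAMPAIGN_ID})
          AND segments.date DURING LAST_30_DAYS
    """

    # Stream the rows and print a simple side-by-side comparison.
    for batch in googleads_service.search_stream(customer_id=CUSTOMER_ID, query=query):
        for row in batch.results:
            print(
                f"{row.campaign.name}: clicks={row.metrics.clicks}, "
                f"conversions={row.metrics.conversions}, "
                f"cost={row.metrics.cost_micros / 1_000_000:.2f}"
            )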

Bear in mind

Custom experiments are only available for Search and Display campaigns. You won’t be able to create custom experiments for App or Shopping campaigns. App asset experiments are available for App campaigns. Learn how to Create a new App asset experiment.

Features that aren’t supported by custom experiments

Custom experiments generally support the same features as campaigns, with these exceptions:

  • Ad customisers that use 'Target campaign' or 'Target ad group'. However, you can create a custom experiment and assign a target campaign or ad group feed after the experiment has been created. Learn how to Create ad customisers for responsive search ads.
  • Shared budgets
  • Bid landscapes

Benefits

Simplified workflow: You can select your experiment type and create custom experiments in fewer steps. You can choose:

  • Ad variations: Run text ad or responsive search ad variations on your campaigns or across your account
  • Custom: Create a Search or Display experiment. Select the original campaign, modify test variables in the trial campaign and run the experiment.
  • Video: Test your video assets in your video campaigns

Easier reporting: You can report on all your experiments from one place, the Experiments page. You can:

  • View all your experiments, including Search and Display experiments and ad variations, in a single table.
  • View all other metrics in a summary table that compares performance for the original and experiment campaigns (no more hovering to compare stats!).
  • Identify up to 2 primary success metrics and get tailored reporting.
  • View clearer performance comparison dates with a date picker specific to custom experiments.

Manage experiments: Navigate through campaign management tables for original and trial campaigns.

  • Navigate data at the campaign, ad group or creative level (including regular columns like change history), which helps you examine variations in performance in depth.
  • Edit or update experiment variables from the campaign management table.

If you want to test your search ad creative, we recommend using ad variations. To test video creatives, read Create a video experiment.

Example

Anthony and his boss consider changing bids for their campaign, but want to be confident that these changes will improve performance. Anthony creates a new experiment with Target ROAS bid changes and runs it for a month. Results at the end of the month show that the bid changes improved performance, so the experiment is automatically applied to the original campaign.
