About the 'Experiments' page (formerly drafts and experiments)

Google Ads experiments can help you continuously improve the performance of your campaigns. By testing different campaign settings, you can find out which changes reach more customers and drive better results for your business.

A hero image demonstrating the process of campaign experiments.

How it works

You can create and run an experiment on your campaign to test the impact of your proposed changes. If you split the budget equally between the original campaign and experiment, you can easily compare the results over a specified time period. If your experiment produces better results at the end of that time period, you can apply the experiment to the original campaign or replace the original campaign. To set up an experiment, click Experiments in the left page menu of your Google Ads account.
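
As a rough illustration of that comparison (with made-up numbers and hypothetical names, not Google Ads code), an equal budget split and an end-of-period check might look like this:

    # Illustrative only: a simplified model of splitting a daily budget equally
    # between the original (control) campaign and the experiment (treatment)
    # campaign, then comparing results at the end of the period. The real split
    # and evaluation happen inside Google Ads.
    DAILY_BUDGET = 100.00
    control_share = treatment_share = DAILY_BUDGET / 2

    # Hypothetical results collected over the experiment period
    control = {"conversions": 180, "cost": 1500.00}
    treatment = {"conversions": 210, "cost": 1500.00}

    control_cpa = control["cost"] / control["conversions"]
    treatment_cpa = treatment["cost"] / treatment["conversions"]

    if treatment["conversions"] > control["conversions"] and treatment_cpa < control_cpa:
        print("Treatment performed better: consider applying the experiment.")
    else:
        print("No clear winner yet: let the experiment keep running.")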

Tip: You can select All experiments to open the 'All experiments' table. From there, you can complete any of these actions in Google Ads:
  • Manage your experiment statuses and options
  • Select a specific experiment type (ad variations, custom experiments or video experiments)
  • View your experiments across channel and experiment types (App, Search, Display and Video)
  • Change the way experiments are displayed by switching between 'Cards' or 'Table' views

This animation guides you through switching between card view and table view on the experiments page in Google Ads.


Ad variations

With ad variations, you can review the performance of your variations and apply the modified ads to your account. Ad variations are typically used to test text ads, responsive search ads or a single change across multiple campaigns. Ad variations are available for Search campaigns.

After you’ve set up an ad variation, you can view the results and compare how the modified ads perform against your original ads. When you’re happy with the ad variation experiment results, you can apply the modified ads to your campaign.


Custom experiments for Search and Display

Custom experiments are typically used to test Smart Bidding, keyword match types, landing pages, audiences and ad groups. Custom experiments are available for App, Search and Display campaigns.

You can also create an experiment without a draft. This makes it easier to compare the performance of your base campaign and trial campaign.

After you’ve selected a base campaign to run an experiment with, you’ll set up the experiment and update the settings that you’d like to test. Google's system will create a new trial campaign for you with the new settings. After you've run the experiment and evaluated its performance, you can choose to apply the new settings back to the base campaign or run the experiment as a new independent campaign.
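
The sketch below is only an illustration of that flow (hypothetical names and settings, not Google Ads code): the trial campaign starts as a copy of the base campaign with just the settings that you want to test changed, and at the end you either apply those changes back or keep the trial running on its own.

    # Illustrative model of a custom experiment: copy the base campaign's
    # settings, override only the ones under test, and name the trial campaign.
    base_campaign = {
        "name": "Search - Brand",
        "bidding_strategy": "Manual CPC",
        "landing_page": "https://example.com/a",
    }

    # The settings that you'd like to test in the experiment
    changes = {"bidding_strategy": "Maximise conversions"}

    trial_campaign = {**base_campaign, **changes,
                      "name": base_campaign["name"] + " - experiment"}
    print(trial_campaign)

    # After the experiment ends, either apply `changes` back to base_campaign
    # or keep trial_campaign running as an independent campaign.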


Video experiments

Video experiments are used to determine which of your video ads is most effective on YouTube. Video experiments are available for Video campaigns.

You can:

  • Set up two to four different groups (known as experiment arms).
  • Choose the campaigns that you want to include in the experiment with a different video ad in each campaign.
  • Select a success metric ('Brand lift' or 'Conversions') to measure and compare the performance of the campaigns (see the sketch after this list).
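
As a loose sketch of those constraints (hypothetical class and field names, not the Google Ads interface or API), a video experiment could be modelled like this:

    # Mirrors the help text above: two to four arms, each arm containing
    # campaigns that run a different video ad, and one success metric.
    from dataclasses import dataclass, field

    ALLOWED_METRICS = {"Brand lift", "Conversions"}

    @dataclass
    class ExperimentArm:
        name: str
        campaign_ids: list = field(default_factory=list)  # each campaign uses a different video ad

    @dataclass
    class VideoExperiment:
        name: str
        success_metric: str
        arms: list

        def __post_init__(self):
            if not 2 <= len(self.arms) <= 4:
                raise ValueError("A video experiment needs two to four arms.")
            if self.success_metric not in ALLOWED_METRICS:
                raise ValueError(f"Success metric must be one of {ALLOWED_METRICS}.")

    experiment = VideoExperiment(
        name="Spring creative test",
        success_metric="Brand lift",
        arms=[ExperimentArm("Arm A", ["campaign_1"]), ExperimentArm("Arm B", ["campaign_2"])],
    )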

After you set up a video experiment, you can monitor its performance in Google Ads and identify the best video ads across the experiment arms. When you understand which ad performed best in the experiment, you can make an informed decision about which campaign to continue and allocate more budget to.


Performance Max experiments

Performance Max experiments are tools in Google Ads that help you to A/B test different features, settings and campaigns to improve results for your business. You can use experiments to help you measure the incremental uplift of using Performance Max campaigns.

You can:

  • Set up Uplift experiments for Performance Max
  • Set up Shopping campaign vs. Performance Max experiments

After you set up a Performance Max experiment, you can monitor its performance in Google Ads and identify the best Performance Max ads across the experiment arms. When you understand which ad performed best in the experiment, you can make an informed decision about which campaign to continue and allocate more budget to.

Note: After the experiment ends, the Performance Max campaign is automatically launched if the experiment has favourable results. You can disable this feature from the experiment’s 'Report' page.

Understanding the results of your experiment

You can use the information in the 'Experiments' table to understand the results of your experiment and take appropriate action. The table contains the following columns:

  • Name: This shows the name of your experiment. You can click the experiment name to find out more than what’s available in the table.
  • Type: This shows the type of experiment that you’re currently running (for example, Uplift from Performance Max, Custom display, Video and many others).
  • Status: This shows the current stage of your experiment (such as 'In progress', 'Complete (Applied)', and 'Scheduled').
  • Results: This shows which arm performed best during the experiment:
    • Control campaign: The control arm performed better than the treatment arm in the experiment.
    • Treatment campaign: The treatment arm performed better than the control arm in the experiment.
    • No clear winner or in progress: Either a winner can’t be determined or there isn’t enough data yet. We recommend allowing your experiment to run for two to three weeks to gather enough data. If results are still undecided, you may need to increase your budget or allow the experiment to run for longer to generate a clear winner.
  • Actions: Here, you can view the recommended action for your experiment (for example, 'Apply').
  • Start date: This shows the start date of your experiment.
  • End date: This shows the end date of your experiment.
  • Metrics: Depending on the goals, experiment type and metrics selected during experiment creation, you can view several metrics in the table (such as Conversions or Conv. value). These represent the percentage differential achieved by the treatment arm over the control campaign (see the worked example after this list). By hovering over the text in this column, you can view additional information, including the confidence interval.
  • You can select additional metrics by clicking the Columns icon, selecting metrics and clicking Save. You can also remove columns in the same way.
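
As a worked example of that percentage differential (with made-up numbers), the figure is simply the treatment arm’s result relative to the control arm’s:

    # Treatment achieved 230 conversions against the control's 200, so the
    # table would show roughly +15% for the Conversions column. Google Ads also
    # reports a confidence interval around this figure, not reproduced here.
    control_conversions = 200
    treatment_conversions = 230

    uplift_pct = (treatment_conversions - control_conversions) / control_conversions * 100
    print(f"Conversions differential: {uplift_pct:+.1f}%")  # -> +15.0%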

Viewing additional metrics

To customise the experiment table and add the metrics that are important to you, follow these steps (a rough Google Ads API equivalent is sketched after them):

  1. In your Google Ads account, click the Campaigns icon.
  2. Click the Campaigns drop-down in the section menu.
  3. Click Experiments.
  4. At the top right above the table, click the Columns icon.
  5. Click the tick box next to the metrics that you want to add to the table.
  6. Click Apply.
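
If you manage accounts through the Google Ads API, the details shown in the 'Experiments' table can also be read programmatically. The snippet below is only a rough sketch using the google-ads Python client and a GAQL query against the experiment resource; the exact field names are assumptions, so check them against the current API reference before relying on them.

    # Assumed sketch: list experiments for an account with a GAQL query.
    from google.ads.googleads.client import GoogleAdsClient

    client = GoogleAdsClient.load_from_storage()  # reads google-ads.yaml credentials
    ga_service = client.get_service("GoogleAdsService")

    query = """
        SELECT experiment.name, experiment.status,
               experiment.start_date, experiment.end_date
        FROM experiment
    """

    # 'INSERT_CUSTOMER_ID' is a placeholder for your own account ID.
    for batch in ga_service.search_stream(customer_id="INSERT_CUSTOMER_ID", query=query):
        for row in batch.results:
            e = row.experiment
            print(e.name, e.status, e.start_date, e.end_date)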

Take an action on your experiment

Within the experiments table, you’ll see a recommended action based on the outcome of your experiment:

  • Apply: Applying the experiment launches a campaign with the same settings as the treatment arm. You may also be able to adjust the settings of the original control campaign to mimic the treatment arm.
Note: If your experiment’s results can’t yet be determined (the 'Results' column shows 'In progress', 'Undecided' or 'Unavailable'), we recommend allowing it to run for at least two to three weeks to collect enough data to make a determination. If no recommendation is available at that time, you may need to increase the budget or allow the experiment to run for longer.

Auto-apply favourable experiment results (only available for some types of experiments)

This feature, which is enabled by default, applies the experiment changes to the base campaign when the experiment ends, provided that the results are favourable. This allows you to benefit from the performance improvements of your experiments with little effort.

Note: You can disable it at any time during your experiment from the 'Report' page.

You can create an experiment using the recommendation cards on the 'Experiments' page and choose whether to enable this feature during creation. After you create an experiment, a tooltip on the experiment summary card shows the feature’s status. From this tooltip, you can also turn the feature on or off, and the tooltip status updates to reflect your choice.

Additionally, the status column on your 'Experiments' page may have one of the following states showing which experiments have been applied:

  • Complete (Not applied)
  • Complete (Applying…)
  • Complete (Applied)

When your experiment is complete, its tooltip state will update to let you know whether or not your changes were applied.

FAQs

1. How do I know if an experiment is favourable and will be applied directly?

  • If you’re using Max conversions with target cost per action (CPA) bidding, your experiment will be applied directly if conversions in your treatment arm are higher than in your control arm and the treatment CPA is lower.
  • If you’re using Max conversion value with target return on ad spend (ROAS) bidding, your experiment will be applied directly if the conversion value in your treatment arm is higher than in your control arm and the treatment ROAS is higher.
  • If you’re using Max conversions or Max conversion value bidding, your experiment will be applied directly if either the conversions or the conversion value in your treatment arm is higher than in your control arm (see the sketch after this list).
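
The rules above can be restated as a short sketch (a paraphrase of this help text with simplified labels, not the exact logic that Google Ads runs):

    # Returns True if the treatment arm counts as 'favourable' under the
    # conditions described in this FAQ.
    def is_favourable(strategy, control, treatment):
        if strategy == "max_conversions_with_target_cpa":
            return (treatment["conversions"] > control["conversions"]
                    and treatment["cpa"] < control["cpa"])
        if strategy == "max_conversion_value_with_target_roas":
            return (treatment["conv_value"] > control["conv_value"]
                    and treatment["roas"] > control["roas"])
        if strategy in ("max_conversions", "max_conversion_value"):
            return (treatment["conversions"] > control["conversions"]
                    or treatment["conv_value"] > control["conv_value"])
        raise ValueError(f"Unknown bidding strategy: {strategy}")

    print(is_favourable(
        "max_conversions_with_target_cpa",
        control={"conversions": 100, "cpa": 12.0},
        treatment={"conversions": 120, "cpa": 10.5},
    ))  # -> True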

2. When will the experiment changes be applied?

  • After your experiment reaches its end date, we’ll identify whether your experiment results were favourable, using the definition above. If we determine that they were, we’ll apply the experiment changes to the control campaign.

3. Will an experiment be applied if the experiment ended manually?

  • No, we don't apply any experiments which were ended manually. We only apply experiments which end on the end date that you define.

4. Can I opt out from the auto-apply?

  • By default, this feature will be enabled when creating a new experiment, but you can choose to opt out before the experiment ends.
