Google Ads experiments can help you continuously improve the performance of your campaigns. When you test different campaign settings, you reach more customers and drive better results quickly and efficiently for your business.
You can create and run an experiment on your campaign to test the impact of your proposed changes. If you split the budget equally between the original campaign and experiment, you can easily compare the results over a specified time period. If your experiment produces better results at the end of that time period, you can apply the experiment to the original campaign or replace the original campaign. To set up an experiment, click Experiments in the left page menu of your Google Ads account.
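When the budget is split 50/50, a quick cost-normalized comparison of the two arms might look like the following sketch. The numbers and metric names are hypothetical, and this is not how Google Ads evaluates experiments internally:

```python
# Illustrative only: compare a control and trial arm on cost per conversion
# when both receive the same budget. All figures here are made up.

def summarize(name, cost, conversions):
    """Return a summary dict with cost per conversion (CPA) for one arm."""
    cpa = cost / conversions if conversions else float("inf")
    return {"arm": name, "cost": cost, "conversions": conversions, "cpa": cpa}

control = summarize("control", cost=500.0, conversions=40)
trial = summarize("trial", cost=500.0, conversions=50)

# With equal budgets, the arm with the lower CPA is the stronger performer.
better = min([control, trial], key=lambda arm: arm["cpa"])
print(f"Lower CPA: {better['arm']} at {better['cpa']:.2f} per conversion")
```

Because both arms spend the same amount, a single cost-normalized metric like CPA is enough for a first-pass comparison; unequal splits would need further normalization.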
- Manage your experiment statuses and options
- Select a specific experiment type (ad variations, custom experiments, or video experiments)
- View your experiments across channel and experiment types (App, Search, Display, and Video)
With ad variations, you can review the performance of your variations and apply the modified ads to your account. Ad variations are typically used to test text ads, responsive search ads, or a single change across multiple campaigns. Ad variations are available for Search campaigns.
After you've set up an ad variation, you can view the results and compare how the modified ads perform against your original ads. When you're happy with the ad variation experiment results, you can apply the modified ads to your campaign.
Custom experiments are typically used to test Smart Bidding, keyword match types, landing pages, audiences, and ad groups. Custom experiments are available for App, Search, and Display campaigns.
You can also create an experiment without a draft. This makes it easier to compare the performance of your base campaign and trial campaign.
After you’ve selected a base campaign to run an experiment with, you’ll set up the experiment and update the settings you’d like to test. Google's system will create a new trial campaign for you with the new settings. After you've run the experiment and evaluated its performance, you can choose to apply the new settings back to the base campaign or run the experiment as a new independent campaign.
Video experiments are used to determine which of your video ads is more effective on YouTube. Video experiments are available for Video and Discovery campaigns. To create a video experiment, you:
- Set up 2 to 4 different groups (known as experiment arms).
- Choose the campaigns to include in the experiment (with a different video ad in each campaign).
- Select a success metric (“Brand lift” or “Conversions”) to measure and compare the performance of the campaigns.
After you set up a video experiment, you can monitor its performance in Google Ads and identify the best-performing video ads across the experiment arms. Once you know which ad performed better, you can make an informed decision about which campaign to continue and allocate a higher budget to.
Performance Max experiments are tools in Google Ads that help you to A/B test different features, settings, and campaigns to improve results for your business. You can use experiments to help you measure the incremental uplift of using Performance Max campaigns.
- Set up Uplift experiments for Performance Max
- Set up shopping campaign vs. Performance Max experiments
After you set up a Performance Max experiment, you can monitor its performance in Google Ads and identify the best-performing Performance Max ads across the experiment arms. Once you know which ad performed better, you can make an informed decision about which campaign to continue and allocate a higher budget to.
You can now use information in the experiments table to understand the results of your experiment, and take appropriate action. The “Experiments” table contains the following columns:
- Name: This shows the name of your experiment. You can click the experiment name to see more detail than what's available in the table.
- Type: This shows the type of experiment you're running (for example, Uplift from Performance Max, Custom display, Video, and others).
- Status: This shows the current stage of your experiment (such as “In progress”, “Complete (Applied)”, and “Scheduled”).
- Results: This shows which arm of the campaign performed best over the duration of the experiment:
- Control campaign: The control arm performed better than the treatment arm in the experiment.
- Treatment campaign: The treatment arm performed better than the control arm in the experiment.
- No clear winner or in progress: Either a winner can't be determined or there's not enough data yet. We recommend allowing your experiment to run for 2 to 3 weeks to gather enough data. If results are still inconclusive, you may need to increase your budget or let the experiment run longer to generate a clear winner.
- Actions: Here, you can view the recommended action for your experiment (for example, “Apply”).
- Start date: This shows the start date of your experiment.
- End date: This shows the end date of your experiment.
- Metrics: Depending on the goals, experiment type, and metrics selected during experiment creation, you can view several metrics in the table, such as Conversions or Conv. value. These represent the percentage difference achieved by the treatment arm over the control campaign. By hovering over the text in this column, you can view additional information, including the confidence interval.
- You can select additional metrics by clicking the Column icon, selecting metrics, and clicking Save. You can also remove columns in the same way.
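For illustration, here is one common way to express a percent uplift of a treatment arm over a control, along with a normal-approximation confidence interval on the difference in conversion rates. Google Ads doesn't publish its exact statistical method, so treat this Python sketch and its numbers as hypothetical:

```python
import math

# Illustrative only: a standard two-proportion uplift calculation, not the
# exact statistic Google Ads reports. All counts below are hypothetical.

def uplift_with_ci(control_conv, control_clicks, treat_conv, treat_clicks, z=1.96):
    """Percent uplift of the treatment conversion rate over the control,
    plus a z-based confidence interval on the rate difference."""
    p_c = control_conv / control_clicks
    p_t = treat_conv / treat_clicks
    uplift_pct = (p_t - p_c) / p_c * 100
    # Standard error of the difference between two independent proportions.
    se = math.sqrt(p_c * (1 - p_c) / control_clicks
                   + p_t * (1 - p_t) / treat_clicks)
    diff = p_t - p_c
    return uplift_pct, (diff - z * se, diff + z * se)

uplift, (lo, hi) = uplift_with_ci(control_conv=80, control_clicks=2000,
                                  treat_conv=110, treat_clicks=2000)
print(f"Uplift: {uplift:.1f}%  95% CI on rate difference: [{lo:.4f}, {hi:.4f}]")
```

If the interval excludes zero, the treatment arm's advantage is unlikely to be noise; an interval that straddles zero corresponds to the "No clear winner" state described above, where more time or budget is needed.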
To customize the experiment table and add the metrics important to you, follow these steps:
- In your Google Ads account, click the Campaigns icon.
- Click the Campaigns drop-down in the section menu.
- Click Experiments.
- At the top right above the table, click the Columns icon.
- Click the check-box next to the metrics that you want to add to the table.
- Click Apply.
Within the experiments table, you’ll see a recommended action based on the outcome of your experiment:
- Apply: Applying the experiment launches a campaign with the same settings as the treatment arm. You may also be able to adjust the settings of the original control campaign to mimic the treatment arm.
An opt-in experiment feature applies the experiment changes to the base campaign when the experiment ends, provided the results are favorable. This lets you benefit from the performance improvements of your experiments with little effort.
Note: This feature may be disabled or enabled at any time during your experiment.
You can create an experiment using recommendation cards on the Experiments page, and choose to enable this feature during creation. After you create an experiment, a tooltip on the experiment summary card shows the feature's status. From this tooltip, you can also turn the feature on or off, and the tooltip status updates to reflect your choice.
Additionally, the status column on your "Experiments" page may have one of the following states showing which experiments have been applied:
- Complete (Not applied)
- Complete (Applying…)
- Complete (Applied)
When your experiment is complete, its tooltip state will update to let you know whether or not your changes were applied.
FAQs
1. How do I know if an experiment is favorable and will be directly applied?
- If you're using Max conversions with target cost per action (CPA) bidding, your experiment will be directly applied if conversions in your treatment arm are higher than in your control arm and CPA is lower.
- If you’re using Max conversion value with target return on ad spend (ROAS) bidding, your experiment will be directly applied if the conversion value in your treatment arm is higher than your control arm, with ROAS being higher.
- If you’re using Max conversions or Max conversion value bidding, your experiment will be directly applied if either the conversions or conversion value in your treatment arm are higher than your control arm.
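The three rules above can be sketched as a simple decision function. This is an illustrative reconstruction of the documented conditions, not Google's implementation; the strategy names, metric keys, and figures are hypothetical:

```python
# Illustrative sketch of the documented favorability rules. Google Ads
# performs this check itself; strategy labels and metric keys are made up.

def is_favorable(strategy, control, treatment):
    """Return True if the treatment arm meets the favorability conditions
    described for the given bidding strategy."""
    if strategy == "max_conversions_tcpa":
        # Max conversions with target CPA: more conversions AND lower CPA.
        return (treatment["conversions"] > control["conversions"]
                and treatment["cpa"] < control["cpa"])
    if strategy == "max_conv_value_troas":
        # Max conversion value with target ROAS: more value AND higher ROAS.
        return (treatment["conv_value"] > control["conv_value"]
                and treatment["roas"] > control["roas"])
    if strategy in ("max_conversions", "max_conv_value"):
        # Plain Max conversions / Max conversion value: either metric higher.
        return (treatment.get("conversions", 0) > control.get("conversions", 0)
                or treatment.get("conv_value", 0) > control.get("conv_value", 0))
    raise ValueError(f"unknown strategy: {strategy}")

print(is_favorable("max_conversions_tcpa",
                   {"conversions": 100, "cpa": 12.0},
                   {"conversions": 120, "cpa": 10.5}))  # True
```

Note that the target-based strategies require both conditions to hold, while the plain maximize strategies only require one.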
2. When are the experiment changes applied?
- After your experiment reaches the end date, we’ll identify whether your experiment results were favorable, using the definition above. If we determine that it was favorable, we’ll apply the experiment changes to the control campaign.
3. Will an experiment be applied if it ends manually?
- No, we don't apply any experiments that were ended manually. We only apply experiments that end on the end date you define.