Monitor the performance of a video experiment

After you set up a video experiment, you can monitor its performance in Google Ads and find the best-performing video ads in each experiment arm. By understanding which ad performs better in the experiment, you can make an informed decision about which campaign to continue running and allocate more budget to.

This article explains how to monitor and understand the performance of a video experiment.

Instructions

Note: The instructions below are part of the new design for the Google Ads user experience. To use the previous design, click the 'Appearance' icon and select Use previous design. If you're using the previous version of Google Ads, review the Quick reference map or use the search bar in the top navigation panel of Google Ads to find the page that you're looking for.
  1. In your Google Ads account, click the Campaigns icon.
  2. Click the Campaigns drop-down in the section menu.
  3. Click Experiments, then click Video experiments.
  4. Select an experiment to view its results.

    [Image: UI dashboard for monitoring creative experiments for video campaigns]
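If you prefer to pull the same per-arm campaign metrics programmatically, the sketch below uses the Google Ads API Python client. This is an assumption on my part rather than part of the steps above, and the customer ID and credentials file path are placeholders.

```python
# Sketch: fetching per-campaign video metrics with the Google Ads API Python
# client so that experiment arms can be compared outside the UI.
from google.ads.googleads.client import GoogleAdsClient

# Assumes a configured google-ads.yaml with OAuth credentials (placeholder path).
client = GoogleAdsClient.load_from_storage("google-ads.yaml")
ga_service = client.get_service("GoogleAdsService")

# GAQL query for video campaigns over the last 30 days.
query = """
    SELECT
      campaign.name,
      metrics.impressions,
      metrics.conversions,
      metrics.average_cpv
    FROM campaign
    WHERE campaign.advertising_channel_type = 'VIDEO'
      AND segments.date DURING LAST_30_DAYS
"""

# "1234567890" is a placeholder customer ID.
for batch in ga_service.search_stream(customer_id="1234567890", query=query):
    for row in batch.results:
        print(
            row.campaign.name,
            row.metrics.impressions,
            row.metrics.conversions,
            row.metrics.average_cpv,
        )
```

You would still use the Experiments page in the UI to see which campaigns belong to which experiment arm; the query above simply surfaces the underlying campaign metrics.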

Interpreting your results

  • When evaluating results, focus on overall Brand Lift or conversions, as these metrics help you identify significant changes in performance between experiment arms.
  • To generate statistically significant results for conversions, you need at least 100 conversions per experiment arm. If any campaign included in the experiment hasn't received at least 100 conversions, you won't see any results in Google Ads (see the sketch after this list).
  • If the budget is split evenly between experiment arms but one campaign receives more impressions than the other, the campaigns are entering different auctions and winning them at different costs. The campaign that wins more auctions at a lower cost will serve more impressions. An experiment only ensures that the users in one experiment arm don't overlap with the users in another experiment arm.
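The following minimal sketch illustrates the 100-conversion threshold mentioned above: it checks each arm's conversion count before you compare results. The arm names and figures are hypothetical.

```python
# Flag which experiment arms have reached the 100-conversion threshold
# before comparing their results.
MIN_CONVERSIONS_PER_ARM = 100

# Hypothetical per-arm conversion counts.
arm_conversions = {
    "Arm A - original creative": 142,
    "Arm B - new creative": 87,
}

for arm, conversions in arm_conversions.items():
    ready = conversions >= MIN_CONVERSIONS_PER_ARM
    status = "enough data" if ready else "below threshold - results not shown"
    print(f"{arm}: {conversions} conversions ({status})")
```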

Best practices

  • Take action on your results: If one experiment arm shows statistically significant results, you can maximise the impact by pausing the other experiment arms and shifting all the budget to the winning arm.
  • Build on past learnings: For example, if you find that customised video assets for different audience segments perform better than showing the same generic asset to all audiences, use this learning to inform the development of future video assets.
  • Inconclusive results can also be insightful: For example, you may have two creatives that perform equally well in the experiment, but one of the creatives may be cheaper to produce than the other.
