Monitor the performance of a video experiment

After you set up a video experiment, you can monitor its performance in Google Ads and find the best-performing video ads in each experiment arm. By understanding which ad performs better in the experiment, you can make an informed decision about which campaign to keep running and allocate more budget to.

This article explains how to monitor and understand the performance of a video experiment.

Instructions

  1. Sign in to your Google Ads account.
  2. In the page menu on the left, expand Drafts & experiments and click Video experiments.
    • If you don't see Drafts & experiments in the page menu, click More to expand the available options in the menu.
  3. Select an experiment to view its results.

    [Image: UI dashboard for monitoring creative experiments for video campaigns]

Interpreting your results

  • When evaluating results, focus on the overall Brand Lift or conversions, as these metrics help you find significant changes in performance between experiment arms.
  • To generate statistically significant results for conversions, you need at least 100 conversions per experiment arm. If any campaign included in the experiment hasn’t received at least 100 conversions, you won’t see any results in Google Ads (see the sketch after this list for one way to check conversion counts per campaign).
  • If the budget is split evenly between experiment arms, but one campaign receives more impressions than the other, the campaigns are entering different auctions and winning at different bids. The campaign that wins more auctions at a lower cost will receive more impressions. An experiment only ensures that users in one experiment arm don’t overlap with users in another experiment arm.
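
If you also pull reporting data through the Google Ads API, a minimal sketch like the one below (using the Python client library) can list impressions and conversions per video campaign, so you can check which campaigns clear the 100-conversion guideline before expecting results in the UI. The google-ads.yaml path, customer ID, and date range are placeholder assumptions, not values from this article.

    from google.ads.googleads.client import GoogleAdsClient

    # Assumes credentials are configured in a local google-ads.yaml file.
    client = GoogleAdsClient.load_from_storage("google-ads.yaml")
    ga_service = client.get_service("GoogleAdsService")

    # Impressions and conversions for video campaigns over the last 30 days.
    query = """
        SELECT
          campaign.id,
          campaign.name,
          metrics.impressions,
          metrics.conversions
        FROM campaign
        WHERE segments.date DURING LAST_30_DAYS
          AND campaign.advertising_channel_type = 'VIDEO'
    """

    # Placeholder: replace with the ID of the account that holds the experiment campaigns.
    customer_id = "1234567890"

    for batch in ga_service.search_stream(customer_id=customer_id, query=query):
        for row in batch.results:
            meets_threshold = row.metrics.conversions >= 100
            print(
                f"{row.campaign.name}: {row.metrics.conversions:.0f} conversions "
                f"({'meets' if meets_threshold else 'below'} the 100-conversion guideline)"
            )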

Best practices

  • Take action on your results: If one experiment arm shows statistically significant results, you can maximize the impact by pausing the other experiment arms and shifting all the budget to the winning arm (see the sketch after this list for one way to pause a campaign programmatically).
  • Build on past learnings: For example, if you find that video assets customized for different audience segments perform better than a single generic asset shown to all audiences, use that insight to inform the development of future video assets.
  • Inconclusive results can also be insightful: For example, you may have 2 creatives that perform equally well in the experiment, but one of the creatives may be cheaper to produce than the other.
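
If you make changes through the Google Ads API rather than the Google Ads UI, a minimal sketch along these lines (Python client library) could pause the campaign in a losing experiment arm. The customer and campaign IDs are placeholders, and reallocating budget to the winning arm would be a separate change to that campaign's budget.

    from google.api_core import protobuf_helpers
    from google.ads.googleads.client import GoogleAdsClient

    client = GoogleAdsClient.load_from_storage("google-ads.yaml")
    campaign_service = client.get_service("CampaignService")

    # Placeholder IDs for the account and the campaign in the losing experiment arm.
    customer_id = "1234567890"
    campaign_id = "9876543210"

    # Build an update operation that sets the campaign's status to PAUSED.
    operation = client.get_type("CampaignOperation")
    campaign = operation.update
    campaign.resource_name = campaign_service.campaign_path(customer_id, campaign_id)
    campaign.status = client.enums.CampaignStatusEnum.PAUSED
    client.copy_from(
        operation.update_field_mask,
        protobuf_helpers.field_mask(None, campaign._pb),
    )

    response = campaign_service.mutate_campaigns(
        customer_id=customer_id, operations=[operation]
    )
    print(f"Paused campaign: {response.results[0].resource_name}")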