Run a video experiment on YouTube

Official checklist for creative experiments for Video campaigns.



Experimentation should be a critical part of any successful marketing strategy. Relying on proven results is how leading marketers stay agile in dynamic markets, craft more effective campaigns at scale, and identify the true impact of their efforts on business results.
 

Don’t just take our word for it. Advertisers that optimized their creative strategy with experiments saw:

  • a 30% lower median CPA from the better-performing creative¹
  • a 60% higher ad recall from the better-performing creative²

Whether you want to understand different ads’ impact on brand metrics, conversions or CPAs, creative experiments for Video campaigns can quickly give you results and insights to improve your performance on YouTube.

This is our official checklist for video experiments. 

1. Set a clear hypothesis 

  • A hypothesis is a question that reveals the reason you’re running the experiment in the first place. It should be tied to your specific business goal. For example, “Which of these two video ads drives the higher conversion rate for my acquisition campaign: a two-minute tutorial or a 15-second direct-offer video?”

Why: A hypothesis will help you determine whether your experiment was successful or not. Don’t evaluate your experiment based on too many metrics, or you risk muddling the results. Ask yourself which ones are most relevant and measurable. Keep in mind that the metrics you choose should serve to validate (or invalidate) your hypothesis. 

2. Create an experiment 

  • After you set a hypothesis, you’re ready to set up an experiment. Remember to only test one variable at a time. 

Why: Testing more than one variable at a time makes it impossible to identify which element drove the better outcome. For example, if you compare two creatives but also use two different bidding strategies, you won’t know whether the creative or the bidding strategy drove the winning arm’s performance. So keep it simple: use two creative variations in a creative experiment and keep everything else (bids, budget, etc.) consistent.

  • Limit your experiment to two arms when possible. 

Why: To ensure your campaign has enough reach, we recommend that most experiments have two arms, with an even distribution of traffic in each arm. No more than four arms per video experiment are permitted. That’s because the more your audience is divided into different experiment arms, the less reach each arm can achieve, making it more difficult to bring in statistically significant results. 
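As a back-of-the-envelope illustration of why fewer arms help, the sketch below (not an official Google Ads tool; the impression counts and the 1% conversion rate are made-up numbers) shows how the uncertainty around a measured conversion rate widens as the same traffic is split across more arms:

```python
import math

def ci_half_width(p, n, z=1.96):
    """Half-width of the 95% confidence interval for a conversion
    rate p measured over n impressions (normal approximation)."""
    return z * math.sqrt(p * (1 - p) / n)

total_impressions = 100_000  # hypothetical total campaign reach
base_rate = 0.01             # hypothetical 1% conversion rate

for arms in (2, 3, 4):
    per_arm = total_impressions // arms
    hw = ci_half_width(base_rate, per_arm)
    print(f"{arms} arms: {per_arm:,} impressions per arm, rate known to ±{hw:.3%}")
```

With the whole audience split two ways, each arm’s measured rate is pinned down more tightly than with a four-way split, so real differences between creatives surface faster.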

Get Started: Follow these steps to set up your video experiment.

3. Let it be 

  • Don’t make changes to targeting, bids or creative while your test is running.

Why: When you make changes to a campaign during an experiment, it becomes difficult to tell which change affected your results. You won’t need to wait long: in most cases, you’ll get results in about a week. However, some experiments need up to four weeks, depending on your budget and the metrics you’re measuring. For example, if your experiment measures conversions, you’ll need at least 100 conversions per experiment arm before significant results can be reported.
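For intuition on why a conversion floor like this matters, here’s a minimal two-proportion z-test sketch (a standard statistical method, not Google’s actual significance calculation; the conversion and impression counts are hypothetical):

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """z-statistic comparing the conversion rates of two experiment
    arms, using the pooled standard error."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical arms: 100 vs. 130 conversions on 10,000 impressions each.
z = two_proportion_z(100, 10_000, 130, 10_000)
print(f"z = {z:.2f}")  # |z| > 1.96 means significant at the 95% level
```

With conversion counts this small, even a 30% relative difference in conversion rate only just clears the 95% significance bar, which is why each arm needs a meaningful number of conversions before a winner can be called.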

4. Take action 

  • Once the experiment is complete, you’ll need to proactively end it. Review your experiment report card, which tells you which experiment arm performed better and can help you make future creative decisions. 

Why: You can learn from all aspects of the experiment. Inconclusive results mean there was no statistically significant difference in performance between your experiment arms. If your results haven’t come in yet, you may need to keep the experiment running a bit longer; results become available once each arm has at least 100 conversions. If you’re using Brand Lift, you can check the study’s progress in the Lift measurement tab. 

  • Build a library of your learnings to help you keep improving through experimentation. 

Why: Keeping central visibility of all experiments (past, ongoing, and upcoming) gives everyone a single source of truth that’s easy to navigate and refer to. Plus, standardizing insights from past experiments will help you tap into them for your next campaign, and track and benchmark the value of your efforts. 

 

1. Source: Google Data, Global, 2019–2020. Successful video experiments were those with a significant difference in Brand Lift between experiment arms.
2. Source: Google Data, Global, 2019–2020. Successful video experiments were those with a significant difference in Brand Lift between experiment arms.

Sign up for the Best Practices newsletter to get advanced Google Ads tips and updates right to your inbox.