About Campaign Experiments
AdWords Campaign Experiments allow you to test changes to your account on a portion of the auctions that your ads participate in. Like putting a bit of hot sauce on just part of your meal, Campaign Experiments give you a taste of the results so you can figure out whether you want to pour on the heat! With experiments, you can test changes to your keywords, bids, ad groups, and placements.
The results from experiments can help you make better decisions and help you increase your return on investment.
Here we’ll explain:
- How Campaign Experiments work
- Common goals and elements of experiments
- Campaign Experiments and bidding
- Using AdWords Editor with Campaign Experiments
- Campaign Experiment success stories
How Campaign Experiments work
When you create an experiment, you decide what sort of change you want to test. For example, you could test adding new keywords, raising a bid, trying new ads, or using different placements. Then, you decide what percentage of your auctions should have this experimental change.
Keep in mind that AdWords Campaign Experiments are random-auction experiments. Every time a user searches on Google.com or a search partner site, or loads a webpage on one of our content partners, we randomly decide whether your control split or your experiment split is active for that auction, based on the percentage you set in your experiment settings.
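As a rough illustration, you can think of the random-auction split as a per-auction coin flip weighted by the percentage you choose. This is a simplified sketch, not how AdWords actually implements it, and `choose_split` is a made-up name for the purpose of the example:

```python
import random

def choose_split(experiment_traffic_pct):
    """Illustrative model only: serve the experiment split with the
    probability set in the experiment settings, else the control split."""
    if random.random() < experiment_traffic_pct / 100.0:
        return "experiment"
    return "control"

# With a 50% experiment split, roughly half of all auctions
# use the experimental change, chosen independently each time.
splits = [choose_split(50) for _ in range(10_000)]
share = splits.count("experiment") / len(splits)
print(f"experiment share: {share:.2f}")
```

Because the choice is made per auction, the two splits see comparable traffic over time, which is what makes the comparison fair.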
After the experiment has been running for a short while, you can view the results in the same tables you use to view performance for your campaigns and ads. These tables will also tell you whether your experimental changes are performing significantly better or worse than your original (control) campaign.
At any point, you can choose to end the experiment, cancel the experimental changes, or enable the experimental changes for all relevant ad auctions.
Let's say you're advertising in your city for hot sauce, and you're wondering if you should increase your bids to get more traffic. You'd like to see how such a change would affect your auctions, but a spicy food festival is scheduled to start in your city just two days from now.
If you simply raise your bid and you see that your clicks and impressions increase substantially, you won't know whether that increase came from the changes you made to your bid, or if the increase occurred because there are more people who love habaneros in town.
However, you can set up a campaign experiment to simultaneously use two different bids on the same keywords -- a portion of your auctions will use one bid and the rest will use another bid. This means that when you look at the performance of these two bid sets, the only significant difference will be the bid amount. You'll then be able to tell if the increased traffic was the result of your higher bid or just an unrelated upswing in interest in hot sauces.
Costs of Campaign Experiments
While Campaign Experiments don't cost anything to enable, experiments are treated as changes to your account and will be billed like any other campaign. If you raise your bid, for example, you'll need to pay the costs associated with using that increased bid for whatever portion of traffic it affects.
Campaign Experiments and Quality Score
Campaign Experiments can influence the Quality Score of any keywords involved in your experiments.
Running an experiment might hurt your Quality Score in the short term because you might test ads or bids that perform worse than your current ones. In the long term, however, finding higher-quality ads or a better bid should raise your Quality Score, making up for that short-term drop in performance.
Common goals and elements of Campaign Experiments
While your experiment goal will depend on your business, some common goals for advertisers include:
- Increasing conversions
- Increasing clicks or impressions
- Improving return on investment
- Improving campaign quality
- Improving ad text
To experiment with these goals, here are some things you can test.
If you're using the Google Search Network, you can test:
- New keywords
- New ad text
- New ad groups
- Negative keywords at the ad group level
- Most keyword match types
- Ad group default bids, including max. CPC
- Keyword insertion
If you're using the Google Display Network, you can test:
- Bids on managed placements
- Additional placements
- Additional keywords for contextually-targeted ad groups
- New text ads or display ads
- New ad groups
- Ad group default bids, including max. CPC and max. CPM
- Remarketing options
- Site exclusions
To test some of these elements, you might first need to copy an ad group. Learn how to copy and paste ad groups and ads.
Any campaign settings you choose for your campaign while running an experiment will apply to the entire campaign, not just your experiment or control group. This means you essentially can't test:
- Targeting of any kind, including geographic targeting, language targeting, network targeting, and device targeting
- Bidding features
- Daily budget
- Ad extensions
- Ad scheduling
- Frequency capping
- Negative keywords at the campaign level
In addition, you can't set up experiments with automatic bidding or enhanced CPC because these features work on the campaign level. You'll need to disable these features to run an experiment, and you won't be able to turn them on for any campaigns that are already running an experiment.
Campaign Experiments and bidding
Campaign Experiments and bid management tools
Not all third-party bid management tools work with Campaign Experiments. Let's say you use a bid management tool that isn't compatible with Campaign Experiments, and you change the bid on a keyword that has an experimental bid and is active in both the control and experiment groups. In that case, the bid change will be applied to the control group, and the experiment group's bid will be adjusted by the percentage you previously set.
If you make changes to a keyword that has no experimental bid and is active in both the control and experiment groups, the bids will update normally. Similarly, if you make changes to a keyword that's active only in either the control or experiment group, the bid will update normally.
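The rules above can be summarized in a small decision function. This is a hypothetical model for illustration only; the function name and return shape are our own invention, not part of any AdWords API:

```python
def apply_bid_change(new_bid, has_experimental_bid, in_control, in_experiment,
                     experimental_pct_change):
    """Illustrative model of the bid-update rules described above.
    Returns (control_bid, experiment_bid); None means 'not active there'."""
    if in_control and in_experiment and has_experimental_bid:
        # The change applies to the control group; the experiment group's
        # bid is adjusted by the percentage previously set.
        return new_bid, round(new_bid * (1 + experimental_pct_change / 100.0), 2)
    # Otherwise the bid simply updates normally wherever the keyword is active.
    return (new_bid if in_control else None,
            new_bid if in_experiment else None)

# A keyword in both groups with a +10% experimental bid, updated to $2.00:
print(apply_bid_change(2.00, True, True, True, 10))   # control $2.00, experiment $2.20
```

In other words, the only special case is a keyword that is in both groups and carries an experimental bid; everything else behaves as a normal bid change.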
If your bid management software hasn't yet fully integrated with AdWords Campaign Experiments, we suggest contacting your software provider for more information.
Campaign Experiments and ad scheduling
The advanced ad scheduling bid multiplier is applied at the campaign level, so this multiplier is applied first. If you've made any experimental bid changes in your campaign experiment, those are applied second.
Let's say your ad group default bid is $1.00. On Tuesdays, advanced ad scheduling applies a bid multiplier of +50 percent, so all day Tuesday your bids are $1.50. You then turn on an experiment for this campaign with an experimental bid of +10 percent. On Monday, and Wednesday through Sunday, your bids will be $1.00 in the control group and $1.10 in the experiment group. On Tuesdays, however, your bids will be $1.50 in the control group and $1.65 ($1.50 plus 10 percent) in the experiment group.
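The order of operations in this example can be sketched as follows. This is an illustrative calculation only; `effective_bid` is a made-up helper for the example, not an AdWords function:

```python
def effective_bid(default_bid, schedule_multiplier_pct=0.0, experiment_pct=0.0):
    """Sketch of the ordering described above: the campaign-level ad
    scheduling multiplier applies first, then any experimental bid change."""
    bid = default_bid * (1 + schedule_multiplier_pct / 100.0)
    return round(bid * (1 + experiment_pct / 100.0), 2)

# Most days: $1.00 in the control group, $1.10 in the experiment group.
print(effective_bid(1.00), effective_bid(1.00, experiment_pct=10))
# Tuesdays (+50% scheduling): $1.50 control, $1.65 experiment.
print(effective_bid(1.00, 50), effective_bid(1.00, 50, 10))
```

Because the experimental change multiplies the already-scheduled bid, a +10 percent experiment on a +50 percent Tuesday yields $1.65, not $1.60.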
Campaign Experiments and AdWords Editor
You can manage some aspects of Campaign Experiments using AdWords Editor.
What you can do using AdWords Editor
- Download existing experiments
- Change experimental bids
- Apply and edit an experiment status (e.g. "control only", "experiment only", "control and experiment") at the ad group, ad, or keyword level
- Apply and edit a default max. CPC, Display Network max. CPC, or max. CPM bid multiplier at the ad group level
- Apply and edit a max. CPC bid multiplier at the keyword level
- Change the maximum CPC bid and destination or final URL, add new keywords, change keyword text, and change keyword match type
- Download and upload experiment status and bid multipliers in CSV and XML
- Create, pause, or remove a campaign experiment
- Display segmented statistics for experiment and control groups
Campaign Experiment success stories
Here are stories of two companies that used Campaign Experiments to meet their business goals:
Belnick Inc. (www.bizchair.com) tested their ad creatives and increased their conversion rate by 50 percent. They also reduced their cost-per-conversion by more than 50 percent.
See the case study
SEER Interactive ran a landing page experiment for WisdomTree® and saw a conversion boost of 400 percent.
See the case study