Create a multivariate test (MVT) [beta]
Multivariate testing (MVT) helps you understand the interactions between multiple sections of a page by testing them together in a coordinated way.
What is MVT?
A multivariate test (MVT) tests variants of two or more sections of a page simultaneously to see which combination creates the best outcome. Instead of showing which page variant is most effective (as in an A/B test), an MVT identifies the most effective variant of each section and the best combination of those variants.
For example, the multivariate test below is useful for identifying the best headline (H1 vs. H2) and hero image (A vs. B vs. C) combination to use on a landing page.
| Term | Definition |
|---|---|
| Multivariate test | An experiment that tests two or more elements, or sections, to understand their effects on each other. For example, variants of a headline can be tested at the same time as variants of a hero image. Instead of showing which page variant is most effective (as in an A/B experiment), a multivariate test identifies the most effective combination of variants. Rather than the two or three page variants typically found in simple A/B tests, multivariate tests frequently test multiple variants of multiple page elements simultaneously. |
| Section | A single part of a web page (e.g. a headline or hero image) that is modified to create variants. An A/B test includes one section, while a multivariate test includes many sections. |
| Combination | In a multivariate test with multiple sections, a combination is the experience created from each section's variants. For example, combination 1 (above) consists of headline 1 and image A. |
Definitions for these and other testing terms can be found in the Optimize glossary.
Create a multivariate test
To create a multivariate test:
- Go to your Optimize account (Main menu > Accounts).
- Click on a Container to get to the Experiments page.
- Click the + button and select Multivariate test.
- Enter an Experiment name.
- Enter an Editor page. This is the page you’ll use to create variants.
- Pick a Google Analytics view to use with the experiment. Learn more about linking views.
- Click CREATE.
The Google Analytics Property field is automatically populated from your linked Analytics property.
Sections and combinations
After naming your experiment and picking your editor page, you’ll see the experiment details page in draft mode. From here, you can create new variants. Creating variants for MVT uses the same process as creating A/B variants, except that you can create variants for each section.
A section is a single element of a web page (e.g. a button or an image) that is modified to create variants. An A/B test contains one section (with one or more variants). In a multivariate test, multiple sections are tested at the same time (e.g. a button and an image).
After clicking CREATE, Optimize will present you with the section and combination picker. Optimize uses the Editor page that you entered above as the Original and creates two sections (A and B below) to get you started. Multivariate tests require at least two sections, each with at least two variants (the original and one new variant).
Create a new variant
To create a new variant:
- Click + NEW VARIANT under the appropriate section.
- Enter a Variant name (e.g. "Headline 1").
- Click ADD.
- Repeat as necessary.
- When finished, click SAVE.
If you create one variant in the Headline section (“Headline 1”) and two variants in the Image section (“Image 1” and “Image 2”), you’d have a total of six combinations: two headline variants (the original plus one new variant) multiplied by three image variants (the original plus two new variants).
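The combination count generalizes: the total number of combinations is the product of the variant counts (including the original) across all sections. A minimal sketch, using the hypothetical Headline and Image sections from the example above:

```python
from math import prod

# Variants per section, counting the original.
# Headline: original + "Headline 1"            -> 2
# Image:    original + "Image 1" + "Image 2"   -> 3
sections = {"Headline": 2, "Image": 3}

total_combinations = prod(sections.values())
print(total_combinations)  # 6
```

Adding a third section with two variants would double the count to 12, which is why the number of combinations (and the traffic needed to test them) grows quickly as sections are added.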
Hover over a variant and click the Editor button to open it in the Optimize visual editor. To delete a variant, click the more menu next to the variant’s name, then click Delete.
Using the visual editor
Optimize opens the Editor page URL you entered above in the visual editor, which consists of two components: the app bar (at the top of the page) and the editor panel (in the lower right). To create a variant:
- Click on the page element that you wish to edit.
- Use the editor panel to edit it.
- Click SAVE.
- Click DONE.
- Repeat this process for each of your variants.
Learn more about Using the visual editor.
Create a new section
To create a new section:
- Click + NEW SECTION.
- Enter a Section name.
- Click ADD.
- Repeat as necessary.
- When finished, click SAVE.
Change section and variant names by clicking on them and editing the name field.
Click the COMBINATIONS tab to see all of the combinations that will be tested. From here, click the Preview button next to any combination to open it in a new tab and make sure your pages render correctly.
Learn more about Preview mode.
Configure your experiment
Now that you've created variants, configure your objectives and targeting in experiment settings.
Learn more about how to Configure objectives and targeting.
Start your experiment
Click START EXPERIMENT. When the status field says Running, your first experiment is running live on the web. (Most updates happen within a minute.)
How long should your experiment run?
Keep an experiment running until at least one of these conditions has been met:
- At least two weeks have passed, to account for any weekly cyclical variations in your traffic.
- At least one variant has a 95% Probability to beat baseline.
To monitor a running experiment or see the results of a concluded experiment, click the Reporting tab at the top of the experiment detail page. The report page is broken down into a series of cards that contain data about your experiment, including its status and how your variants perform against your objectives.
Unlike A/B and redirect tests where you might deploy a single variation, MVT allows you to deploy a combination of variations. The Optimize MVT report will help you answer the question “Which combination, if any, should I deploy to realize the greatest improvement?”
The first card in the Optimize MVT report is the summary card which displays the status and performance of the experiment. It includes the following:
- Experiment status – Draft, Running, or Ended.
- Leading combination – The current leader.
- Improvement – The difference in conversion rate between the variant and the baseline.
- Probability to beat baseline – The probability that a given variant will result in a conversion rate better than the original's conversion rate.
- Probability to be best – The probability that a given variant performs better than all other variants.
- Experiment sessions – Number of experiment sessions since it was started, displayed as a total and a graph over time.
Definitions for these and other testing terms can be found in the Optimize glossary.
The values in parentheses under improvement are modeled metrics showing the expectation and 95% intervals. Based on the Optimize model, there's a 95% chance that the actual improvement (or conversion rate) will fall within the range in parentheses.
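Optimize doesn't publish its exact model in this article, but the general style of calculation behind a metric like Probability to beat baseline can be illustrated with a simple Bayesian Monte Carlo sketch. Everything here is an assumption for illustration: the conversion counts are made-up, and the Beta-Binomial model with a uniform prior is a common textbook choice, not Optimize's actual model.

```python
import random

def prob_to_beat_baseline(base_conv, base_sessions,
                          var_conv, var_sessions,
                          draws=20000, seed=42):
    """Monte Carlo estimate of P(variant rate > baseline rate),
    assuming independent Beta(1 + conversions, 1 + non-conversions)
    posteriors (i.e. a uniform prior on each conversion rate)."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(draws):
        b = rng.betavariate(1 + base_conv, 1 + base_sessions - base_conv)
        v = rng.betavariate(1 + var_conv, 1 + var_sessions - var_conv)
        if v > b:
            wins += 1
    return wins / draws

# Hypothetical data: baseline converts 50/1000 sessions, variant 70/1000.
p = prob_to_beat_baseline(50, 1000, 70, 1000)
print(round(p, 2))
```

With a real per-combination posterior like this, a 95% credible interval for the conversion rate (the "range in parentheses") narrows as sessions accumulate, which matches the convergence behavior described for the report.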
Objective detail card
The second card in the Optimize MVT report is divided into two parts. The top part displays the performance of your combinations against the objective that you picked during setup (Goal 1 completions, in the example below). You can toggle combinations on and off by clicking the blue checkbox next to each combination name.
The chart at the bottom of the card is your conversion rate over time, which graphs the performance of your combinations. The shaded areas in the graph at the bottom of the objective detail card represent the performance range that your original and combinations are likely to fall into 95% of the time. The line in the middle of a range shows the prevailing direction of your experiment.
You should expect the ranges to converge/shorten over time as more data is accrued. However, that's true of the conversion rate range for both the original and the combination, so a narrowed range isn't directly linked to a higher Probability to beat baseline. The conversion rate range should narrow over time even if a variant's Probability to beat baseline is low/zero.
All conversion rate metrics in Optimize are modeled, and will differ from what you see in the Analytics Content Experiments interface. The actual observed conversion rate is not shown in the Optimize reports, but is exposed in the Analytics Experiments report as the conversion rate metric, which is calculated as conversions divided by experiment sessions.
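The observed metric described above is plain division. A minimal sketch, with made-up numbers:

```python
def observed_conversion_rate(conversions, sessions):
    """Observed (unmodeled) conversion rate, as reported in the
    Analytics Experiments report: conversions / experiment sessions."""
    return conversions / sessions if sessions else 0.0

# Hypothetical: 70 conversions across 1000 experiment sessions.
print(observed_conversion_rate(70, 1000))  # 0.07
```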
Reports in Analytics
In addition to the reports included in Optimize, you can also see Optimize reports in Google Analytics. Sign in to Google Analytics, select the Reporting tab and select Behavior > Experiments in the report navigation.
Learn more about Optimize reports.