Introduction to Optimize reports
To monitor a running experiment or see the results of a concluded experiment, click the Reporting tab at the top of the experiment detail page. You can also view your results in Google Analytics.
Elements of an Optimize report
Optimize reports are broken down into a series of cards that contain data about your experiment, including its status and how your variants perform against your objectives. The cards are organized with summary information at the top, and more detail as you scroll down the page.
Summary card
The first card in the Reporting tab is the Summary card, which displays the experiment status and a summary of the results. The information you see here is determined by the experiment’s primary objective.
- Improvement (a.k.a. credible level of improvement) – The difference between the modeled conversion rate of a variant and that of the baseline, for a given objective. This is the likely range in which your conversion rates will fall. Optimize uses Bayesian analysis to determine how the variants will perform in the future; improvement is the Bayesian equivalent of a confidence interval.
- Probability to be best – The probability that a given variant performs better than all other variants.
- Probability to beat baseline - The probability that a given variant will result in a conversion rate better than the original's conversion rate. Note that with an original and one variant, the variant's Probability to beat baseline starts at 50 percent (which is just chance).
- Experiment session – Any session in which the experiment executed, plus all subsequent sessions for that user while the experiment is running, even if the user doesn’t see the experiment again. Subsequent sessions are included in Experiment sessions to capture conversions that occur after a user is included in the experiment. Optimize objectives that occur during an Experiment session are included in the experiment reporting and statistics.
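Optimize doesn’t publish its exact model, but the metrics above can be sketched with a standard Beta-Binomial posterior. This hypothetical example (the session and conversion counts are made up) estimates "Probability to beat baseline" and the credible interval of improvement by drawing from each arm’s posterior:

```python
import random

# Hypothetical conversion counts -- illustrative only; Optimize's actual
# model is not public, but a Beta-Binomial posterior with a uniform prior
# is the standard Bayesian treatment of conversion rates.
data = {
    "original": {"sessions": 1000, "conversions": 100},
    "variant":  {"sessions": 1000, "conversions": 120},
}

def posterior_samples(sessions, conversions, n=100_000):
    # Beta(1 + conversions, 1 + non-conversions): posterior under a uniform prior.
    return [random.betavariate(1 + conversions, 1 + sessions - conversions)
            for _ in range(n)]

random.seed(42)
samples = {name: posterior_samples(d["sessions"], d["conversions"])
           for name, d in data.items()}

# Probability to beat baseline: share of posterior draws where the
# variant's conversion rate exceeds the original's.
beats = sum(v > o for v, o in zip(samples["variant"], samples["original"]))
p_beat_baseline = beats / len(samples["variant"])

# Credible interval of improvement: central 95% of the relative difference.
diffs = sorted((v - o) / o
               for v, o in zip(samples["variant"], samples["original"]))
lo, hi = diffs[int(0.025 * len(diffs))], diffs[int(0.975 * len(diffs))]

print(f"P(beat baseline) = {p_beat_baseline:.2f}")
print(f"95% credible interval of improvement: {lo:+.1%} to {hi:+.1%}")
```

With equal traffic and no observed data, the two posteriors are identical, which is why a variant’s Probability to beat baseline starts at 50 percent.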
Improvement overview card
The second card in the Reporting tab is the Improvement overview card, which compares the performance of the original to your variant(s), with their percent improvement against the experiment’s objective(s). Under each variant is the total number of Experiment Sessions to date.
Values are presented as a range that represents the credible interval of improvement. Green values performed clearly better than the original for a given objective; red values performed clearly worse.
Click on a column heading to sort the table by that column. Click it again to switch from ascending to descending order.
Optimize 360 feature: Add Objective
Optimize 360 customers can click ADD OBJECTIVE in the upper-right of the Improvement overview card to add additional objectives. Click the "X" next to an objective name to remove it from the table.
Objective detail card
The third card in the Reporting tab is the Objective detail card, which displays the performance of your variants against an objective picked from the drop-down menu in the upper left (Pageviews, in the example below). Toggle variants on and off in the chart by clicking the blue checkbox next to each variant's name. The chart at the bottom of the card graphs the performance of your variants over time.
The colored areas in the graph at the bottom of the card represent the performance range that your original and variant(s) are likely to fall into 95% of the time. The line in the middle of a range shows the median value of the range.
At the start of an experiment there's greater uncertainty of each variant's performance, resulting in wider intervals (taller in the chart). In most cases, each variant's conversion rate range will narrow over time as the Optimize model takes more data into account, allowing for a better determination of how the variant will perform in the future.
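The narrowing of the interval can be illustrated with the same Beta-Binomial sketch used for conversion rates (an assumption, not Optimize's published model). Holding the observed conversion rate fixed at a hypothetical 10% and increasing the session count shows the 95% credible interval shrinking as more data accumulates:

```python
import random

def interval_width(sessions, rate=0.1, n=50_000, seed=0):
    # Width of the central 95% credible interval for the conversion rate,
    # assuming `rate * sessions` conversions and a uniform prior (illustrative).
    random.seed(seed)
    conversions = int(rate * sessions)
    draws = sorted(random.betavariate(1 + conversions,
                                      1 + sessions - conversions)
                   for _ in range(n))
    return draws[int(0.975 * n)] - draws[int(0.025 * n)]

# More sessions -> tighter interval, i.e. a shorter band in the chart.
for sessions in (100, 1_000, 10_000):
    print(f"{sessions:>6} sessions: interval width = {interval_width(sessions):.3f}")
```

This is why the colored bands in the chart are tallest early in an experiment and typically flatten as Experiment sessions accumulate.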
Analytics view filters
The data used in Optimize reports is subject to any filters applied to the linked Analytics view. However, view filters are not taken into account when determining whether to serve an experiment. For example, if the experiment is linked to an Analytics view that excludes internal IP addresses, internal users will be excluded from your Optimize experiment data. However, those users may still see the experiment execute if they meet the targeting requirements of the experiment. Generally, it’s a good idea to link the experiment to a view with as few filters as possible.
Optimize data is pulled from the underlying Google Analytics data tables and is not sampled, and Optimize does not impose any sampling in its interface. This means that any data you see in Optimize is unsampled, regardless of whether you use Optimize or Optimize 360, or Analytics or Analytics 360.
Learn more about experiment dimensions and accessing your Optimize data in Google Analytics.