
Differences in Optimize and Analytics reports

Why the results you see in Google Analytics may differ from what you see in Optimize

Results reported in Optimize may differ from results reported in Google Analytics because the two products calculate and process metrics differently.

Differences between Optimize and Google Analytics

The experiment data you see in Optimize may be slightly different from that same experiment's data in Analytics. The primary causes of these differences are reporting delays and the methods used to calculate conversion rates.

Reporting delays

All metrics you see in Optimize are first processed by Analytics and then pushed to Optimize. The push to Optimize can take up to 12 hours. As a result, you will generally see more Experiment Sessions in Analytics than in Optimize. Additionally, when you end an experiment, data accrued between the last push to Optimize and your ending the experiment will not be pushed to Optimize, but will be available in Analytics.

Conversion rate calculations

The conversion rate ranges you see in Optimize, shown in both the Improvement overview and Objective detail cards, are ranges of modeled conversion rates. Optimize estimates the true conversion rate for a variant; this estimate may differ from the variant's observed conversion rate, especially early in the experiment. You can expect the future conversion rate of a variant to fall within the range you see in Optimize 95 percent of the time.

In contrast, the conversion rate metrics you see for the same experiment in the Analytics Content Experiments report are an empirical calculation: Conversions / Experiment Sessions. These two values (the modeled conversion rate range in Optimize and the observed conversion rate in Analytics) are expected to differ, and we recommend that you use the values in Optimize for your analysis.
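The difference between the two calculations can be sketched in Python. The observed rate is the simple ratio Analytics reports. The modeled range below is purely illustrative: Optimize does not publish its exact statistical model, so this sketch assumes a Beta-Binomial model with a uniform prior as a stand-in, estimating a 95% credible interval by Monte Carlo sampling.

```python
import random

def observed_rate(conversions, sessions):
    """Empirical conversion rate, as in Analytics: Conversions / Experiment Sessions."""
    return conversions / sessions

def modeled_rate_range(conversions, sessions, level=0.95, draws=100_000, seed=42):
    """Illustrative credible interval for the true conversion rate.

    Assumes a Beta-Binomial model with a uniform Beta(1, 1) prior. This is a
    hypothetical stand-in for Optimize's unpublished model, not its actual
    calculation.
    """
    rng = random.Random(seed)
    # Posterior over the true rate is Beta(conversions + 1, non-conversions + 1).
    samples = sorted(
        rng.betavariate(conversions + 1, sessions - conversions + 1)
        for _ in range(draws)
    )
    tail = (1 - level) / 2
    lo = samples[int(tail * draws)]
    hi = samples[int((1 - tail) * draws) - 1]
    return lo, hi

if __name__ == "__main__":
    conversions, sessions = 30, 1000
    print(f"Observed rate: {observed_rate(conversions, sessions):.4f}")
    lo, hi = modeled_rate_range(conversions, sessions)
    print(f"Modeled range: {lo:.4f} to {hi:.4f}")
```

Early in an experiment, with few sessions, the modeled range is wide even though the observed rate is a single point; as data accrues, the range narrows around the observed rate, which is why the two reports converge over time.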

Why do Optimize and Content Experiments report different leaders?

Optimize and Analytics use different underlying statistical models to calculate leaders, so in many instances they can return different results. We recommend using Analytics experiment reports as a reference point and for directional data, and using Optimize results to decide when to end an experiment and which variant is the leader.
