Reporting differences
The experiment data you see in Optimize may differ slightly from the same experiment's data in Analytics. The primary causes of these differences are reporting delays and the methods used to calculate conversion rates.
Reporting delays
All metrics you see in Optimize are first processed by Analytics and then pushed to Optimize. The push to Optimize can take up to 12 hours. As a result, you will generally see more Experiment Users in Analytics than in Optimize. Additionally, when you end an experiment, data accrued between the last push and the moment you end the experiment is not pushed to Optimize, but it remains available in Analytics.
Conversion rate calculations
The conversion rates shown in Optimize, in both the Improvement overview and Objective detail cards, are ranges of modeled conversion rates. Optimize estimates a variant's true, long-run conversion rate, and that estimate may not match the variant's observed conversion rate, especially early in an experiment. You can expect a variant's future conversion rate to fall within the range shown in Optimize 95 percent of the time.
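Optimize's underlying model isn't published, but the idea of a modeled conversion rate range can be illustrated with a simple Bayesian sketch. The snippet below is a hypothetical illustration only (it is not Optimize's actual calculation): it places a Beta posterior over a variant's conversion rate and reports the central 95 percent interval, showing how the modeled range can differ from the raw observed rate when data is still sparse.

```python
# Hypothetical illustration of a modeled conversion-rate range.
# This is NOT Optimize's actual model; it only shows how a 95% interval
# around a conversion rate can differ from the raw observed rate.
from scipy.stats import beta

def modeled_rate_range(conversions, sessions, prior_a=1.0, prior_b=1.0):
    """Return (low, high) for a central 95% interval under a Beta posterior."""
    a = prior_a + conversions                 # successes plus prior
    b = prior_b + (sessions - conversions)    # failures plus prior
    return beta.ppf(0.025, a, b), beta.ppf(0.975, a, b)

# Example: early in an experiment, a variant has 6 conversions in 80 sessions.
observed = 6 / 80                             # 7.5% observed conversion rate
low, high = modeled_rate_range(6, 80)
print(f"Observed: {observed:.1%}, modeled range: {low:.1%} - {high:.1%}")
# The wide range reflects the uncertainty that remains with little data.
```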
In contrast, the conversion rate metrics you see for the same experiment in the Analytics Content Experiments report are an empirical calculation: Conversions / Experiment Sessions. These two values (the modeled conversion rate range in Optimize and the observed conversion rate in Analytics) are expected to differ, and we recommend that you use the values in Optimize for your analysis.
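For example, a variant that records 50 conversions across 1,000 experiment sessions would appear in Analytics with an observed conversion rate of 50 / 1,000 = 5%, a single point value rather than a range (the figures here are illustrative only).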