
Run experiments

Verify how a change will perform in your network before applying it

What are experiments?

Experiments let you use actual network traffic to test how applying a change will impact revenue. You can compare the performance of impression traffic allocated to an "experiment" group with that of a control group that doesn't receive the rule changes.

To get started, learn how to find opportunities.

Run an experiment

  1. Sign in to DoubleClick for Publishers.
  2. Find an opportunity for which you want to run an experiment, or experiment with native styles.
  3. Click Experiment.
  4. Edit the name of the experiment, or use the default name.
  5. Set a start date and an end date for the experiment. An experiment needs to run for at least 7 days to improve the likelihood of conclusive results.

    You can schedule an experiment to start immediately, or specify a later date. All experiments start at 12:00 am and end at 11:59 pm on the scheduled dates in your local time zone, and data is refreshed daily. If you set the start date to the current day, the experiment will start within the next hour.

  6. Set the percentage of impression traffic to allocate to the experiment. We recommend using the default value of 10% to have a useful comparison with the control group.
  7. Click Start experiment.
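As a rough illustration of step 6, the traffic split can be thought of as a per-impression random draw. DFP doesn't document how it actually allocates impressions, so this sketch is an assumption; the 10% figure is the recommended default from the step above, and the seed is arbitrary, used only to make the simulation repeatable.

```python
import random

def assign_group(allocation_pct=10):
    """Randomly assign one impression to the experiment or control group.

    allocation_pct is the share of traffic (0-100) sent to the experiment
    group; the remainder stays with the control group.
    """
    return "experiment" if random.random() * 100 < allocation_pct else "control"

random.seed(42)  # hypothetical seed, just so the simulation is repeatable
counts = {"experiment": 0, "control": 0}
for _ in range(100_000):
    counts[assign_group(10)] += 1
```

Over 100,000 simulated impressions, `counts["experiment"]` lands close to 10,000, which is why the default leaves a control group large enough for a useful comparison.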

Find and evaluate your experiments

When you run an experiment, your experiment appears in the "Experiments" section, and the opportunity for this experiment disappears from the "Opportunities" page. Once an experiment has begun, you can't change it. Each Ad Exchange rule can only have one experiment running on its targeted inventory at a given time.

  1. Sign in to DoubleClick for Publishers.
  2. Click Reports and then Experiments.

    If you're accessing this feature from Ad Exchange instead of DFP, click Rules and then Optimization and then Experiments.

  3. Click an experiment to see more information. Only active, paused, or completed experiments are listed.

    • Status: Indicates whether the experiment is still running, paused, or completed, along with the specified start and end dates and the time remaining. There are four possible outcomes: the control group is winning, the experiment group is winning, there is insufficient data, or there is sufficient data but the results are inconclusive.

      Why does my experiment display "Result is inconclusive"?
      Experiment results may be inconclusive if there's no difference in revenue between the experiment group and the control group, if the improvement is statistically insignificant (less than a 0.5% difference), or if there isn't sufficient data yet.
       
      Why does my experiment display "Insufficient data"?
      Experiments require a significant amount of data before a recommendation is possible. Run your experiments for at least 7 days to improve the chance of conclusive results.
       
      What determines the winner of an experiment?
      The group with a higher revenue lift wins the experiment.

      For native style experiments, the group with a higher click-through rate (CTR) wins the experiment. A high CTR is an indication of long-term performance, even if short-term revenue for two native styles is similar.

    • Settings: Review the impression traffic allocated to this experiment. 

    • Data: Data collected from the experiment is displayed for your review after the experiment has ended. Click each metric to compare the performance of the control and experiment groups in graph form.

      Coverage: The match rate, calculated as matched queries / total queries.
      eCPM: The average eCPM.
      CTR: The average click-through rate.
      Revenue: Total revenue for the group. The group with the higher revenue lift wins the experiment.

  4. Decide how to proceed with the experiment. If you make no selection, the experiment continues until its scheduled completion date. You can come back to view the results and make a decision at a later time.

    • Click Pause to stop an experiment from running. You can resume the experiment at any time.
    • Click Use experiment to end the experiment and implement the changes for all impression traffic immediately.
    • Click Keep original to stop the experiment and remove it. We'll ask for feedback to help us improve our future recommendations, and we won't suggest this particular experiment to you again.

    Once you apply, decline, or delete an experiment, it no longer appears in the "Experiments" section of Reports.
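The evaluation described above — computing each group's metrics and mapping them to one of the four outcomes — can be sketched roughly as follows. The coverage formula (matched queries / total queries) and the 0.5% significance threshold come from this article; the eCPM and CTR formulas are the standard industry definitions. The function names, the minimum-impressions cutoff, and the sample figures are hypothetical, and DFP's actual statistical test is more involved than a simple threshold.

```python
SIGNIFICANCE_THRESHOLD = 0.005  # 0.5% difference, per the inconclusive-results FAQ
MIN_IMPRESSIONS = 10_000        # hypothetical cutoff; DFP doesn't publish its threshold

def group_metrics(total_queries, matched_queries, impressions, clicks, revenue):
    """Compute the metrics DFP reports for one experiment group."""
    return {
        "coverage": matched_queries / total_queries,  # match rate
        "ecpm": revenue / impressions * 1000,         # revenue per 1,000 impressions
        "ctr": clicks / impressions,                  # click-through rate
        "revenue": revenue,
    }

def experiment_outcome(control, experiment, experiment_impressions):
    """Map two groups' results onto the four possible outcomes.

    Lift is computed on eCPM rather than raw revenue, since the two groups
    receive different shares of traffic (e.g. 90% control vs. 10% experiment).
    """
    if experiment_impressions < MIN_IMPRESSIONS:
        return "insufficient data"
    if control["ecpm"] == 0:
        return "inconclusive"  # no baseline to compute a lift against
    lift = (experiment["ecpm"] - control["ecpm"]) / control["ecpm"]
    if abs(lift) < SIGNIFICANCE_THRESHOLD:
        return "inconclusive"
    return "experiment group is winning" if lift > 0 else "control group is winning"

# Hypothetical totals for a 90% control / 10% experiment split.
control = group_metrics(180_000, 135_000, 126_000, 630, 315.0)
experiment = group_metrics(20_000, 15_500, 14_500, 80, 38.5)
```

With these made-up numbers the experiment group's eCPM lift is about 6%, well above the 0.5% threshold, so `experiment_outcome(control, experiment, 14_500)` reports the experiment group as winning.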
