Run A/B experiments on native styles

Maximize your native performance by A/B testing different native styles using Native experiments. Test fonts, colors, layouts, and other updates in a new style against an existing style to see which one will perform better before making a change.

How it works: an existing native style serves as the control of the experiment. You open the existing style and create the experiment style through that style’s settings. Note that the experiment style must be created this way; you can’t use another existing style as the experiment style. After you create and start the experiment style, you can analyze both styles’ performance and decide which settings to keep.

Native experiments can only compare two native styles on existing native placements. You can’t compare banner and native ads in the same ad placement.

If an experiment targets a control native style that mixes both programmatic and reservation traffic, your reservation traffic will be affected.

Run an experiment

  1. In DFP, click Delivery and then Creatives and then Native styles.
    Styles currently running an experiment have an “A/B experiment” label.
  2. Click an existing programmatic native style to use as the control.
    Programmatic native styles have a value of “Ad Exchange and direct sales” under the “Transaction availability” column.
  3. On the settings page for the native style, click Create A/B experiment, above the CSS table.
  4. Enter the settings for the experiment style. This experiment style will be tested against the existing style, or the control.
    • Name the experiment to differentiate it from the original style.
    • Select when the experiment will run.
    • Next to “Traffic allocation,” enter the percentage of impressions to allocate to the experiment style during the experiment. The rest will go to the original style.
      • For example, if you allocate 60% of impressions to the experiment style, the original style will get the remaining 40%.
      • Enter 50% for an equal allocation of impressions between the experiment and original style.
    • Modify the original native style’s HTML and CSS with the changes you think might make the resulting native ads perform better. Learn more about using Change elements and Change template to update the HTML and CSS, respectively.
  5. Click Start experiment.
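The arithmetic behind the “Traffic allocation” setting in step 4 can be sketched as follows (the function name and impression counts are hypothetical, for illustration only):

```python
def split_traffic(total_impressions, experiment_pct):
    """Split impressions between the experiment and original styles.

    experiment_pct is the "Traffic allocation" percentage entered in
    step 4; the original (control) style receives the remainder.
    """
    if not 0 <= experiment_pct <= 100:
        raise ValueError("allocation must be between 0 and 100")
    experiment = total_impressions * experiment_pct / 100
    original = total_impressions - experiment
    return experiment, original

# The example from step 4: a 60% allocation to the experiment style
# leaves 40% for the original style.
print(split_traffic(10_000, 60))  # (6000.0, 4000.0)
```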

Analyze your experiment and take action

After the experiment has run for two days, the system should have enough results. Review the data and decide if you want to refine the experiment's settings.

  1. Do one of the following to access your experiment:
    • In the yellow message bar that appears near the top of the screen immediately after you click Start experiment (step 5 above), click the here link.
    • Above the CSS table, click View A/B experiment.
    • Click Reports and then Experiments.
  2. From the list of experiments, click the Down arrow next to yours.
  3. (Optional) If the experiment is still running, pause it by expanding the “Running” dropdown and clicking Pause.
    When you pause an experiment, 100% of traffic will go to the original (control) native style.
  4. (Optional) Click Preview styles to see what a resulting ad would look like from each style.
  5. Review the data to see how the experiment is performing compared to the original native style.
    Remember to keep the traffic allocation in mind when analyzing the results; the allocation appears in the lower-left corner.
  6. After the experiment ends, all traffic is allocated to the original native style. At any time, you can choose to apply the experiment settings or keep the original style’s settings.
    • Use experiment: The original native style is updated to match the experiment style.
    • Keep original: The original native style retains its settings.
    After you click either of these, the original style receives 100% of traffic allocation, and the experiment native style is deleted.

Understand experiment results

Experiments display the following metrics along with a “+/-% of control” value, which helps you compare the performance between the experiment and control native styles.

For example, an “Experiment revenue” of “$10,000 / +10.0% of control” means the experiment style is estimated to earn $10,000 in revenue, which is 10% more than the estimated revenue for the original (control) style.
  • Experiment revenue
    Net revenue generated from Ad impressions served (with adjustments for Ad Spam and other factors). This amount is an estimate and subject to change when your earnings are verified for accuracy at the end of every month.
  • Experiment eCPM
    Ad revenue per thousand Ad impressions
    Experiment eCPM = Ad revenue / Ad impressions * 1000
  • Experiment CTR
    For standard ads, your ad clickthrough rate (CTR) is the number of ad clicks divided by the number of individual ad impressions expressed as a percentage.
    Experiment CTR = Clicks / Ad impressions * 100
  • Experiment coverage
    The percentage of ads returned compared to the number of ads requested.
    Experiment coverage = Matched requests / Ad requests * 100
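The metric formulas above, together with the “+/-% of control” comparison, can be expressed as short functions (a sketch; the sample figures are hypothetical, not from any real report):

```python
def ecpm(ad_revenue, ad_impressions):
    """Ad revenue per thousand ad impressions."""
    return ad_revenue / ad_impressions * 1000

def ctr(clicks, ad_impressions):
    """Clickthrough rate: clicks per impression, as a percentage."""
    return clicks / ad_impressions * 100

def coverage(matched_requests, ad_requests):
    """Percentage of ad requests that returned ads."""
    return matched_requests / ad_requests * 100

def pct_of_control(experiment_value, control_value):
    """The "+/-% of control" value shown next to each metric."""
    return (experiment_value - control_value) / control_value * 100

# Hypothetical figures: an experiment style earning $11,000 against a
# control earning $10,000 would show "+10.0% of control".
print(pct_of_control(11_000, 10_000))  # 10.0
```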