Run A/B tests on your Store Listing

To help optimise your Store Listing on Google Play, you can run experiments to find the most effective graphics and localised text for your app.

For published apps, you can test variants against your current version to see which one performs best, based on install data.

Tip: Before setting up a test, review the best practices for running effective experiments.

Experiment types

For each app, you can run one global experiment or up to five localised experiments at the same time.

Global (for graphics only)

Using a global experiment, you can experiment with graphics in your app's default Store Listing language. You can include variants of your app's icon, feature graphic, screenshots and promo video.

  • If your app's Store Listing is only available in one language: Global experiments will be shown to all users.
  • If you've added any localised graphic assets in a specific language: Users viewing your app in that language are excluded from your app's global experiments. For example, if your app's default language is English and it has a localised feature graphic in French, users viewing your app in French will be excluded from the experiment (even if you're testing your icon).
Localised (for text and graphics)

Using a localised experiment, you can experiment with your app's icon, feature graphic, screenshots, promo video and/or your app's descriptions in up to five languages. Experiment variants will only be shown to users viewing your app's Store Listing in the languages that you choose.

If your app's Store Listing is only available in one language, localised experiments will only be shown to users viewing your app in its default language.

Step 1: Create an experiment

You can create an experiment using your Play Console. When you're ready to review and apply results, you can use your Play Console or the Play Console app.

Global
  1. Sign in to your Play Console.
  2. Select an app.
  3. On the left menu, click Store presence > Store Listing Experiments.
  4. Click New experiment.
  5. Under 'Global', click Create.
  6. Proceed to the instructions under 'Step 2: Set up your experiment'.
Localised
  1. Sign in to your Play Console.
  2. Select an app.
  3. On the left menu, click Store presence > Store Listing Experiments.
  4. Click New experiment.
  5. Under 'Localised', choose a language.
  6. Click Create.
  7. Proceed to the instructions under 'Step 2: Set up your experiment'.

Once you've made a localised experiment live in one language, you can add up to four other languages. To add more experiments:

  1. Go back to your Store Listing Experiments page.
  2. Select New experiment.
    • If you can't select New experiment, make sure that you don't have an experiment saved in draft. Any draft experiments must be made live before you can add new languages.

Step 2: Set up your experiment

After you've created an experiment, you can set up variants and choose the attributes that you want to test.

To set up your experiment:

  1. Follow the on-screen instructions to add your targeting information, attributes and variants. For more information and tips, review the table below.
  2. To begin your experiment, go to the top of the page and click Run experiment. To finish setting up your experiment later, click Save.
See field descriptions, examples & tips
Name
  • Description: Your experiment name is only visible on the Play Console to identify the experiment; it isn't visible to users.
  • Examples:
    • 'Bright icon experiment'
    • 'Logo feature graphic'
    • 'Short description with new slogan'
Audience
  • Description: The percentage of users that see a variant of your experiment. Your audience percentage will be divided equally between your experiment variants.
  • Examples & tips:
    • If you type 30% as your audience, the remaining 70% of visitors to your Store Listing page will see your page's current version.
    • If you have a 30% audience and two variants in your experiment, each variant will be shown to 15% of users (see the sketch below).
    • During the course of an experiment, each user will only see a single variant or your page's current version.
Attributes
  • Description: Select the item type that you want to test compared to your current listing.
  • Tips:
    • To run experiments most effectively, test one attribute at a time.
    • You can only test your short description and full description during a localised experiment.
    • If you're testing graphic assets, make sure to follow size and file type requirements.
Variants
  • Description: To add a new version to your experiment, click Add another variant.
  • Tip: You can add up to 3 variants per experiment, in addition to your current version.
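
To make the audience arithmetic concrete, here is a minimal Python sketch of the split described in the Audience row above. The function name and numbers are illustrative only; they aren't part of the Play Console.

  # Illustrative sketch: an experiment audience is split evenly across variants,
  # and everyone outside the audience sees the current Store Listing.
  def audience_split(audience_pct, num_variants):
      per_variant_pct = audience_pct / num_variants
      current_version_pct = 100 - audience_pct
      return per_variant_pct, current_version_pct

  # Example from the table above: a 30% audience with two variants.
  per_variant, current = audience_split(30, 2)
  print(per_variant)  # 15.0 -> each variant is shown to 15% of users
  print(current)      # 70   -> the remaining 70% see the current version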

Step 3: Review & apply results

After you’ve set up experiments, you'll see the following details on your Store Listing Experiments page. To get more details, select an experiment.

Live experiments

  • Experiment: Name of your experiment
  • Status: Number of variants being served to a percentage of users
  • Start date: When your experiment started
  • Results: Experiment results or data status

Completed experiments

  • Experiment: Name of your experiment
  • Type: Global or localised
  • Start date: When your experiment started
  • End date: When your experiment ended
  • Results: Applied or not applied
Using the Play Console website
  1. Sign in to your Play Console.
  2. Select an app.
  3. On the left menu, click Store presence > Store Listing Experiments.
  4. Select the experiment that you want to review.
  5. Next to 'Metric', select a metric.
  6. View your results.
    • To apply a variant that outperformed your app's current version, select Apply winner. You can review the changes to your Store Listing before the changes go live.
    • To keep your current version, select Keep. This will update your Store Listing to use your current version and end the experiment.
    • If your experiment results in a tie, select Stop experiment.
    • Note: Depending on the metric that you select, the performance results of individual variants may differ. For best results, review all available metrics before deciding on which variant to use.
Using the Play Console app
  1. Open the Play Console app.
  2. Select an app.
  3. Scroll down to the 'Experiments' card. For more information on your app's experiments, tap View details. From the detailed view, tap on a variant to see the audience, current and scaled installs for that experiment.
    • To apply a variant that outperformed your app's current version, select Apply winner. You can review the changes to your Store Listing before the changes go live.
    • To keep your current version, select Keep current. This will update your Store Listing to use your current version and end the experiment.
    • Note: Performance results are based on first-time installers. To view results based on additional metrics, visit the Play Console website.
See descriptions of statistics, metrics & examples

After you select an experiment, you can view user and install metrics that summarise how each variant performed.

User metrics

Metric definitions
First-time installers
  • Unique users who installed your app for the first time during the experiment. Data is scaled to account for audience share.
Retained installers (1-day) – only available on the Play Console website
  • Unique users who installed your app for the first time and kept it for at least 1 day following installation during the experiment. Data is scaled to account for audience share.

Results

You can view metrics that show the results of your experiment in two different ways:

  • Current: Number of unique users
  • Scaled: Number of unique users divided by audience share

If you want to review absolute data on installers, use current data. If you want to review data that’s been scaled to account for different audience shares (example: If 90% of your audience saw one version and 10% of your audience saw another), use scaled data.
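
As a rough illustration of that calculation, here is a minimal Python sketch (the numbers are made up; this isn't part of the Play Console). Scaled data simply divides a variant's current count by the share of the audience that saw that variant.

  # Illustrative sketch: "scaled" = current (absolute) count divided by audience share,
  # so variants shown to different shares of users can be compared fairly.
  def scaled(current_count, audience_share):
      return current_count / audience_share

  # Hypothetical variants: A was shown to 90% of the audience, B to 10%.
  print(scaled(900, 0.9))  # variant A: 900 current installers -> 1000 scaled
  print(scaled(200, 0.1))  # variant B: 200 current installers -> 2000 scaled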

Metric definitions & examples
Installers
  • Number of unique users who installed your app for the first time.
Retained installers
  • Number of unique users who installed your app for the first time and kept it for at least one day following installation.
Performance
  • Estimated change in install performance compared to the current version. There is a 90% chance that the variant would perform within the displayed range over time. Since the range is based on a variant's performance so far, these numbers will vary during the experiment.
  • The average of your variant's high and low estimated changes in install performance represents your variant's most likely change in performance.
  • Example: if one of your variants had a performance range of +5% to +15%, the most likely change in performance would be the middle number between the two, about +10% (see the sketch below).
  • Performance will only be displayed once your experiment has enough data. In general, as an experiment has more time to run and collect data, a variant's performance range will become narrower and more accurate.
Installs (scaled) – deprecated
  • Number of installs during the experiment divided by audience share.
  • Included for experiments started prior to 24 January 2019.
  • For example, if you ran an experiment with two variants that used 90%/10% audience shares and the installs for each variant were A = 900 and B = 200, the scaled installs would be shown as A = 1000 (900/0.9) and B = 2000 (200/0.1).
Installs (current) – deprecated
  • Number of installs that are still installed today.
  • Included for experiments started prior to 24 January 2019.
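
The performance estimate described above can be read the same way. As a rough sketch (the helper function is hypothetical, not a Play Console feature), the most likely change is the midpoint of the displayed range.

  # Illustrative sketch: the most likely change in install performance is the
  # midpoint of the variant's displayed low/high range.
  def estimated_change(low_pct, high_pct):
      return (low_pct + high_pct) / 2

  # Example from the Performance row: a range of +5% to +15%.
  print(estimated_change(5, 15))  # 10.0 -> most likely change of about +10%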

Install metrics (deprecated)

The following metrics were used for experiments started prior to 24 January 2019. Experiments that started prior to this date still include these statistics, but new experiments only use first-time installers and retained installers (1-day) user metrics.

Metric definitions & examples
Installs on active devices
  • Number of active devices on which each variant of the application is currently installed, scaled up to compensate for the different audience levels.
Installs by user
  • Number of unique users who installed each variant of the app in a given day, scaled up to compensate for different audience levels.
Uninstalls by user
  • Number of uninstalls of each variant of the app in a given day, scaled up to compensate for different audience levels.

Sign up for experiment notifications

To receive notifications on the Play Console and by email when your experiments are complete, you can set up your notification preferences.

To learn more about email notifications, go to manage your developer account information.
