Get started with performance insights

This article is for new Play Console beta users. If you're looking for information about classic Play Console, go to the classic Play Console Help menu.

To learn more about the new Play Console beta, read our launch blog post.

Performance insights is a new feature in Android vitals designed specifically for game developers and other developers whose apps use native code. Performance insights is powered by Android Performance Tuner, a new plug-in that supercharges Android vitals for game developers.

This article provides introductory information on performance insights: where and how they’re displayed in the Play Console, and how to interpret them.

First-time use

Data collection starts when you publish your integrated game on Play and users start installing and using it. Once the amount of data reaches a minimum threshold, we’ll display it in the Play Console (Quality > Android vitals > Performance > Insights).

While you’re waiting for the data to display:

  • You’ll receive a prompt to set a target frame rate. You can change this at any time.
  • You can review your quality levels and fidelity parameters on the Overview page, or on any Details page, by selecting Inspect quality levels.

If you don’t see either the Overview page or a message stating that you need to wait until enough data is collected, go to Troubleshoot Android Performance Tuner issues and FAQs.

Understanding the Overview page

Summary metrics

The top metrics (slow frames, total frames, and total sessions) provide a summary of how your game is doing and the size of the dataset being used to generate the insights and metrics.

The “Slow frames” metric in the Overview summary is an overall, game-wide figure:

  • Slow frames (%): The number of frames that were slow, divided by total frames

This overall figure helps you understand and track your user experience over time. However, slow frames measured with Android Performance Tuner are not actionable unless they can be tied to an issue. You can learn more about slow frames and how they’re calculated.
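
The calculation behind this metric can be sketched as follows. All frame times below are made up, and the 20 ms slow-frame threshold is an illustrative assumption (in Android vitals the threshold is derived from your target frame time):

```python
# Illustrative frame times (in milliseconds) for a batch of rendered frames.
frame_times_ms = [12.0, 15.5, 33.0, 16.1, 41.2, 14.8, 18.0, 35.6, 13.3, 17.9]

# Assumed slow-frame threshold; the real one is derived from your
# target frame time rather than hard-coded.
SLOW_THRESHOLD_MS = 20.0

slow_frames = sum(1 for t in frame_times_ms if t > SLOW_THRESHOLD_MS)
total_frames = len(frame_times_ms)

slow_frames_pct = 100.0 * slow_frames / total_frames
print(f"Slow frames: {slow_frames}/{total_frames} = {slow_frames_pct:.0f}%")
# Slow frames: 3/10 = 30%
```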

Device model issues and annotation issues

Issues enable you to take action on your frame time performance. An issue occurs when a device model or annotation does not consistently achieve your frame time target. To identify an issue, we compare your 90th percentile frame times with a predetermined threshold for slow frames. This threshold is derived from your target frame time. You can go to Understand more about Android Performance Tuner to learn more about issues.
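
The comparison described above can be sketched like this. The device names, frame-time samples, and the way the threshold is derived from the target are all illustrative assumptions, not the exact internal logic:

```python
import math

# Hypothetical frame-time samples (ms) per device model; values are made up.
sessions_ms = {
    "device_a": [14, 15, 16, 15, 14, 16, 15, 17, 16, 30],
    "device_b": [22, 25, 28, 24, 26, 27, 23, 29, 25, 31],
}

TARGET_FRAME_TIME_MS = 16.7                            # ~60 fps target
SLOW_FRAME_THRESHOLD_MS = TARGET_FRAME_TIME_MS * 1.25  # assumed derivation

def p90(samples):
    """90th percentile using the nearest-rank method."""
    ordered = sorted(samples)
    return ordered[math.ceil(0.9 * len(ordered)) - 1]

# A device model is flagged as an issue when its 90th percentile frame
# time exceeds the slow-frame threshold.
issues = [name for name, times in sessions_ms.items()
          if p90(times) > SLOW_FRAME_THRESHOLD_MS]
print(issues)  # ['device_b']
```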

The “Impact” metric for each issue shows you what proportion of your total slow frames is associated with that issue. This can be thought of as the set of slow frames that are actionable (i.e. where a potential cause has been identified). Note that some slow frames could be associated with both device model issues and annotation issues, so the impacts may sum to more than 100%.
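
To see why impacts can exceed 100% in total, here is a toy calculation with made-up counts, where many slow frames are attributed to both a device model issue and an annotation issue:

```python
# Hypothetical totals; a slow frame can count toward both a device model
# issue and an annotation issue, so per-issue impacts can overlap.
total_slow_frames = 1000

slow_frames_by_issue = {
    "device model: example_phone @ quality level 2": 450,
    "annotation: level=boss_fight": 700,  # overlaps with the device issue
}

# Impact = slow frames attributed to the issue / total slow frames.
impact = {issue: 100.0 * n / total_slow_frames
          for issue, n in slow_frames_by_issue.items()}

print(sum(impact.values()))  # 115.0 -- impacts can sum to more than 100%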

Time frame selector

You can select three time frames, which are defined as follows:

  • Today: From the most recent UTC midnight until now (a fraction of a day)
  • Yesterday: From UTC midnight yesterday to UTC midnight today (a full 24-hour period)
  • Last 7 days: From UTC midnight six days ago until now (six full days plus ‘today’)
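
Under these definitions, the three windows can be sketched with timezone-aware datetimes. The exact boundary conventions are an interpretation of the descriptions above:

```python
from datetime import datetime, timedelta, timezone

now = datetime.now(timezone.utc)
utc_midnight = now.replace(hour=0, minute=0, second=0, microsecond=0)

# (start, end) pairs for each selectable time frame.
windows = {
    "Today": (utc_midnight, now),                                   # fraction of a day
    "Yesterday": (utc_midnight - timedelta(days=1), utc_midnight),  # full 24 hours
    "Last 7 days": (utc_midnight - timedelta(days=6), now),         # 6 full days + today
}
```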

Device model insights

Device model insights contains three sections:

  • Device model chart
  • Device model issues table
  • Device model opportunities table

Device model chart

The device model chart provides a complete view of your frame time performance for all the device models that are reporting frame times. To understand the chart, note that:

  • Each device model is represented by a distinct circle on the chart at any given quality level.
  • The size of the circle corresponds to the number of sessions counted on that device model. The scale is continuous.
  • The frame time for each device model x quality level is plotted on the graph, using the 90th percentile frame time for that combination.
  • Your target frame time is shown on the chart as a line.
  • Your target range is bounded by your slow frame threshold (on the right) and your fast frame threshold (on the left). Anything to the right of the shaded section is an issue, and anything to the left is an opportunity.
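
The chart’s three regions can be sketched as a simple classification. Both threshold values below are illustrative assumptions around a ~16.7 ms (60 fps) target:

```python
# Hypothetical bounds of the shaded target range (ms); in practice both
# are derived from your target frame time.
FAST_THRESHOLD_MS = 13.0   # left bound of the target range
SLOW_THRESHOLD_MS = 20.0   # right bound of the target range

def classify(p90_frame_time_ms: float) -> str:
    """Place a device model relative to the shaded target range."""
    if p90_frame_time_ms > SLOW_THRESHOLD_MS:
        return "issue"        # right of the shaded section: too slow
    if p90_frame_time_ms < FAST_THRESHOLD_MS:
        return "opportunity"  # left of the shaded section: headroom to spare
    return "within target"
```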

You can search for particular device models from the top right corner of the chart, or browse device models by hovering over the chart and clicking.

Notes:

  • Device metrics are aggregated to the variant level – a more granular level than device model. A variant occurs when the same model could have more than one spec (such as RAM or SoC). This means that searching for a device model by name in the chart may return more than one match on the same quality level. However, the underlying specs for each of the devices shown will be different, which you can see by drilling down to the device issues.
  • If a device model has sessions on more than one quality level, it may be represented in more than one row in the chart. For more information on how this can occur, go to Troubleshoot Android Performance Tuner issues and FAQs.

Device model issues table

The issues table shows the total impact of all underperforming device models on each quality level. It’s an aggregation of the circles on the “Device model” chart that are associated with device models on that quality level which are slower than your slow frame threshold. Each quality level has its own row, unless there are no underperforming devices on that level, in which case no row is shown.

There are two slow frames metrics:

  • Slow frames: The absolute impact of the underperforming device models on each quality level. The quality level with the most slow frames is the top device model issue shown in the “Overview” section.
  • Slow frames (%): The number of slow frames on underperforming device models on this quality level, divided by all frames on this quality level. This helps you understand the performance for this level as a whole, that is, what proportion of frames on this level are ‘slow.’

You can use both metrics for prioritization. For example, you can prioritize based on the absolute number of slow frames, or you can focus on “slow frames %” if you anticipate a shift in user mix over time.

The device model count columns provide an early diagnostic of how to address the issue. If the number of underperforming device models is very close to the total number of device models on this quality level, this signifies that the quality level as a whole is underperforming. In this case, you may want to review the quality level’s fidelity parameters or even consider whether this particular quality level should exist. If the number of underperforming device models is much lower than the total, then you are unlikely to want to touch the quality level itself, only the underperforming device models.
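
This diagnostic can be sketched as a simple heuristic. The 0.8 cutoff is an assumption standing in for “very close to the total number of device models”:

```python
def diagnose(underperforming_models: int, total_models: int,
             cutoff: float = 0.8) -> str:
    """Heuristic version of the diagnostic described above.

    The 0.8 cutoff is an illustrative assumption, not a documented value.
    """
    if underperforming_models / total_models >= cutoff:
        # Most models struggle: the level's fidelity parameters are suspect.
        return "review the quality level itself"
    # Only a few models struggle: adjust those device models instead.
    return "address the underperforming device models individually"
```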

You can drill down into a given quality level by clicking its row in the table to navigate to the Issue details page.

Note: As with the device model chart, a device model may be represented in more than one row in the table, if it has sessions on more than one quality level. Go to Troubleshoot Android Performance Tuner issues and FAQs to learn what might be causing this.

Device model opportunities table

The opportunities table, like the issues table, aggregates the relevant circles on the chart: in this case, device models that exceeded the fast frame threshold. The first two metrics differ: instead of “slow frames,” a “fast frames” metric is displayed. The logic is the same, and as with device model issues, you can compare the last two columns to decide whether to adjust the overall quality level or just the device models.

You can go to Understand more about Android Performance Tuner to learn more about opportunities.

Tip: You can drill down into a given quality level by clicking its row in the table to navigate to the Opportunity details page.

Annotation insights

Annotation insights contains two sections:

  • Annotation chart
  • Annotation issues table

Annotation chart

The annotation chart is very similar to the device models chart. It provides a complete view of your frame time performance for all the annotations on which frame times were reported, including issues and opportunities.

  • Target frame time and frame range are shown on the chart.
  • Each circle on the chart corresponds to a specific annotation type:value. Annotations are defined and counted at this granularity. The size of the circle corresponds to the number of sessions on that annotation value. 
  • The frame time for each annotation x quality level is plotted on the graph, using the 90th percentile frame time for that combination.

You can search for particular annotation types or values from the top right hand corner of the chart, or browse them on the chart by hovering and clicking.

Annotation issues table

The annotation issues table shows the total impact of all underperforming annotations on each quality level. It’s an aggregation across the circles on the annotation chart associated with annotations which are slower than your slow frame threshold.

Here’s what you should know about the annotation issues table:

  • Annotation issues are defined at annotation type:value x quality level granularity.
  • The parent row for each underperforming annotation aggregates across all quality levels for that annotation. Expand the parent row to see the breakdown by quality level.
  • Slow frames: The absolute impact of the underperforming annotations. The annotation with the most slow frames is the same as the top annotation issue shown in the Overview.
  • Slow frames (%): The number of slow frames on underperforming annotations, divided by all frames on the underperforming annotations. This helps you understand the performance for this annotation as a whole (that is, what proportion of frames on the annotation are ‘slow’).
  • Frame time is provided for each annotation issue, along with GPU time if you’re using a game engine that supports it (learn more about frame time). This helps you understand how slow the annotation was at the 90th percentile, and whether the underlying cause was due to CPU or GPU constraints.

You can go to Understand more about Android Performance Tuner to learn more about issues.

Understanding the Details pages (device models only)

Issue details

The Issue details page helps you learn more about the underperforming devices on a specific quality level and decide how to take action. It can help you identify issues with the parameters on the quality levels, or with the devices on the quality levels:

 

Issue: The quality level as a whole is not working well.
How it’s identified: Most or all device models on a given quality level are underperforming (Overview or Device issue detail page).
Suggested next step: Work at the quality level. Change fidelity parameters for the quality level, or remove it completely and map all devices to a new (lower) quality level.

Issue: Specific device models are on too high a quality level.
How it’s identified: A subset of device models on a given quality level is underperforming, but not all (Overview or Device issue detail page).
Suggested next step: Ultimately, the quality level for these device models probably needs to be lowered, but drill deeper to understand the best way to optimize.

Issue: The problem can be isolated to specific device specs.
How it’s identified: Strong correlation between certain specs and issues (Device issue detail page).
Suggested next step: Work at the device-spec level. Example: Test a specific GPU; move all devices with a given GPU down one quality level.

Issue: The problem cannot easily be isolated to a specific spec or specs.
How it’s identified: No clear correlation between certain specs and issues (Device issue detail page).
Suggested next step: Work at the device model level.

 

The Issue details page contains three sections:

  • Summary
  • The device spec breakdown table
  • The device model breakdown table

Summary

The data shown in the issue summary should correspond to the data for that quality level on the Overview page in the device model issues table.

Device spec breakdown table

If there are many underperforming device models, troubleshooting device model issues can be difficult. The device spec breakdown table helps you to identify if optimizations exist at the device spec level. It shows the distribution of underperforming devices by different device attributes.

  • Impact is the relative contribution of the devices on that spec with slow frames, to the total number of slow frames from underperforming devices on this quality level. The larger the number, the more slow frames are associated with these devices.
  • Slow frames (%) describes how that device spec performs in aggregate on this quality level. 
  • Slow frames (%) vs quality level average: This shows how slow frames (%) compares to the overall % slow frames on this level. If a spec outperforms this value, it will have a green bar showing to the left. If a spec lags this value, it will have a red bar showing to the right. 
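
A toy calculation with made-up frame counts shows how impact, slow frames (%), and the comparison against the quality level average fit together (the spec names and numbers are purely illustrative):

```python
# Hypothetical per-spec frame counts on one quality level.
specs = {
    "gpu_x": {"slow": 800, "total": 4000},   # 20% slow frames
    "gpu_y": {"slow": 300, "total": 6000},   #  5% slow frames
}

level_slow = sum(s["slow"] for s in specs.values())     # 1100
level_total = sum(s["total"] for s in specs.values())   # 10000
level_avg_pct = 100.0 * level_slow / level_total        # 11%

for name, s in specs.items():
    pct = 100.0 * s["slow"] / s["total"]        # slow frames (%) for the spec
    impact = 100.0 * s["slow"] / level_slow     # share of the level's slow frames
    delta = pct - level_avg_pct                 # vs quality level average
    bar = "red (lags level average)" if delta > 0 else "green (outperforms)"
    print(f"{name}: impact {impact:.0f}%, slow frames {pct:.0f}% -> {bar}")
```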

Important: The impact column should not be used on its own for prioritization at the spec level. Even if a device spec contributes a lot of slow frames (high impact), this isn’t sufficient information to conclude that the spec itself performs badly, without knowing how many frames there were in total on this spec. The slow frames (%) metric answers this question. Examples:

  • If a spec has high impact, but does not perform significantly worse than the quality level as a whole (low slow frames (%)), then there is unlikely to be an opportunity in optimizing for the spec.
  • Conversely, if a spec performs badly (high slow frames (%)), then it may be sensible to optimize for it directly, even if it isn’t the one with the biggest impact.

Device model breakdown table

The device model breakdown table lists all underperforming devices on this quality level. You can download the information in this table. The downloaded dataset includes additional device metadata that's not displayed in the user interface.

  • Affected sessions is a count of all the sessions on this device on this quality level. This count captures all sessions regardless of how many slow frames they contained. It is possible that a small number of sessions contained all the slow frames.
  • Slow frames (%) shows the proportion of frames on this device model and quality level that were slow. By definition of an underperforming device model, this is at least 10%.
  • Frame time is provided for each device model, along with GPU time if you are on a game engine that supports this. This enables you to understand how slow this device model was at the 90th percentile, and whether the underlying cause was due to CPU or GPU constraints.