Explore the Anomalies card (Beta)

This feature is in Beta
Features in Beta phase might not be available in your network. Watch the release notes for when this feature becomes generally available.

You can review anomalies of underperforming or overperforming metrics on the Anomalies card.

About anomalies

Anomalies are unexpected changes in metric performance. Since they defy expectations, they might be worth investigating further. You can use the Anomalies card to filter and explore these events.

Understand impact levels

Anomalies are categorized into high, medium, and low impact levels. Think of impact levels as weighted categories. Each category (high, medium, and low) considers the amount of change and the percentage of traffic impacted. 

If both the amount of change and the percentage of traffic impacted meet the highest thresholds, the anomaly is labeled "High impact." If there is only a minor change to a small amount of traffic, the anomaly is labeled "Low impact."
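
To make this concrete, here is a minimal sketch of how a change and a traffic share might combine into an impact level. The function name, the multiplication, and the thresholds are illustrative assumptions; Ad Manager's actual scoring logic isn't published.

```python
# Illustrative sketch only: the weighting and thresholds below are assumptions,
# not Ad Manager's actual scoring logic.
def impact_level(pct_change: float, traffic_share: float) -> str:
    """Combine the size of the change with the share of traffic affected."""
    score = abs(pct_change) * traffic_share
    if score >= 0.10:
        return "High impact"
    if score >= 0.02:
        return "Medium impact"
    return "Low impact"

print(impact_level(pct_change=-0.40, traffic_share=0.50))  # High impact: big change, lots of traffic
print(impact_level(pct_change=-0.05, traffic_share=0.10))  # Low impact: minor change, little traffic
```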

Check supported metrics

On the card, you can review anomalies for the following metrics:

  • Impressions
  • Clicks
  • eCPM
  • Estimated revenue
  • Matched requests
  • Ad requests
  • Ad Exchange eCPM
  • Ad Exchange clicks
  • Ad Exchange estimated revenue
  • Ad Exchange impressions
  • Unfilled impressions
  • Programmatic unfilled impressions*
  • Programmatic queries*
  • Programmatic match queries*
  • Programmatic match rate
  • Yield group eCPM
  • Yield group estimated revenue
  • Yield group impressions
  • Yield group mediation passbacks

Note: Metrics marked with an asterisk (*) are unique to the Anomalies card and not available separately in Ad Manager reporting.

Use the Anomalies card

  1. Sign in to Google Ad Manager.
  2. On the Overview Home dashboard, navigate to the "Anomalies" card. If the card isn't on your dashboard, you can add it as a data card.
  3. Next to "Filter by," select High impact level only, Negative impact only, or both.
  4. To review specific metrics, click All metrics to expand the list, then make your selections.
  5. Review the card's data:
    • Detected anomaly: A description of the anomaly.
    • Date: The date the anomaly occurred.
    • Metric: The metric affected by the anomaly (see "Check supported metrics" above).
    • Deviation: The difference between actual and expected performance, based on historical data.
    • Impact level: A rating of low, medium, or high, based on how anomalous the point is (how far it falls outside the expected bounds) and how large the impacted segment is (the dimensional slice, sized by weekly impressions). A higher level indicates greater severity.

  6. To review more details for a specific anomaly, click View details.
  7. In the detail view, make your selections:
    • To review metrics, next to "Filter by," select All metrics or Affected metrics only.
      Metrics that show "Within expected range" recorded no anomalies for the given time period. 
    • To view child anomalies, under "Anomaly hierarchy breakdown," click Expand.
    • To change the date range of the time series chart, next to "Full range," click the dropdown and select a range.
      Tip: Hover over the chart for details on a specific date.
    • To run a report, click View in Reporting.
    • To return to the Overview Home dashboard, click Close or Cancel.

FAQ

Causes, impact levels, and expected values

What causes an anomaly?

Many factors can cause an anomaly. Here are a few examples: 

  • Global events such as holidays can have a significant impact.
  • Market factors may shift the end user behavior on a site, causing an anomaly. 
  • Changes made by the publisher may cause a deviation from expected values. 

Not all of these changes are visible to Google. As a result, the card doesn't attribute anomalies to specific actions or events.

What’s the difference between impact levels?

Impact is a product of how extreme the anomaly is and how significant the dimensional slice is to the network as a whole. 

A very large anomaly on a very small section of the network’s traffic will have low impact. Similarly, a small anomaly on a large segment of traffic will also have low impact. However, if the anomaly is sizable and the segment of traffic is sizable, the impact will be large.
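
For illustration (the exact thresholds aren't published): a 60% drop in a metric on a slice that carries only 1% of network traffic would likely rate lower than a 15% drop on a slice that carries 70% of traffic.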

How is the expected value calculated or predicted?

The expected value is calculated based on historical data. It takes into account seasonality and the volatility of the data. 

What is the difference between "Expected value" and "Expected range"? What is the range window as a percent?

The expected value is what the algorithms calculate for the metric based on historical data. However, the confidence in that value may vary based on the volatility of the historical data.

The expected range reflects the confidence in the calculated expected value. Anything that falls outside the expected range is an anomaly. If the historical data is very stable, the expected range will be small and even small deviations will be anomalies. If the historical data is very volatile, the expected range will be large and only larger deviations will be reported as anomalies.
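
As a rough sketch of the idea, the example below builds an expected range from a simple mean plus or minus a multiple of the standard deviation of recent history. The helper names, the window, and the multiplier are assumptions for illustration; Ad Manager's actual model also accounts for seasonality and isn't published.

```python
# Minimal sketch: expected value and expected range as mean +/- k * standard deviation.
# Illustrative only; it ignores seasonality and is not Ad Manager's actual model.
from statistics import mean, stdev

def expected_range(history: list[float], k: float = 3.0) -> tuple[float, float, float]:
    """Return (expected_value, lower_bound, upper_bound) from historical points."""
    expected = mean(history)
    spread = stdev(history)
    return expected, expected - k * spread, expected + k * spread

def is_anomaly(actual: float, history: list[float]) -> bool:
    _, lower, upper = expected_range(history)
    return not (lower <= actual <= upper)

# Stable history -> narrow range: a modest dip is flagged as an anomaly.
stable = [100_000, 101_000, 99_500, 100_200, 100_800, 99_900, 100_400]
print(is_anomaly(92_000, stable))    # True

# Volatile history -> wide range: the same dip stays within the expected range.
volatile = [100_000, 130_000, 80_000, 115_000, 70_000, 125_000, 95_000]
print(is_anomaly(92_000, volatile))  # False
```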

How can I get an anomalous metric back into the expected range?

The root cause of an anomaly varies from case to case. Some are caused by global events, others are caused by user changes. 

The details on the card can help you find issues and analyze a complex dataset faster. 

Schedules and date ranges

How frequently is the data in the card refreshed?

Daily.  

What’s the time range of the anomaly analysis?

Currently, the analysis looks at the previous 100 days.

How long does an anomaly stay on the card after being detected?

As with other data cards, the date range filter on the Overview Home dashboard affects the data shown. Anomalies within the date range specified in the date filter will show on the Anomalies card.

If the anomaly is only a temporary spike, will it remain on the card?

The anomaly algorithm uses only historical data. If a point is anomalous, it will show as an anomaly unless the data changes, such as for a data correction.

Can the anomaly be hidden or skipped if there is no action required or it’s irrelevant?

No, but you can select a more recent date range, so earlier anomalies no longer show on the card.

Permissions and settings

What user roles can view the card?

The card respects teams filtering. If a user has access to the Home page, they can view this data, except for any dimensions filtered by teams.

Can publishers add additional metrics or dimensions?

You can select from the metrics available in the card. Other metrics and dimensions aren’t currently available. Please provide feedback for any additional metrics or dimensions you’d find useful.

Can a publisher turn off the Anomalies card for their network?

As with other cards on the Overview Home dashboard, you can remove the Anomalies card and add it back later.

Data focus, logic, and limits

How does the data shown on the graph differ from forecasting data?

Forecasting focuses on longer-term predictions for reservation traffic, to help publishers monetize their sites more effectively.

Anomaly detection focuses on all traffic and a much broader set of metrics, to surface unusual behavior more quickly.

What is the priority logic of the anomalies shown on the card, and can it be changed?

Similarly, if multiple anomalies are detected in the network, which ones show at the top of the card?

Anomalies are clustered into anomalous events on a given day. All anomalies that we deem part of the same event will show as a single line on the card. The anomaly shown will be the highest impacting metric on the largest dimensional slice (the slice that corresponds to the largest portion of network traffic). 
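
As an illustration of that selection rule, the sketch below picks one representative line for an anomalous event by ranking on impact first and slice size second. The Anomaly record, its fields, and the ranking keys are assumptions made for the example; the exact grouping logic isn't published.

```python
# Illustrative sketch of choosing the representative anomaly for an event:
# highest impact first, then the largest dimensional slice. The data model
# below is assumed for the example, not taken from Ad Manager.
from dataclasses import dataclass

@dataclass
class Anomaly:
    metric: str
    dimension_slice: str   # e.g. a country, ad unit, or device category
    traffic_share: float   # share of network traffic in that slice
    impact_score: float    # how far the point falls outside its expected range

def representative(event: list[Anomaly]) -> Anomaly:
    """Pick the highest-impacting metric on the largest dimensional slice."""
    return max(event, key=lambda a: (a.impact_score, a.traffic_share))

event = [
    Anomaly("Clicks", "Ad unit: sports", traffic_share=0.05, impact_score=2.1),
    Anomaly("eCPM", "Device: mobile", traffic_share=0.45, impact_score=4.8),
    Anomaly("Impressions", "Country: US", traffic_share=0.60, impact_score=4.8),
]
print(representative(event).metric)  # Impressions (tie on impact, larger slice wins)
```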

To focus on anomalies for specific metrics, use the metric filter at the top of the card. To review all the metrics and other dimensional slices related to the anomalous event, click View details.

What’s the limit on the anomalies shown on the card? Does it suppress some anomalies in favor of others?

The card has a minimum threshold for anomalous events, based on the impact metric. Metrics with impact below that threshold don't show. In the detailed view, the threshold is lowered based on the impact of the anomaly being viewed.
