Core Web Vitals report

Fix poor user experiences on your site

The Core Web Vitals report shows how your pages perform, based on real-world usage data (sometimes called field data). You can read more about this initiative on the Google Search Central blog.


Why page performance matters
  • Longer page load times have a severe effect on bounce rates. For example:
    • If page load time increases from 1 second to 3 seconds, bounce rate increases by 32%
    • If page load time increases from 1 second to 6 seconds, bounce rate increases by 106%

Understand the report

The Core Web Vitals report shows URL performance grouped by status, metric type, and URL group (groups of similar web pages).

Only indexed URLs can appear in this report. The URLs shown are the actual URLs for which data was recorded (that is, data is not assigned only to a page's canonical URL, as it is in most other reports).

The report is based on three metrics: LCP, FID, and CLS. If a URL does not have a minimum amount of reporting data for any of these metrics, it is omitted from the report. Once a URL has a threshold amount of data for any metric, the page status is the status of its most poorly performing metric.

"No data available"

If you see a "No data available" screen, it means either that your property is new in Search Console, or that there is not enough data available in the CrUX report to provide meaningful information for the chosen device type (desktop or mobile).

If your property is new: The CrUX database gathers information about URLs whether or not the URL is part of a Search Console property, but it can take a few days after a property is created to analyze and post any existing data from the CrUX database.

You can run a live performance test for individual URLs using the PageSpeed Insights testing tool, the Chrome Lighthouse tool, or the AMP Page Experience Guide (for AMP pages).


Navigating the report

  1. To see how URLs on your site perform, based on historical user data, toggle the Poor, Needs improvement, or Good tabs on the overview page chart.
  2. Click Open Report to see the mobile or desktop summary page, showing page performance numbers for each platform.
  3. Click a row in the table to see details about URL groups affected by the selected issue, including a set of example URLs.
  4. Click a URL in the Examples table of the issue details page to see more information about that URL and similar URLs.


Overview page

The overview page of the Core Web Vitals report breaks down the data by the device used to view the URL: Mobile or Desktop (tablet data is not included). All data is grouped by status (Poor, Needs improvement, or Good). 

Open the report for a specific device type to see more performance data for that type.

Summary pages for mobile or desktop

The top level report for a specific device type (mobile or desktop) shows the status and issues for all URLs on your site for which we have data. Click a row in the details table to learn more about that specific status + issue type combination.


The tabs above the chart show the current total of URLs in each status, as well as the number of issues in that status. Toggle the tabs to choose which statuses to show in the chart. The chart shows the count of URLs with a given status on a given day.

Why is the chart total less than the table total?
The chart counts each URL only once, for the slowest issue affecting that URL. The table, in contrast, counts every issue associated with a URL. So if a URL has one Poor and one Needs improvement issue, it is counted once as Poor in the chart totals, but is counted in both Poor and Needs improvement rows in the table.
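The two counting rules can be sketched as follows. This is a minimal illustration only; the mapping of URLs to issue statuses is a made-up input shape, not the report's internal format:

```python
from collections import Counter

# Severity order used throughout the report: Poor is the "slowest" status.
SEVERITY = {"Good": 0, "Needs improvement": 1, "Poor": 2}

def chart_totals(url_issues):
    """Chart counting: each URL counted once, under its worst status."""
    totals = Counter()
    for statuses in url_issues.values():
        worst = max(statuses, key=SEVERITY.get)
        totals[worst] += 1
    return totals

def table_totals(url_issues):
    """Table counting: every issue counted, so one URL can appear in several rows."""
    totals = Counter()
    for statuses in url_issues.values():
        totals.update(statuses)
    return totals

# One URL with a Poor issue and a Needs improvement issue:
issues = {"/products/widget": ["Poor", "Needs improvement"]}
```

With this input, the chart counts the URL once (as Poor), while the table counts it in both the Poor and Needs improvement rows.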



The table groups URLs into rows by status + issue. Each row shows the validation state, a sparkline showing a simplified timeline of that row, and the number of URLs currently in that status + issue state.

A URL can appear in multiple table rows if it is affected by multiple issues.

Issue details page for mobile or desktop

Click a table row in the top-level summary page for mobile or desktop to open a details page for that (device + status + issue) combination. The details page shows the URLs and other details for the selected issue.


The issue details chart shows the count of URLs with that status + issue combination on a given day, as well as the total count of URLs currently affected by the selected status + issue.


The issue details table shows a set of example URLs known to be affected by the selected issue. Each example URL is one of a group of similar URLs.

The table includes the following information:

  • URL: Each row in the table represents a group of similar URLs.
  • For non-good status pages: The appropriate column below will be shown, depending on which issue you are examining. Note that a single URL can be affected by multiple issues, but only the column appropriate for the selected issue is shown.
    • Agg FID: In the last 28 days, 75% of visits to the page responded to the user's first input within this time or less.
    • Agg LCP: In the last 28 days, 75% of visits to the page reached largest contentful paint within this time or less.
    • Agg CLS: In the last 28 days, 75% of visits to the page had this cumulative layout shift score or less.

Click an example URL to see some other pages in the same group, as well as additional information about the group, and a link to run an external test. The table has a limit of 200 rows.


Finding the status of a specific URL

The report is not designed to find the status of a specific URL, but rather to show your site's performance as a whole and to troubleshoot issues that affect multiple pages on your site. If you want to see performance data for a specific URL, use an external test. Although you can drill down into a status and issue to see specific affected URLs, finding a given URL using the Core Web Vitals report can be challenging.

Report data sources

The data for the Core Web Vitals report comes from the CrUX report. The CrUX report gathers anonymized metrics about performance times from actual users visiting your URL (called field data). The CrUX database gathers information about URLs whether or not the URL is part of a Search Console property.

Status: Poor, Needs improvement, Good

The labels Poor, Needs improvement, and Good are applied to a URL on a specific device type.

URL status

A URL's status is the slowest status assigned to it for that device type. So:

  • A URL on mobile with Poor FID but Needs improvement LCP is labeled Poor on mobile.
  • A URL on mobile with Needs improvement LCP but Good FID is labeled Needs improvement on mobile.
  • A URL on mobile with Good FID and CLS but no LCP data is considered Good on mobile.
  • A URL with Good FID, LCP, and CLS on mobile and Needs improvement FID, LCP, and CLS on desktop is Good on mobile and Needs improvement on desktop.

If a URL has less than a threshold amount of data for a given metric, that metric is omitted from the report for that URL. A URL with data for only one metric is assigned the status of that metric. A URL without threshold data for any metric does not appear in the report.
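The worst-metric rule above can be sketched as a small function. The dictionary input shape is an assumption for illustration; None stands in for a metric without enough data:

```python
# Severity order: Poor is the "slowest" status.
SEVERITY = {"Good": 0, "Needs improvement": 1, "Poor": 2}

def url_status(metric_statuses):
    """A URL's status on a device is the status of its most poorly
    performing metric. Metrics without threshold data (None) are
    omitted; a URL with no metric data at all is not reported."""
    known = [s for s in metric_statuses.values() if s is not None]
    if not known:
        return None  # URL does not appear in the report
    return max(known, key=SEVERITY.get)
```

For example, a URL with Poor FID but Needs improvement LCP comes out as Poor, matching the first bullet above.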

Status definitions

Status metrics are evaluated against the following boundaries:

          Good        Needs improvement   Poor
  LCP     <= 2.5s     <= 4s               > 4s
  FID     <= 100ms    <= 300ms            > 300ms
  CLS     <= 0.1      <= 0.25             > 0.25
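These boundaries can be expressed as a small classification helper. This is an illustrative sketch; values use the units from the table above (seconds for LCP, milliseconds for FID, a unitless score for CLS):

```python
def classify(metric, value):
    """Classify a Core Web Vitals value against the published boundaries.
    metric is "LCP" (seconds), "FID" (milliseconds), or "CLS" (score)."""
    thresholds = {
        "LCP": (2.5, 4.0),   # seconds
        "FID": (100, 300),   # milliseconds
        "CLS": (0.1, 0.25),  # unitless score
    }
    good, needs_improvement = thresholds[metric]
    if value <= good:
        return "Good"
    if value <= needs_improvement:
        return "Needs improvement"
    return "Poor"
```

Note that the boundaries are inclusive on the Good side: an LCP of exactly 2.5s is still Good.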


  • LCP (largest contentful paint): The amount of time to render the largest content element visible in the viewport, from when the user requests the URL. The largest element is typically an image or video, or perhaps a large block-level text element. This is important because it tells the reader that the URL is actually loading.
    • Agg. LCP (aggregated LCP) shown in the report is the time it takes for 75% of the visits to a URL in the group to reach the LCP state.
  • FID (first input delay): The time from when a user first interacts with your page (clicking a link, tapping a button, and so on) to the time when the browser responds to that interaction. The measurement is taken from whichever interactive element the user clicks or taps first. This is important on pages where the user needs to do something, because it marks the moment when the page becomes interactive.
    • Agg. FID (aggregated FID) shown in the report means that 75% of visits to a URL in this group had this value or better.
  • CLS (cumulative layout shift): CLS measures the sum of all individual layout shift scores for every unexpected layout shift that occurs during the entire lifespan of the page. The score ranges from zero upward, where zero means no shifting and larger values mean more layout shift on the page. This is important because page elements that shift while a user is trying to interact with them make for a bad user experience. If you can't find the reason for a high value, try interacting with the page to see how that affects the score.
    • Agg. CLS (aggregated CLS) shown in the report means that 75% of visits to a URL in the group had this value or better.
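The "Agg" values above are 75th-percentile figures: the value that 75% of visits meet or beat. A minimal percentile sketch, using the nearest-rank method (the actual CrUX aggregation may compute its percentile differently):

```python
import math

def p75(samples):
    """75th percentile by the nearest-rank method: sort the samples and
    take the value at the ceiling of 0.75 * n (1-based rank)."""
    ordered = sorted(samples)
    rank = math.ceil(0.75 * len(ordered))
    return ordered[rank - 1]

# Hypothetical LCP samples in seconds for one URL group:
lcp_seconds = [1.2, 1.8, 2.1, 2.4, 6.0, 1.5, 2.0, 2.2]
```

For these samples p75 is 2.2s, so the group would be Good for LCP even though one visit took 6 seconds; a single slow outlier does not move the aggregate.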

You can find recommendations on fixing these issues by running an external test.

URL groups

An issue is assigned to a group of URLs that provide a similar user experience, on the assumption that performance issues in similar pages are probably due to the same underlying problem, such as a common slow-loading feature shared by those pages.
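As a purely hypothetical illustration of the idea, pages could be bucketed by URL structure. Search Console does not document its grouping algorithm, and this heuristic is not its actual method:

```python
from urllib.parse import urlparse

def group_key(url, depth=1):
    """Hypothetical grouping heuristic: bucket URLs by their leading
    path segment(s), on the theory that pages under the same section
    share a template and therefore similar performance traits."""
    segments = [s for s in urlparse(url).path.split("/") if s]
    return "/" + "/".join(segments[:depth])
```

Under this sketch, /products/widget-1 and /products/widget-2 would land in the same group and share any issue assigned to it.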

Fix issues

Non-technical users

  1. Prioritize your issues: We recommend fixing everything labeled "Poor" first, then prioritize your work either by issues that affect the most URLs, or by issues that affect your most important URLs. URLs labeled "Needs improvement" could be improved, but are less important to fix than Poor URLs.
  2. When you've sorted by priority, share the report with your engineer or whoever will be updating your URLs.
  3. Common page fixes:
    • Reduce your page size: best practice is less than 500KB for a page and all its resources.
    • Limit the number of page resources to 50 for best performance on mobile.
    • Consider using AMP, which almost guarantees good page loading on both mobile and desktop.
    • Use an external test to recommend fixes to your page.
  4. Test your fixes using an external test.
  5. When you think a particular issue is fixed, click Start Tracking on the issue details page in the Search Console Core Web Vitals report.
  6. Track your validation process.

Website developers

  1. Prioritize your issues: We recommend fixing everything labeled "Poor" first, then prioritize your work either by issues that affect the most URLs, or by issues that affect your most important URLs. URLs labeled "Needs improvement" could be improved, but are less important to fix than Poor URLs.
  2. We recommend reading the fast loading guidelines and the Web Fundamentals performance pages for theory and guidelines on improving page speed.
  3. Use an external test to recommend fixes to your page.
  4. Test your fixes using an external test.
  5. When you think a particular issue is fixed, click Start Tracking on the issue details page in the Search Console Core Web Vitals report.
  6. Track your validation process.


My site status changed, but I didn't change anything

If you didn't make any changes in your site, but you see a big change in status for a lot of pages, it's possible that you had a borderline status for many pages, and some site-wide event pushed your pages over the edge: for example, your site traffic dramatically increased or the service that serves your image files experienced a latency change, either of which could slow your site down. A small, but site-wide, change might have been just enough to push a bunch of borderline Good pages into the Needs improvement category, or from Needs improvement to Poor.

Another possible, though less likely, reason is a large-scale change in clients. For example, a widely-adopted browser version update, or an influx of users over a slower network. Remember that performance is measured by actual usage data. You can check your logs to see if any browser, device, or location changes coincide with site status changes. 

Check your site traffic data during this period for any big swings, and also drill down into specific issues and look at the aggregate LCP/FID/CLS numbers for affected pages. If these numbers are just at the border for Poor/Needs improvement/Good, it's possible that a small change has nudged them into a new status.


Sharing the report

You can share issue details in the coverage or enhancement reports by clicking the Share button on the page. This link grants access only to the current issue details page, plus any validation history pages for this issue, to anyone with the link. It does not grant access to other pages for your resource, or enable the shared user to perform any actions on your property or account. You can revoke the link at any time by disabling sharing for this page.

Exporting report data

Many reports provide an export button to export the report data. Both chart and table data are exported. Values shown as either ~ or - in the report (not available/not a number) will be zeros in the downloaded data.
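The substitution described above can matter when post-processing downloaded data. A small sketch that mirrors the rule for values copied from the report UI (assumes numeric columns; the '~'/'-' markers are the ones the report displays):

```python
def exported_value(cell):
    """Mirror the export rule: '~' and '-' (not available / not a
    number) become zero; everything else is parsed as a number."""
    if cell in ("~", "-"):
        return 0.0
    return float(cell)
```

Keep this rule in mind when averaging exported columns: zeros that stand for "no data" can drag an average down.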

Validate fixes

When you think you've fixed a specific issue in all of your URLs, you can confirm the fix. Click Start Tracking to start a 28-day monitoring session that checks for instances of this issue on your site. If the issue is not present in any URL on your site during the 28-day window, the issue is considered fixed. The presence of the issue in any URL is enough to mark the issue as not fixed; however, the status of individual URLs continues to be evaluated for the entire 28 days, regardless of the issue status.

Start tracking does not trigger re-indexing or any other active behavior from Google. It just (re)starts the clock on a 4-week monitoring period of CrUX data for your site by Search Console.
  • To see the validation details for an in-progress validation request or for a request that has failed:
    • Click See details in the validation status section of the issue details page.
  • To restart the validation tracking period at any time:
    • Open the validation details page and click Start new validation.
  • If validation fails:
    1. Try again to fix your issues.
    2. Restart the tracking period by opening the validation details page and clicking Start new validation.

Issue validation status

This is the status of the entire validation request, shown for each issue on the summary page, as well as the issue details page.

The following validation statuses are possible:

  • Not started: There are one or more URLs with an instance of this issue that have never been in a validation request.
  • Started: You have begun a validation attempt and no remaining instances of the issue have been found yet.
  • Looking good: You started a validation attempt, and all issue instances that have been checked so far have been fixed.
  • Passed: All URLs are in Passed state. You must have clicked Start Tracking to get to this state (if instances disappeared without you requesting validation, the state would change to N/A).
  • N/A: Google found that the issue was fixed on all URLs, even though you never started a validation attempt.
  • Failed: One or more URLs are in Failed state after a validation attempt.

URL validation status

This is the validation status of each URL in the validation progress page. Pending/Passed/Failed are visible during an active validation period; Failed is the only status visible once the period has ended (fixed items are removed from the list after the period has ended).

  • Pending: Google is awaiting enough data to determine whether or not this URL is still affected.
  • Passed: The URL seems not to be affected by this issue any more.
  • Failed: The URL is still affected by the listed issue. 

The Passed and Failed URL statuses can be reached only during a validation tracking period. If the issue appeared and then vanished for a URL outside of a validation request, the URL would simply vanish from the list without a status.

Any URLs that have been removed from the web and have no data in the last 28 days will no longer appear in the validation history or the report.


External testing tools

The Core Web Vitals report links to two external testing tools for additional page tests. The type of tool depends on the type of page:

  • Non-AMP pages: The PageSpeed Insights testing tool reports on the performance of a page on both mobile and desktop devices, and provides suggestions on how that page may be improved. The test shows both live test data and field test data from actual users.
  • AMP pages: The AMP Page Experience Guide provides a comprehensive live test for an AMP page, including Core Web Vitals metrics and Page experience metrics. The test shows both live test data and field test data from actual users.

A link to these tools appears next to example URLs (Summary page > Details table > click a status row > click an example URL > Example details pane > hover over a similar URL), but you can also visit these tools and provide the URL yourself.

You can also use an in-browser test tool for Chrome: the Chrome Lighthouse tool.
