
Core Web Vitals report

Fix poor user experiences on your site

The Core Web Vitals report shows how your pages perform, based on real world usage data (sometimes called field data). 

OPEN REPORT

Understand the report

The Core Web Vitals report shows URL performance grouped by status (Poor, Need improvement, Good), metric type (CLS, INP, and LCP), and URL group (groups of similar web pages).

The report is based on three metrics as measured by actual user data: LCP, INP, and CLS. Once a URL group has a threshold amount of data for both LCP and CLS, the group's status is that of its most poorly performing metric. So, for example, if a URL group has poor CLS but good INP, the group's status is Poor.

If a URL group does not have a minimum amount of reporting data for both LCP and CLS, the group is omitted from the report.

Only indexed URLs can appear in this report. Data is assigned to the actual URL, not to the canonical URL (as it is in most other reports).

Remember that data is combined for all requests from all locations. If you have a substantial amount of traffic from a country with, say, slow internet connections, then your performance in general will go down. You can break down your performance by country using BigQuery if you suspect this might be a cause for low performance.
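For example, the CrUX data that feeds this report is also published as a public BigQuery dataset, so you can query 75th-percentile metrics for your origin split by country. Below is a minimal sketch using the Node.js BigQuery client; the chrome-ux-report.materialized.country_summary table and its column names are assumptions you should verify against the CrUX-on-BigQuery documentation.

```ts
// Sketch: break down CrUX field data for one origin by country.
// Assumption to verify: the `chrome-ux-report.materialized.country_summary`
// table and its columns (origin, country_code, device, yyyymm, p75_lcp).
import {BigQuery} from '@google-cloud/bigquery';

async function lcpByCountry(origin: string, yyyymm: number): Promise<void> {
  const bigquery = new BigQuery();
  const query = `
    SELECT country_code, device, p75_lcp
    FROM \`chrome-ux-report.materialized.country_summary\`
    WHERE origin = @origin AND yyyymm = @yyyymm
    ORDER BY p75_lcp DESC`;
  const [rows] = await bigquery.query({query, params: {origin, yyyymm}});
  for (const row of rows) {
    console.log(`${row.country_code} (${row.device}): p75 LCP = ${row.p75_lcp} ms`);
  }
}

lcpByCountry('https://www.example.com', 202403).catch(console.error);
```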

"No data available"

If you see a "No data available" screen, it means either that your property is new in Search Console, or that there is not enough data available in the CrUX report to provide meaningful information for the chosen device type (desktop or mobile).

If your property is new: The CrUX database gathers information about URLs whether or not the URL is part of a Search Console property, but it can take a few days after a property is created to analyze and post any existing data from the CrUX database.

You can run a live performance test for individual URLs using the PageSpeed Insights testing tool, the Chrome Lighthouse tool, or the AMP Page Experience Guide (for AMP pages).

Navigating the report

For each platform (mobile or desktop), the report shows a table of URLs that have Poor or Need improvement issues (Why URLs aren't considered good), and another table of URLs with all Good scores for LCP, INP, and CLS (View data about good URLs).

  1. See a chart of general trends for all platforms on the landing page.
  2. Drill down by platform (mobile or desktop) by clicking Open report next to one of the charts.
  3. To see how URLs on your site perform, based on historical user data, toggle the Poor, Need improvement, or Good tabs on the performance chart.
  4. View the list of performance issues in the Why URLs aren't considered good table. Each URL shown is a representative of a different URL group.
  5. Click a URL in the Examples table of the issue details page to see more information about that URL group.


Overview page

The overview page of the Core Web Vitals report breaks down the data by the device used to view the URL (Mobile or Desktop). Data is grouped by URL status (Poor, Need improvement, or Good), where the status is that of the worst performing metric for that URL group.

Open the report for a specific device type to see more performance data for that type.

Summary pages for mobile and desktop

The summary report for a platform (mobile or desktop) shows the status and issues for all URL groups on your site for which we have data. Click a row in the details table to learn more about that specific status + issue type combination.

Chart

The tabs above the chart show the current total of URLs (not URL groups) in each status, as well as the number of issues in that status. Toggle the tabs to choose which statuses to show in the chart. The chart shows the count of URLs with a given status on a given day.

Why is the chart total less than the table total?
The chart counts each URL only once, for the slowest issue affecting that URL. The table, in contrast, counts every issue associated with a URL. So if a URL has one Poor and one Need improvement issue, it is counted once as Poor in the chart totals, but is counted in both the Poor and Need improvement rows in the table.
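To picture the difference, here is a small conceptual sketch (not the report's actual code): the chart tallies each URL once under its worst status, while the table tallies every status + issue row the URL appears in.

```ts
// Conceptual sketch (not the report's code): why the chart total can be
// smaller than the table total.
type Status = 'Poor' | 'Need improvement' | 'Good';

interface UrlIssues {
  url: string;
  issues: {metric: string; status: Status}[];
}

// Lower rank = worse status; the chart uses the worst status per URL.
const rank: Record<Status, number> = {'Poor': 0, 'Need improvement': 1, 'Good': 2};

function chartCounts(urls: UrlIssues[]): Record<Status, number> {
  const counts: Record<Status, number> = {'Poor': 0, 'Need improvement': 0, 'Good': 0};
  for (const u of urls) {
    // Each URL is counted exactly once, under its worst (slowest) issue.
    const worst = u.issues.reduce((a, b) => (rank[a.status] <= rank[b.status] ? a : b));
    counts[worst.status]++;
  }
  return counts;
}

function tableCounts(urls: UrlIssues[]): Map<string, number> {
  // Every status + issue combination is counted, so one URL can add to several rows.
  const rows = new Map<string, number>();
  for (const u of urls) {
    for (const issue of u.issues) {
      const key = `${issue.status} / ${issue.metric}`;
      rows.set(key, (rows.get(key) ?? 0) + 1);
    }
  }
  return rows;
}

// A URL with one Poor issue (CLS) and one Need improvement issue (LCP):
const example: UrlIssues[] = [
  {url: '/a', issues: [{metric: 'CLS', status: 'Poor'}, {metric: 'LCP', status: 'Need improvement'}]},
];
console.log(chartCounts(example)); // counted once: { Poor: 1, 'Need improvement': 0, Good: 0 }
console.log(tableCounts(example)); // two rows: 'Poor / CLS' => 1, 'Need improvement / LCP' => 1
```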


Table

The table groups URLs into rows by status + issue. Each row shows the validation state, a sparkline showing a simplified timeline of that row, and the number of URLs currently in that status + issue state.

A URL can appear in multiple table rows if it is affected by multiple issues.

Issue details page for mobile and desktop

Click a table row in the top-level summary page for mobile or desktop to open a details page for that (device + status + issue) combination. The details page shows the URLs and other details for the selected issue.

Chart

The issue details chart shows the count of URLs with that status + issue combination on a given day, as well as the total count of URLs currently affected by the selected status + issue.

Table

The issue details table shows a set of example URLs known to be affected by the selected issue. Each example URL is one of a group of similar URLs.

The table includes the following information:

  • URL: Each row in the table represents a group of similar URLs.
  • For non-good status pages: The appropriate column below will be shown, depending on which issue you are examining. Note that a single URL can be affected by multiple issues, but only the column appropriate for the selected issue is shown.
    • Group INP: 75% of page requests in the last 28 days responded to user interactions in this amount of time or less.
    • Group LCP: 75% of page requests in the last 28 days reached largest contentful paint in this amount of time or less.
    • Group CLS: 75% of page requests in the last 28 days had this cumulative layout shift score or less.
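Each of these group values is a 75th percentile. As a rough illustration of what that means, here is a minimal sketch of computing a 75th percentile from a set of field samples (illustrative only; CrUX derives its percentiles from aggregated histogram data, not from a raw sample list like this).

```ts
// Sketch: 75th percentile of a set of field samples (illustrative only;
// CrUX derives its percentiles from aggregated histogram data).
function percentile(samples: number[], p: number): number {
  const sorted = [...samples].sort((a, b) => a - b);
  const index = Math.max(0, Math.ceil((p / 100) * sorted.length) - 1);
  return sorted[index];
}

// Example: LCP samples in milliseconds from individual page visits.
const lcpSamples = [1800, 2100, 2600, 3900, 1500, 2200, 4500, 2000];
console.log(`p75 LCP: ${percentile(lcpSamples, 75)} ms`); // 2600 ms
```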

Click an example URL to see some other pages in the same group, as well as additional information about the group, and a link to run an external test. The table has a limit of 200 rows.

Additional information
Click a URL in the Examples table of the issue details page to see more information about the page group represented by that URL, including other URLs in the group, and scores for those group members, if the URL has enough data to show.
You can click a URL in the group to run a PageSpeed Insights test against that URL. However, it's useful to understand a few important differences between PageSpeed Insights and Core Web Vitals information:
  • Core Web Vitals combines data and status into URL groups; PageSpeed Insights generally shows data for individual URLs (unless the URL doesn't have enough information by itself). The statistics for a specific URL in PageSpeed Insights might not match the group results in Core Web Vitals, because an individual URL might be an outlier in its group.
  • Core Web Vitals URLs include URL parameters when distinguishing the page; PageSpeed Insights strips all parameter data from the URL, and then assigns all results to the bare URL.

Finding the status of a specific URL

The report is not designed to find the status of a specific URL, but rather to show your site's performance as a whole and to troubleshoot issues affecting multiple pages on your site. If you want to see performance data about a specific URL, use an external test. Although you can drill down on a status and issue and see specific affected URLs, finding a given URL using the Core Web Vitals report can be challenging.
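If you want to run that external test programmatically rather than through a browser, PageSpeed Insights also exposes a public HTTP API. The sketch below calls the v5 runPagespeed endpoint; treat the exact metric key names under loadingExperience.metrics as assumptions to verify against the PageSpeed Insights API reference.

```ts
// Sketch: query the public PageSpeed Insights v5 API for one URL.
// Treat the metric key names under loadingExperience.metrics as
// assumptions to verify against the PageSpeed Insights API reference.
async function checkUrl(url: string, strategy: 'mobile' | 'desktop' = 'mobile'): Promise<void> {
  const endpoint = new URL('https://www.googleapis.com/pagespeedonline/v5/runPagespeed');
  endpoint.searchParams.set('url', url);
  endpoint.searchParams.set('strategy', strategy);

  const res = await fetch(endpoint.toString());
  if (!res.ok) throw new Error(`PageSpeed Insights request failed: ${res.status}`);
  const data = await res.json();

  // Field data (CrUX) for this URL, when enough data exists; PSI may fall
  // back to origin-level data or report no field data at all.
  const field = data.loadingExperience?.metrics ?? {};
  console.log('LCP p75 (ms):', field.LARGEST_CONTENTFUL_PAINT_MS?.percentile);
  console.log('INP p75 (ms):', field.INTERACTION_TO_NEXT_PAINT?.percentile);
  console.log('CLS p75 (x100):', field.CUMULATIVE_LAYOUT_SHIFT_SCORE?.percentile);
}

checkUrl('https://www.example.com/').catch(console.error);
```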

Report data sources

The data for the Core Web Vitals report comes from the CrUX report. The CrUX report gathers anonymized metrics about performance times from actual users visiting your URL (called field data). The CrUX database gathers information about URLs whether or not the URL is part of a Search Console property.

Group status: Poor, Need improvement, Good

The labels Poor, Need improvement, and Good are applied to a URL group for that specific device type. A URL group without threshold data for both LCP and CLS will not be on the report (for example, if the URL only has threshold data for LCP but not CLS, it won't be shown).

The status for a URL group defaults to the slowest status assigned to it for that device type. For example:

  • A URL on mobile with Poor CLS but Need improvement LCP is labeled Poor on mobile.
  • A URL on mobile with Need improvement LCP but Good CLS is labeled Need improvement on mobile.
  • A URL with Good LCP, INP, and CLS on mobile and Need improvement LCP, INP, and CLS on desktop is Good on mobile and Need improvement on desktop.


Status definitions

Here are the performance ranges for each status:

        Good      Need improvement   Poor
LCP     <=2.5s    <=4s               >4s
INP     <=200ms   <=500ms            >500ms
CLS     <=0.1     <=0.25             >0.25
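Expressed as code, these thresholds and the worst-metric rule described earlier look roughly like the following sketch (an illustration of the published thresholds, not Search Console's implementation). Inputs are a group's 75th-percentile values.

```ts
// Sketch of the published Core Web Vitals thresholds and the
// "group status = worst performing metric" rule. Inputs are a group's
// 75th-percentile values: LCP and INP in milliseconds, CLS unitless.
type Status = 'Good' | 'Need improvement' | 'Poor';

function rate(value: number, good: number, needImprovement: number): Status {
  if (value <= good) return 'Good';
  if (value <= needImprovement) return 'Need improvement';
  return 'Poor';
}

function groupStatus(p75LcpMs: number, p75InpMs: number, p75Cls: number): Status {
  const statuses: Status[] = [
    rate(p75LcpMs, 2500, 4000), // LCP: <=2.5s Good, <=4s Need improvement
    rate(p75InpMs, 200, 500),   // INP: <=200ms Good, <=500ms Need improvement
    rate(p75Cls, 0.1, 0.25),    // CLS: <=0.1 Good, <=0.25 Need improvement
  ];
  // The group takes the status of its worst performing metric.
  const order: Status[] = ['Poor', 'Need improvement', 'Good'];
  return order.find(s => statuses.includes(s))!;
}

// Example from earlier in this article: poor CLS but good LCP and INP => Poor.
console.log(groupStatus(2000, 150, 0.3)); // "Poor"
```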


  • LCP (largest contentful paint): The amount of time to render the largest content element visible in the viewport, from when the user requests the URL. The largest element is typically an image or video, or perhaps a large block-level text element. This metric is important because it indicates how quickly a visitor sees that the URL is actually loading.
    • Group LCP shown in the report is the time it takes for 75% of the visits to a URL in the group to reach the LCP state.
  • INP (interaction to next paint): A metric that assesses a page's overall responsiveness to user interactions by observing the time that it takes for the page to respond to all click, tap, and keyboard interactions that occur throughout the lifespan of a user's visit to a page. The final INP value is the longest interaction observed, ignoring outliers. 
    • Group INP shown in the report means that 75% of visits to a URL in this group had this value or better.
  • CLS (cumulative layout shift): CLS measures the sum total of all individual layout shift scores for every unexpected layout shift that occurs during the entire lifespan of the page. The score ranges from zero to any positive number, where zero means no shifting and the larger the number, the more layout shift on the page. This is important because having page elements shift while a user is trying to interact with them is a bad user experience. If you can't seem to find the reason for a high value, try interacting with the page to see how that affects the score.
    • Group CLS shown in the report is the lowest common CLS for 75% of visits to a URL in the group.
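These metrics are gathered from real visitors in the field. If you want to see comparable values from your own traffic, the open-source web-vitals JavaScript library reports them directly; the sketch below is one way to collect them, where the /analytics endpoint is a placeholder for your own logging.

```ts
// Sketch: collect LCP, INP, and CLS from real visitors with the
// open-source web-vitals library. The '/analytics' endpoint is a
// placeholder; send the values to wherever your own logging lives.
import {onCLS, onINP, onLCP, type Metric} from 'web-vitals';

function report(metric: Metric): void {
  const body = JSON.stringify({name: metric.name, value: metric.value, id: metric.id});
  // sendBeacon survives page unload; fall back to fetch with keepalive.
  if ('sendBeacon' in navigator) {
    navigator.sendBeacon('/analytics', body);
  } else {
    fetch('/analytics', {method: 'POST', body, keepalive: true});
  }
}

onLCP(report); // largest contentful paint, in milliseconds
onINP(report); // interaction to next paint, in milliseconds
onCLS(report); // cumulative layout shift, unitless score
```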

You can find recommendations on fixing these issues by running an external test.

URL groups

URLs in the report are grouped into sets of pages that provide a similar user experience. The LCP, INP, and CLS status applies to the entire group. Some outlier URLs might have better or worse values on some visits, but 75% of visits to all URLs in the group experienced the group status shown. It is assumed that pages in a group are built on a common framework, and that any poor behavior of the group is likely caused by the same underlying issues.

In order to respect user privacy, a URL group must have a minimum amount of data to be shown in the report. If a URL group doesn't have enough information to display in the report, Search Console creates a higher-level origin group that should contain enough URLs and data to show in the report. This origin group contains data for all URLs in the same protocol://host:port group. For example, if the URL https://m.example.com/a/b/c.html is part of a group that doesn't have enough data to show, then Search Console will create the origin group https://m.example.com. This origin group contains data for all URLs under https://m.example.com, whether or not the URL also belongs to a group with sufficient data.

A few notes:

  • The origin group definition includes the protocol; thus http://m.il.example.com and https://m.il.example.com are separate origin groups.
  • The origin group contains data for all URLs below that origin, whether or not a URL is part of another group shown in the report.
  • If the origin group doesn't have enough data, then it won't be shown (and, by extension, the site won't have enough data to show in this report, unless there are multiple origin groups).
  • You can view data for the origin group whether or not the group is within the current property. However you can view only example URLs that are within the current property.
  • Search Console lists group members ordered by impressions, in descending order.
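As the notes above describe, an origin group is keyed on protocol://host:port. A quick way to check which origin a given URL falls under is the standard URL API; the example URLs below are placeholders.

```ts
// Sketch: the origin (protocol://host:port) that an origin-level group is
// keyed on, computed with the standard URL API. Example URLs are placeholders.
function originGroup(url: string): string {
  return new URL(url).origin;
}

console.log(originGroup('https://m.example.com/a/b/c.html'));   // "https://m.example.com"
console.log(originGroup('http://m.il.example.com/index.html')); // "http://m.il.example.com"
```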

Fix issues

Non-technical users

  1. Prioritize your issues: We recommend fixing everything labeled "Poor" first, then prioritize your work either by issues that affect the most URLs, or by issues that affect your most important URLs. URLs labeled "Need improvement" could be improved, but are less important to fix than Poor URLs.
  2. When you've sorted by priority, share the report with your engineer or whoever will be updating your URLs.
  3. Common page fixes:
    • Reduce your page size: best practice is less than 500KB for a page and all its resources.
    • Limit a page to 50 resources for best performance on mobile.
    • Use an external test to recommend fixes to your page.
  4. Test your fixes using an external test.
  5. When you think a particular issue is fixed, click Start Tracking on the issue details page in the Search Console Core Web Vitals report.
  6. Track your validation process.
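For the page-size and resource-count guidance in step 3, the browser's Resource Timing API gives a rough picture of a page you have open. The sketch below can be adapted for the DevTools console; transferSize is zero for cached responses and for cross-origin resources that don't send Timing-Allow-Origin, so treat the totals as a lower bound.

```ts
// Sketch: approximate resource count and total transfer size for the page
// currently open in the browser, using the Resource Timing API. transferSize
// is 0 for cached responses and for cross-origin resources without
// Timing-Allow-Origin, so the byte total is a lower bound.
const resources = performance.getEntriesByType('resource') as PerformanceResourceTiming[];
const totalBytes = resources.reduce((sum, r) => sum + r.transferSize, 0);

console.log(`Resources loaded: ${resources.length}`);                       // guidance: <= 50 on mobile
console.log(`Approx. transfer size: ${(totalBytes / 1024).toFixed(0)} KB`); // guidance: < 500 KB for page + resources
```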

Website developers

  1. Prioritize your issues: We recommend fixing everything labeled "Poor" first. URLs labeled "Need improvement" could be improved, but are less important to fix than Poor URLs. Within a status, prioritize issues either by those that affect the most URLs, or those that affect your most important URLs.
  2. URLs shown in a given group are sorted by impressions, descending, so the URLs at the top have the largest effect on the group status. Fix URLs in the order shown for the greatest impact on your status, but we recommend fixing as many URLs as you can. Note that if a group is near the edge of a status, its status might be affected by a few URLs far down the list.
  3. We recommend reading the web.dev fast loading guidelines and the Web Fundamentals performance pages on developers.google.com for theory and guidelines on improving page speed.
  4. Use an external test to recommend fixes to your page.
  5. Test your fixes using an external test.
  6. When you think a particular issue is fixed, click Start Tracking on the issue details page in the Search Console Core Web Vitals report.
  7. Track your validation process.


My site status changed, but I didn't change anything

If you didn't make any changes in your site, but you see a big change in status for a lot of pages, it's possible that you had a borderline status for many pages, and some site-wide event pushed your pages over the edge: for example, your site traffic dramatically increased or the service that serves your image files experienced a latency change, either of which could slow your site down. A small, but site-wide, change might have been just enough to push a bunch of borderline Good pages into the Need improvement category, or from Need improvement to Poor.

Another possible, though less likely, reason is a large-scale change in clients. For example, a widely-adopted browser version update, or an influx of users over a slower network. Remember that performance is measured by actual usage data. You can check your logs to see if any browser, device, or location changes coincide with site status changes.

Check your site traffic data during this period for any big swings, and also drill down into specific issues and look at the group LCP/INP/CLS numbers for affected pages. If these numbers are just at the border for Poor/Need improvement/Good, it's possible that a small change has nudged them into a new status.


Sharing the report

You can share issue details in the coverage or enhancement reports by clicking the Share button on the page. This link grants access only to the current issue details page, plus any validation history pages for this issue, to anyone with the link. It does not grant access to other pages for your resource, or enable the shared user to perform any actions on your property or account. You can revoke the link at any time by disabling sharing for this page.

Exporting report data

Many reports provide an export button to export the report data. Both chart and table data are exported. Values shown as either ~ or - in the report (not available/not a number) will be zeros in the downloaded data.

Validate fixes

When you've fixed a specific issue in all of your URLs, you can confirm the fix. Click Start Tracking to start a 28-day monitoring session that checks for instances of this issue on your site. If the issue is not present in any URLs on your site during the 28-day window, the issue is considered fixed. The presence of that issue in any URL is enough to mark the issue as not fixed; however, the status of individual URLs continues to be evaluated for the entire 28 days, regardless of the issue status.

Start tracking does not trigger re-indexing or any other active behavior from Google. It just (re)starts the clock on a 4-week monitoring period of CrUX data for your site by Search Console.
  • To see the validation details for an in-progress validation request or for a request that has failed:
    • Click See details in the validation status section of the issue details page.
  • To restart the validation tracking period at any time:
    • Open the validation details page and click Start new validation.
  • If validation fails:
    1. Try again to fix your issues.
    2. Restart the tracking period by opening the validation details page, and clicking Start new validation.

Issue validation status

This is the status of the entire validation request, shown for each issue on the summary page, as well as the issue details page.

The following validation statuses are possible:

  • Not started: There are one or more URLs with an instance of this issue that have never been in a validation request.
  • Started: You have begun a validation attempt and no remaining instances of the issue have been found yet.
  • Looking good: You started a validation attempt, and all issue instances that have been checked so far have been fixed.
  • Passed: All URLs are in Passed state. You must have clicked Start Tracking to get to this state (if instances disappeared without you requesting validation, the state would change to N/A).
  • N/A: Google found that the issue was fixed on all URLs, even though you never started a validation attempt.
  • Failed: One or more URLs are in Failed state after a validation attempt.

URL validation status

This is the validation status of each URL in the validation progress page. Pending/Passed/Failed are visible during an active validation period; Failed is the only status visible once the period has ended (fixed items are removed from the list after the period has ended).

  • Pending: Google is awaiting enough data to determine whether or not this URL is still affected.
  • Passed: The URL seems not to be affected by this issue any more.
  • Failed: The URL is still affected by the listed issue.

The Passed and Failed URL statuses can be reached only during a validation tracking period. If the issue appeared and then vanished for a URL outside of a validation request, the URL would simply vanish from the list without a status.

Any URLs that have been removed from the web and have no data in the last 28 days will no longer appear in the validation history or the report.


External testing tools

The Core Web Vitals report links to two external testing tools for additional page tests. The type of tool depends on the type of page:

  • Non-AMP pages: The PageSpeed Insights testing tool reports on the performance of a page on both mobile and desktop devices, and provides suggestions on how that page may be improved. The test shows both live test data and field test data from actual users. Note that the information in PageSpeed Insights might vary from the information in the Core Web Vitals report. Learn why.
  • AMP pages: The AMP Page Experience Guide provides a comprehensive live test for an AMP page, including Core Web Vitals metrics. The test shows both live test data and field test data from actual users.

A link to these tools appears next to example URLs (Summary page > Details table > click a status row > click an example URL > Example details pane, hover over a similar URL), but you can also visit these tools and provide the URL yourself.

You can also use an in-browser test tool for Chrome: the Chrome Lighthouse tool.
