
Blocked Resources Report

Which parts of my pages are blocked by robots.txt directives?

Googlebot needs access to many of the resources on your page in order to render and index the page optimally. For example, JavaScript, CSS, and image files should be available to Googlebot so that it can see the page as an average user would.

If a site's robots.txt file disallows crawling of these resources, it can affect how well Google renders and indexes the page, which in turn can affect the page's ranking in Google Search.
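For example, a robots.txt file containing rules like the following (the paths here are hypothetical) would prevent Googlebot from fetching any of the site's scripts and stylesheets under those directories:

    User-agent: *
    Disallow: /assets/js/
    Disallow: /assets/css/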

How to read the report

This report shows resources used by your site that are blocked from Googlebot. Not all blocked resources are listed; only those that we think might be under your control.

  1. The report landing page shows a list of hosts that provide resources on your site that are blocked by robots.txt rules. Some resources will be hosted on your own site, and some will be hosted on other sites.
  2. Click on any host in the table to see a list of blocked resources from that host, with a count of pages on your site affected by each blocked resource.
  3. Click on any blocked resource in the table for a list of your pages that load the resource.
  4. Click on any page in the table to see instructions for unblocking that resource on that page, or follow the instructions below under How to unblock your resources.

You can navigate back up this path by clicking the appropriate link in the "breadcrumb path" that appears at the top of the page as you click deeper: Blocked resources > Blocking host > Blocked resource name.
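If you want to spot-check a specific resource outside the report, you can test its URL against the host's robots.txt file yourself. Below is a minimal sketch using Python's standard urllib.robotparser module; the host and resource URLs are placeholders, and keep in mind that Python's parser follows the original robots.txt conventions, which can differ slightly from how Googlebot resolves competing Allow and Disallow rules.

    from urllib.robotparser import RobotFileParser

    # Placeholder URLs; substitute the actual host and resource.
    robots_url = "https://example.com/robots.txt"
    resource_url = "https://example.com/assets/js/app.js"

    parser = RobotFileParser()
    parser.set_url(robots_url)
    parser.read()  # fetch and parse the live robots.txt file

    if parser.can_fetch("Googlebot", resource_url):
        print("Googlebot may fetch:", resource_url)
    else:
        print("Blocked by robots.txt:", resource_url)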

How to unblock your resources

To unblock your resources:

  1. Open the Blocked Resources Report to find a list of hosts of blocked resources on your site. Start with the hosts that you own, since you can directly update their robots.txt files if needed. You might not have control of all the hosts that we list, but change the ones that you can.
  2. Click a host on the report to see a list of blocked resources from that host. Go through the list and start with those that might affect the content and layout in a meaningful way. Less important resources, such as tracking pixels or counters, aren't worth bothering with.
  3. For each resource that affects layout, click to see a list of your pages that use it. Click on any page in the list and follow the pop-up instructions for viewing the difference and updating the blocking robots.txt file (see the example rule change after this list). Fetch and render your page after each change to verify that the resource now appears.
  4. Continue updating resources for a host until you've enabled Googlebot access to all the important blocked resources.
  5. Move on to hosts that you don't own, and if the resources have a strong visual impact, either contact the webmaster of those sites to ask them to consider unblocking the resource to Googlebot, or consider removing your page's dependency on that resource.
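As one illustration of the kind of robots.txt change involved (the paths are hypothetical): if a broad Disallow rule is blocking a stylesheet directory that Googlebot needs, you can add a more specific Allow rule for it. Google honors Allow directives and applies the most specific (longest) matching rule:

    User-agent: Googlebot
    Disallow: /assets/        # other assets remain blocked
    Allow: /assets/css/       # longer match wins, so CSS becomes crawlable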