Crawl errors

How to diagnose site errors

Verify site access issues

On a well-operating site, the Site errors section of the Crawl Errors page should show no errors (this is true for the large majority of the sites we crawl). If Google detects an appreciable number of site errors, we’ll try to notify you with a message, regardless of the size of your site.

When you first view the Crawl Errors page, the Site errors section shows a quick status code next to each of the three categories: DNS, Server connectivity, and robots.txt fetch. If the status is anything other than a green check mark, you can click the box to see a graph of crawling details for the last 90 days.
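You can spot-check all three categories yourself before digging into the graphs. The sketch below uses only the Python standard library; the hostname passed in is a placeholder for your own domain, and this is a rough approximation of the checks rather than how Google crawls:

```python
import socket
import urllib.error
import urllib.request

def check_site(host, timeout=10):
    """Spot-check the three site-error categories for a hostname."""
    results = {}

    # 1. DNS: can the hostname be resolved to an IP address?
    try:
        results["dns"] = socket.gethostbyname(host)
    except socket.gaierror as e:
        results["dns"] = f"FAILED: {e}"

    # 2. Server connectivity: does the server answer an HTTP request?
    try:
        with urllib.request.urlopen(f"http://{host}/", timeout=timeout) as resp:
            results["server"] = resp.status
    except (urllib.error.URLError, OSError) as e:
        results["server"] = f"FAILED: {e}"

    # 3. robots.txt fetch: reachable (200) or cleanly absent (404) are both
    # fine for crawling; a 5xx response or a timeout is the problem case.
    try:
        with urllib.request.urlopen(f"http://{host}/robots.txt",
                                    timeout=timeout) as resp:
            results["robots"] = resp.status
    except urllib.error.HTTPError as e:
        results["robots"] = e.code  # e.g. 404 is acceptable
    except (urllib.error.URLError, OSError) as e:
        results["robots"] = f"FAILED: {e}"

    return results
```

If the DNS check fails while the others would succeed from another network, the problem may be with your DNS provider rather than your server.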

High error rates

If your site shows a 100% error rate in any of the three categories, it likely means that your site is either down or misconfigured in some way. This could be due to a number of possibilities that you can investigate:

  • Check that a site reorganization hasn't changed permissions for a section of your site.
  • If your site has been reorganized, check that external links still work.
  • Review any new scripts to ensure they are not malfunctioning repeatedly.
  • Make sure all directories are present and haven't been accidentally moved or deleted.
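After a reorganization, one quick way to verify that links still resolve is to script a spot-check over a list of URLs you care about. This is a minimal sketch, not an exhaustive crawler; the URL list you pass in is your own:

```python
import urllib.error
import urllib.request

def check_links(urls, timeout=10):
    """Map each URL to its HTTP status code, or an error string if unreachable."""
    statuses = {}
    for url in urls:
        # HEAD avoids downloading the full page body.
        req = urllib.request.Request(url, method="HEAD")
        try:
            with urllib.request.urlopen(req, timeout=timeout) as resp:
                statuses[url] = resp.status
        except urllib.error.HTTPError as e:
            # e.g. 404 for a page moved without a redirect,
            # or 403 after a permissions change.
            statuses[url] = e.code
        except (urllib.error.URLError, OSError) as e:
            statuses[url] = f"unreachable: {e}"
    return statuses
```

A burst of 404s here usually means pages moved without redirects; a burst of 403s points at the permissions issue described above.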
If none of these situations apply to your site, the error rate might just be a transient spike, or due to external causes (for example, someone has linked to non-existent pages), so there might not be a problem at all. In any case, when we see an unusually large number of errors for your site, we’ll let you know so you can investigate.

Low error rates

If your site has an error rate of less than 100% in any of the categories, it could just indicate a transient condition, but it could also mean that your site is overloaded or improperly configured. You might want to investigate these issues further, or ask about them on our forum. We might alert you even if the overall error rate is very low — in our experience, a well-configured site shouldn’t have any errors in these categories.