The Crawl Errors page provides details about the URLs in your site that Google could not successfully crawl or that returned an HTTP error code.
To view crawl errors:
- On the Webmaster Tools Home page, click the site you want.
- Click Health, and then click Crawl Errors.
This page lists two types of errors:
- Site errors: This section lists errors that prevent Googlebot from accessing your site at all.
- URL errors: This section lists errors Googlebot encountered when trying to crawl specific URLs. You can search for specific URLs or errors.
For each URL we had difficulty crawling, we'll show the type of problem and, where possible, the page or pages on which we found the error. The most important URLs are listed first.
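URL errors often boil down to the HTTP status code a page returned when Googlebot requested it. As a rough illustration, here is a minimal sketch of how status codes might map onto crawl-error categories; the function name and category labels are hypothetical and do not reflect Google's exact internal taxonomy:

```python
def classify_crawl_error(status: int) -> str:
    """Map an HTTP status code to an illustrative crawl-error category.

    This is a simplified sketch: real crawl errors also cover DNS
    failures, robots.txt blocks, and timeouts, which have no status code.
    """
    if status == 404:
        return "not found"
    if status in (401, 403):
        return "access denied"
    if 500 <= status < 600:
        return "server error"
    if 400 <= status < 500:
        return "other client error"
    return "no error"

# A 404 response and a 503 response land in different categories:
print(classify_crawl_error(404))  # not found
print(classify_crawl_error(503))  # server error
```

Knowing which category a URL falls into tells you where to look: a 404 usually means a broken link or a moved page, while 5xx errors point at your server rather than the URL itself.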
To see more information about a specific error, click the listed URL. Then:
- To see a (possibly incomplete) list of Sitemaps including this URL, click In these Sitemaps.
- To see a (possibly incomplete) list of source pages containing the URL, click Linked from these pages.
- To see the content of the page as Google sees it, click Fetch as Google. Fetch as Google is a useful tool for troubleshooting problems with your pages.
Optional: If you've addressed the issue causing an error for a specific URL, you can hide that URL from the list. Select the checkbox next to the URL, then click Mark as fixed. The URL will be removed from the list. (Note, however, that if the issue remains unresolved, the URL will reappear in the list the next time Googlebot crawls it.)
Note: We can't list information about URLs we haven't recently crawled.