Crawl errors

Access denied errors

What is an Access denied error?

In general, Google discovers content by following links from one page to another. To crawl a page, Googlebot must be able to access it. If you’re seeing unexpected Access denied errors, it may be for one of the following reasons:

  • Googlebot couldn’t access a URL on your site because your site requires users to log in to view all or some of your content.
  • Your robots.txt file is blocking Google from accessing your whole site or individual URLs or directories (see the sample robots.txt after this list).
  • Your server requires users to authenticate using a proxy, or your hosting provider may be blocking Google from accessing your site.
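
For example, a robots.txt file like the hypothetical one below would keep Googlebot out of a single directory, while a bare "Disallow: /" rule would keep it out of the whole site. The /private/ path is only a placeholder:

    # Blocks Googlebot from everything under /private/ (placeholder path)
    User-agent: Googlebot
    Disallow: /private/

    # To block the whole site instead, the rule would be:
    # Disallow: /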

How to deal with Access denied errors

  • Test that your robots.txt file is working as expected and does not block Google. The Test robots.txt tool lets you see exactly how Googlebot will interpret the contents of your robots.txt file. The Google user-agent is Googlebot. For a rough local check, see the sketch after this list.
  • Use Fetch as Google to understand exactly how your site appears to Googlebot. This can be very useful when troubleshooting problems with your site's content or discoverability in search results.
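
Neither step above requires a local script, but as a minimal, unofficial sketch you can approximate both checks with the Python standard library: urllib.robotparser reports whether a URL is allowed for the Googlebot user-agent, and a request sent with Googlebot's user-agent string surfaces login or proxy responses such as 401, 403, or 407. The site and page URLs below are placeholders, and the response served to this script can still differ from what Googlebot actually receives:

    # Rough local check only -- not a replacement for Google's own tools.
    from urllib.error import HTTPError
    from urllib.request import Request, urlopen
    from urllib.robotparser import RobotFileParser

    SITE = "https://www.example.com"       # placeholder domain
    PAGE = SITE + "/some/page.html"        # placeholder URL to test
    UA = "Googlebot"                       # Google's main crawler user-agent

    # 1. Does robots.txt allow Googlebot to fetch this page?
    robots = RobotFileParser()
    robots.set_url(SITE + "/robots.txt")
    robots.read()
    print("robots.txt allows Googlebot:", robots.can_fetch(UA, PAGE))

    # 2. Does the server answer a Googlebot-identified request without
    #    demanding a login or proxy authentication?
    try:
        response = urlopen(Request(PAGE, headers={"User-Agent": UA}))
        print("HTTP status for Googlebot UA:", response.getcode())
    except HTTPError as err:
        print("HTTP error for Googlebot UA:", err.code)  # e.g. 401, 403, 407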
