Access denied errors

In general, Google discovers content by following links from one page to another. To crawl a page, Googlebot must be able to access it. If you’re seeing unexpected Access Denied errors, it may be for one of the following reasons:

  • Googlebot couldn’t access a URL on your site because your site requires users to log in to view all or some of your content.
  • Your robots.txt file is blocking Google from accessing your whole site or from individual URLs or directories (see the robots.txt example after this list).
    • Test that your robots.txt file is working as expected. The Test robots.txt tool shows you exactly how Googlebot will interpret its contents; the Google user-agent to test against is Googlebot. You can also verify that a request claiming to come from Googlebot really is Googlebot, as shown in the verification sketch after this list.
    • The Fetch as Google tool helps you understand exactly how your site appears to Googlebot. This can be very useful when troubleshooting problems with your site's content or discoverability in search results.
  • Your server requires users to authenticate using a proxy, or your hosting provider may be blocking Google from accessing your site (a quick user-agent check is sketched after this list).
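
For example, the following robots.txt rules (a hypothetical sketch, with /private/ as a placeholder directory) block Googlebot from that one directory while leaving the rest of the site crawlable; changing the Googlebot group to "Disallow: /" would instead block Googlebot from your entire site:

    User-agent: Googlebot
    Disallow: /private/

    User-agent: *
    Disallow: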
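
To verify that a request claiming to be Googlebot really came from Google, run a reverse DNS lookup on the requesting IP address, check that the host name belongs to googlebot.com or google.com, and confirm that a forward lookup of that host returns the same IP. The sketch below assumes Python is available and uses a placeholder IP address; substitute one taken from your server's access logs:

    import socket

    def is_googlebot(ip_address):
        """Return True if ip_address reverse-resolves to a Google crawler host.

        Two-step check: reverse DNS must give a googlebot.com or google.com
        host, and a forward lookup of that host must return the original IP.
        """
        try:
            host, _, _ = socket.gethostbyaddr(ip_address)        # reverse DNS
            if not host.endswith(('.googlebot.com', '.google.com')):
                return False
            return socket.gethostbyname(host) == ip_address      # forward confirmation
        except (socket.herror, socket.gaierror):
            return False

    # Placeholder IP; replace it with an address from your access logs.
    print(is_googlebot('66.249.66.1'))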
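
If you suspect that your server or hosting provider is denying Googlebot specifically, one quick (though not conclusive) check is to request an affected page twice, once with a browser-like user-agent and once with Googlebot's user-agent string, and compare the response codes; a 401 or 403 returned only for the Googlebot-like request suggests user-agent based blocking. The sketch below uses Python's standard library and a placeholder URL, and because it runs from your own network rather than Google's IP addresses, Fetch as Google remains the authoritative test:

    from urllib.request import Request, urlopen
    from urllib.error import HTTPError

    URL = 'https://www.example.com/blocked-page'     # placeholder; use an affected URL
    GOOGLEBOT_UA = ('Mozilla/5.0 (compatible; Googlebot/2.1; '
                    '+http://www.google.com/bot.html)')

    def status_for(user_agent):
        """Fetch URL with the given user-agent and return the HTTP status code."""
        request = Request(URL, headers={'User-Agent': user_agent})
        try:
            with urlopen(request) as response:
                return response.getcode()
        except HTTPError as err:
            return err.code                          # e.g. 401 or 403 on access denied

    print('Browser-like request:', status_for('Mozilla/5.0'))
    print('Googlebot-like request:', status_for(GOOGLEBOT_UA))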