Access denied errors

What is an Access denied error?

In general, Google discovers content by following links from one page to another. To crawl a page, Googlebot must be able to access it. If you’re seeing unexpected Access denied errors, it may be for one of the following reasons:

  • Googlebot couldn’t access a URL on your site because your site requires users to log in to view all or some of your content.
  • Your robots.txt file is blocking Google from accessing your whole site or individual URLs or directories.
  • Your server requires users to authenticate using a proxy, or your hosting provider may be blocking Google from accessing your site.
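For the robots.txt case, the most common accidental block is a site-wide disallow rule. As a hypothetical illustration (not taken from any particular site), a robots.txt file like the following blocks all compliant crawlers, including Googlebot, from the entire site:

```
User-agent: *
Disallow: /
```

Changing `Disallow: /` to `Disallow:` (empty) or removing the rule allows crawling again; a more targeted rule such as `Disallow: /private/` blocks only that directory.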

How to deal with Access denied errors

  • Test that your robots.txt file is working as expected and does not block Google. The Test robots.txt tool lets you see exactly how Googlebot will interpret the contents of your robots.txt file. The Google user-agent is Googlebot.
  • Use Fetch as Google to understand exactly how your site appears to Googlebot. This can be very useful when troubleshooting problems with your site's content or discoverability in search results.
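As a quick sketch of how robots.txt rules are interpreted for the Googlebot user-agent, you can also parse a robots.txt file locally with Python's standard-library `urllib.robotparser`. The domain and paths below are hypothetical examples, and this parser approximates, but is not identical to, Google's own interpretation:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt contents: Googlebot is blocked from /private/
# but allowed everywhere else.
robots_txt = """\
User-agent: Googlebot
Disallow: /private/

User-agent: *
Allow: /
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Check whether Googlebot may fetch specific URLs under these rules.
print(rp.can_fetch("Googlebot", "https://example.com/private/page.html"))  # False
print(rp.can_fetch("Googlebot", "https://example.com/public/page.html"))   # True
```

This kind of local check can help narrow down whether an Access denied error is caused by robots.txt rules or by something else, such as server-side authentication.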
