What is an Access denied error?
In general, Google discovers content by following links from one page to another. To crawl a page, Googlebot must be able to access it. If you're seeing unexpected Access denied errors, it may be for one of the following reasons:
- Googlebot couldn’t access a URL on your site because your site requires users to log in to view all or some of your content.
- Your robots.txt file is blocking Google from accessing your whole site or individual URLs or directories.
- Your server requires users to authenticate using a proxy, or your hosting provider may be blocking Google from accessing your site.
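As a minimal illustration of the robots.txt cause above, rules like the following would block crawling. The `/members/` path is a placeholder; substitute the directories that matter on your site:

```
# Blocks ALL crawlers from the entire site
User-agent: *
Disallow: /

# Blocks only Googlebot from a single directory
User-agent: Googlebot
Disallow: /members/
```

Note that an empty `Disallow:` line allows everything, while `Disallow: /` blocks everything, so a single stray character can take your whole site out of Google's index.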
How to deal with Access denied errors
- Test that your robots.txt file is working as expected and does not block Google. The Test robots.txt tool lets you see exactly how Googlebot will interpret the contents of your robots.txt file. The Google user-agent is Googlebot.
- Use Fetch as Google to understand exactly how your site appears to Googlebot. This can be very useful when troubleshooting problems with your site's content or discoverability in search results.
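If you want a quick local check before reaching for the tools above, Python's standard-library robots.txt parser can tell you whether a given user-agent is allowed to fetch a URL. This is a sketch, not a substitute for the official tester: `example.com` and the `/private/` rule are placeholders, and Google's own parser may differ from Python's in edge cases.

```python
import urllib.robotparser

# Parse a robots.txt body directly. In practice you would point the
# parser at your live file with rp.set_url("https://example.com/robots.txt")
# followed by rp.read().
rp = urllib.robotparser.RobotFileParser()
rp.parse("""
User-agent: Googlebot
Disallow: /private/
""".splitlines())

# can_fetch(user_agent, url) returns False for blocked URLs.
print(rp.can_fetch("Googlebot", "https://example.com/private/page.html"))  # False
print(rp.can_fetch("Googlebot", "https://example.com/public/page.html"))   # True
```

Checking with the exact user-agent token ("Googlebot") matters, because robots.txt rules are applied per user-agent group, not globally.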