Unreachable: robots.txt

Before crawling the pages of your site, we tried to check your robots.txt file to ensure we didn't crawl any pages that you had disallowed. However, your server returned a 5xx (unreachable) error when we tried to retrieve your robots.txt file. To make sure we didn't crawl any pages disallowed by that file, we postponed our crawl.
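The decision above follows from how crawlers generally treat the HTTP status of a robots.txt fetch. This is a hypothetical sketch of that logic, not Google's actual implementation: a 2xx response is parsed normally, a 4xx response (file missing) is treated as "no restrictions", and a 5xx response is ambiguous, so the safe choice is to postpone the crawl.

```python
def robots_fetch_action(status_code: int) -> str:
    """Map the HTTP status of a robots.txt fetch to a crawl decision.

    Illustrative only; mirrors commonly documented crawler behavior.
    """
    if 200 <= status_code < 300:
        return "parse rules and crawl accordingly"
    if 400 <= status_code < 500:
        # A missing robots.txt (e.g. 404) means no crawl restrictions.
        return "crawl without restrictions"
    # 5xx or anything unexpected: the file may exist but be unreachable,
    # so postpone rather than risk fetching disallowed pages.
    return "postpone crawl"
```

This is why a temporary server outage on the robots.txt URL can halt crawling of your whole site until the file is reachable again.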

Your hosting provider may be blocking Googlebot, or there may be a problem with the configuration of its firewall.

Note: If the contents of the robots.txt file shown in your browser differ from what Google sees, work with your hosting company to remove any server rules that serve different robots.txt content to different user agents.
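One way to spot-check for this is to fetch robots.txt with two different User-Agent headers and compare the responses. The sketch below assumes a placeholder URL (`https://example.com/robots.txt`) and hypothetical user-agent strings; it is a quick diagnostic, not a complete cloaking test.

```python
import urllib.request

GOOGLEBOT_UA = "Googlebot/2.1 (+http://www.google.com/bot.html)"
BROWSER_UA = "Mozilla/5.0"


def fetch_robots(url: str, user_agent: str) -> str:
    """Fetch robots.txt sending a specific User-Agent header."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.read().decode("utf-8", errors="replace")


def serves_different_content(body_a: str, body_b: str) -> bool:
    """Return True if the two robots.txt bodies meaningfully differ.

    Line endings are normalized first so a CRLF/LF difference is not
    flagged as different content.
    """
    def norm(s: str) -> str:
        return s.replace("\r\n", "\n").strip()

    return norm(body_a) != norm(body_b)


if __name__ == "__main__":
    url = "https://example.com/robots.txt"  # replace with your site's URL
    as_bot = fetch_robots(url, GOOGLEBOT_UA)
    as_browser = fetch_robots(url, BROWSER_UA)
    print("differs:", serves_different_content(as_bot, as_browser))
```

Note that some servers vary responses by IP address rather than by User-Agent, so an identical result here does not fully rule out that Googlebot is being served different content.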
