Server errors

Googlebot couldn’t access your URL, the request timed out, or your site was busy. As a result, Googlebot was forced to abandon the request.

Excessive page load times, leading to timeouts, can be due to the following:

  • Dynamic pages taking too long to respond. If the server is busy, it may have returned an overloaded status (such as HTTP 503) to ask Googlebot to crawl the site more slowly. In general, we recommend keeping URL parameters short and using them sparingly. If you’re confident about how parameters work for your site, you can tell Google how to handle them.
  • Your site's hosting server is down, overloaded, or misconfigured. If the problem persists, check with your hosting provider, and consider increasing your site’s capacity to handle traffic.
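One quick way to see whether your pages respond within a reasonable window is to time a request yourself. The sketch below is a minimal probe using only the Python standard library; the URL and the 10-second timeout are illustrative assumptions, not values Googlebot actually uses.

```python
import time
import urllib.request

def fetch_with_timing(url, timeout=10.0):
    """Fetch url and return (HTTP status, elapsed seconds).

    Raises URLError/timeout if the server is slower than `timeout`,
    which is roughly the failure mode reported as a server error.
    """
    # Hypothetical User-Agent; substitute whatever identifies your probe.
    req = urllib.request.Request(url, headers={"User-Agent": "timing-probe"})
    start = time.monotonic()
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        resp.read()          # read the full body, as a crawler would
        status = resp.status
    return status, time.monotonic() - start
```

If `fetch_with_timing("https://example.com/some-dynamic-page")` regularly takes several seconds, or raises a timeout, that page is a candidate for the errors described above.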

Your site may also be deliberately or inadvertently blocking Google. In general, this can be the result of a DNS configuration issue or, in some cases, a misconfigured firewall, DoS protection system, or even the site's content management system. Protection systems are an important part of good hosting and are often configured to automatically block unusually high levels of server requests. However, because Googlebot often makes more requests than a human user, it can trigger these protection systems, causing them to block Googlebot and prevent it from crawling your website.
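When tuning a protection system, it helps to distinguish genuine Googlebot traffic from crawlers that merely spoof its user agent. Google's documented approach is a reverse DNS lookup on the requesting IP, checking that the hostname belongs to googlebot.com or google.com, followed by a forward lookup to confirm the name resolves back to the same IP. A minimal sketch of that check, using only the standard library:

```python
import socket

def is_real_googlebot(ip):
    """Reverse-then-forward DNS check for a claimed Googlebot IP."""
    try:
        host, _, _ = socket.gethostbyaddr(ip)      # reverse lookup
    except OSError:
        return False
    # Real Googlebot hostnames end in googlebot.com or google.com.
    if not host.endswith((".googlebot.com", ".google.com")):
        return False
    try:
        # Forward lookup must resolve back to the original IP.
        return ip in socket.gethostbyname_ex(host)[2]
    except OSError:
        return False
```

You could run this over the IPs your firewall has blocked; any that pass the check are requests you are blocking from the real Googlebot.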

To fix such issues, identify which part of your website’s infrastructure is blocking Googlebot and remove the block. The firewall may not be under your control, so you may need to discuss this with your hosting provider.
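One simple diagnostic, if you suspect user-agent-based blocking, is to compare the status code your server returns to a browser-like user agent against the one it returns to Googlebot's advertised user-agent string. This sketch only catches user-agent-based blocks; firewalls that block by IP range will not show up this way, and the URL is a placeholder for one of your own pages.

```python
import urllib.error
import urllib.request

GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

def status_for(url, user_agent):
    """Return the HTTP status the server gives this User-Agent."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code  # blocks often surface as 403 or 503

def blocked_for_googlebot(url):
    """True if the page serves fine to a browser UA but not to Googlebot's UA."""
    return status_for(url, "Mozilla/5.0") == 200 and status_for(url, GOOGLEBOT_UA) != 200
```

If `blocked_for_googlebot("https://example.com/")` returns True, the block is at a layer that inspects the user agent; if it returns False but Search Console still reports errors, look instead at IP-level rules with your hosting provider.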

Some webmasters intentionally prevent Googlebot from reaching their websites, perhaps using a firewall as described above. In such cases, the intent is usually not to block Googlebot entirely, but to control how the site is crawled and indexed. If this applies to you, check the following: