Change Googlebot crawl rate

The term crawl rate means how many requests per second Googlebot makes to your site while crawling it: for example, 5 requests per second.

You cannot change how often Google crawls your site, but if you want Google to crawl new or updated content on your site, you can request a recrawl.

Proper usage

Google has sophisticated algorithms to determine the optimal crawl speed for a site. Our goal is to crawl as many pages from your site as we can on each visit without overwhelming your server's bandwidth.

If Google is making too many requests per second to your site and slowing down your server, you can limit how fast Google crawls your site.

You can limit the crawl rate for root-level sites—for example, www.example.com and http://subdomain.example.com. The crawl rate that you set is the maximum rate at which Googlebot should crawl; it is not guaranteed that Googlebot will reach this maximum.

We recommend against limiting the crawl rate unless you are seeing server load problems that are definitely caused by Googlebot hitting your server too hard.

You cannot change the crawl rate for sites that are not at the root level—for example, www.example.com/folder.

Limit the crawl rate

  1. Open the Crawl Rate Settings page for your property.
    • If your crawl rate is described as "calculated as optimal," the only way to reduce the crawl rate is by filing a special request. You cannot increase the crawl rate.
    • Otherwise, select the option you want and then limit the crawl rate as desired. The new crawl rate will be valid for 90 days.

Emergency crawl limitation

If your site is being crawled so heavily that it is having availability issues, here is how to protect it:

  1. Determine which Google crawler is overcrawling your site. Look at your website logs or use the Crawl Stats report (a log-analysis sketch follows this list).
  2. Immediate relief:
    • If you want a simple solution, use robots.txt to block crawling for the overloading agent (googlebot, adsbot, and so on); a sample robots.txt follows this list. Note that this can take up to a day to take effect, and don't leave the block in place for too long, as prolonged blocking can have long-term effects on Google's crawling of your site.
    • If you're able to detect and respond to increased load dynamically, return HTTP 503 or 429 when you're nearing your serving limit (see the sketch after this list). Be sure not to return 503 or 429 for more than two or three days, though, or Google may crawl your site less frequently in the long term.
  3. Change your crawl rate using the Crawl Rate Settings page, if the option is available.
  4. Two or three days later, when Google's crawl rate has adapted, you can remove your robots.txt blocks or stop returning 503 or 429 error codes.
  5. If you are being overwhelmed by AdsBot crawls, the problem is likely that you have created too many targets for Dynamic Search Ads on your site using URL_Equals or page feeds. If you don't have the server capacity to handle these crawls, either limit your ad targets, add URLs in smaller batches, or increase your serving capacity. Note that AdsBot will recrawl your pages every 2 weeks, so the overload will recur unless you fix the underlying issue.
  6. Note that if you've limited the crawl rate using the crawl settings page, the crawl rate will return to automatic adjustment after 90 days.
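
For step 1, the following is a rough, hypothetical sketch: it counts requests per Google crawler user agent in an Apache/NGINX combined-format access log. The log path and the agent strings are assumptions, so adjust them for your server, and verify that suspicious traffic really comes from Google (for example, via reverse DNS lookup) rather than a spoofed user agent.

    # Minimal sketch: tally requests by Google crawler user agent in an
    # access log. LOG_PATH and CRAWLER_HINTS are assumptions for illustration.
    from collections import Counter

    LOG_PATH = "/var/log/nginx/access.log"  # adjust to your server's log location
    CRAWLER_HINTS = ("Googlebot", "AdsBot-Google", "Googlebot-Image", "APIs-Google")

    counts = Counter()
    with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
        for line in log:
            for hint in CRAWLER_HINTS:
                if hint in line:
                    counts[hint] += 1
                    break

    for agent, total in counts.most_common():
        print(f"{agent}: {total} requests")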
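
For the first "Immediate relief" bullet, a minimal robots.txt might look like the example below. Blocking AdsBot-Google is only an illustration: name whichever crawler your logs show is causing the load, and remove the block once the load subsides (changes can take up to a day to take effect).

    # Example robots.txt: temporarily block only the overloading crawler
    # (AdsBot-Google here, purely as an illustration).
    User-agent: AdsBot-Google
    Disallow: /

    # All other crawlers keep normal access.
    User-agent: *
    Disallow: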
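
For the second "Immediate relief" bullet, here is a minimal sketch of serving 503 under load, using only Python's standard library. The is_overloaded() check is a placeholder (os.getloadavg is Unix-only); in practice you would wire this logic into your real server or load balancer and use an actual load metric.

    # Minimal sketch: answer with 503 and a Retry-After header when the
    # server is near its limit, otherwise serve normally.
    from http.server import BaseHTTPRequestHandler, HTTPServer
    import os

    def is_overloaded() -> bool:
        # Placeholder heuristic: 1-minute load average vs. CPU count.
        load1, _, _ = os.getloadavg()
        return load1 > (os.cpu_count() or 1)

    class Handler(BaseHTTPRequestHandler):
        def do_GET(self):
            if is_overloaded():
                self.send_response(503)                 # 429 also works
                self.send_header("Retry-After", "120")  # hint crawlers to back off
                self.end_headers()
                return
            self.send_response(200)
            self.send_header("Content-Type", "text/plain; charset=utf-8")
            self.end_headers()
            self.wfile.write(b"OK")

    if __name__ == "__main__":
        HTTPServer(("", 8080), Handler).serve_forever()

As noted in the steps above, don't keep returning 503 or 429 to Googlebot for more than two or three days, or Google may reduce how often it crawls your site.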
