Dec 5, 2021

How to stop Googlebot attacks on websites?

Hello, please help.

This IP, 66.249.75.54, has been hitting my website continuously for the last 3 days, until my hosting server went down.
After checking, I found the IP belongs to Googlebot.
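(A check along these lines confirms it: Google documents that genuine Googlebot IPs reverse-resolve to a googlebot.com or google.com hostname that resolves back to the same IP. This is just an illustrative Python sketch of that kind of check, not exactly what I ran; the IP is the one from this thread.)

    import socket

    def is_googlebot(ip: str) -> bool:
        """Reverse DNS, then forward-confirm, per Google's crawler verification docs."""
        try:
            host, _, _ = socket.gethostbyaddr(ip)  # reverse lookup
            if not host.endswith((".googlebot.com", ".google.com")):
                return False
            # forward lookup must map the hostname back to the same IP
            return ip in socket.gethostbyname_ex(host)[2]
        except OSError:
            return False

    print(is_googlebot("66.249.75.54"))  # expected True for a real Googlebot IP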

- I've blocked that IP.
- I've enabled the DDoS protection feature in Cloudflare.
- I've set a Googlebot disallow in robots.txt.
But that IP is still crawling my website.

How do I stop this attack?
Recommended Answer
Dec 6, 2021
Hi, 

What do you mean by attack? That IP is a legitimate IP of Googlebot.

If the bot is crawling your site too fast and causing server load issues, you can request a crawl speed change at Change Googlebot crawl rate - Search Console Help.

If you see the bot is crawling the same URLs over and over, you can file a report at Search Console (google.com).
Diamond Product Expert barryhunter recommended this
All Replies (6)
Dec 7, 2021
Hi, thanks in advance for the answer.

What I mean by attacking is that the IP crawls my site too fast, roughly one request every second, which makes my hosting exceed its resource limit.

I didn't register my website in Google Search Console, so how do I change the crawl speed?
Dec 7, 2021
If you're not willing to do it via Search Console, then probably the main way is by responding with 503s.
 
That is a pretty clear signal to Google that the server is struggling and it should slow down crawling.
Dec 7, 2021
What is responding with 503s? How do I apply it?
Dec 7, 2021
Fix the server so that it responds with a 503 status when it's overloaded.
 
Or maybe do it when it's getting 'close' to being overloaded, i.e. get Google to back down BEFORE it actually overwhelms the server.
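
Roughly, that means having your web app (or server config) watch its own load and answer with a 503 plus a Retry-After header while it is struggling, instead of serving the page. A minimal sketch, assuming a Python/Flask front end on a Unix host (the load threshold and Retry-After value are made-up illustrative numbers, not recommendations):

    import os
    from flask import Flask, Response

    app = Flask(__name__)
    LOAD_THRESHOLD = 4.0  # hypothetical cut-off for "close to overloaded" - tune for your server

    @app.before_request
    def shed_load_when_busy():
        # 1-minute load average; while the box is (nearly) overloaded,
        # short-circuit every request with a 503 so Googlebot backs off.
        if os.getloadavg()[0] > LOAD_THRESHOLD:
            return Response(
                "Service temporarily unavailable",
                status=503,
                headers={"Retry-After": "120"},  # hint to crawlers to retry later
            )

    @app.route("/")
    def index():
        return "Normal page, served while the server is healthy"

Note that serving 503s for a prolonged period can eventually cause Google to drop those URLs, so treat this as a stop-gap while you sort out the server load.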
 
Dec 8, 2021
@Nikki, 

Speak to your ISP. 1 crawl per second should not break the server.

If you do not want the website to be crawled at all by Google, you could also block crawling completely using a robots.txt file. See Create and submit a robots.txt file | Google Search Central.
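
For completeness, a robots.txt that blocks Googlebot entirely can be as small as this (served at the root of your domain, e.g. https://example.com/robots.txt, where example.com stands in for your own site). It only stops well-behaved crawlers, and only after Googlebot next re-fetches the file:

    # Block Google's crawler from the whole site (other bots unaffected)
    User-agent: Googlebot
    Disallow: /

Keep in mind this stops crawling, not necessarily indexing of already-known URLs, and it will hurt your visibility in Google Search, so only do it if you really do not want Google traffic.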