Google user
Original Poster
Dec 1, 2020
Robots.txt Directives
We want Googlebot to stop crawling URLs with UTM parameters. However, when we blocked the UTM parameters with a 'Disallow' directive in our robots.txt file, AdsBot could no longer crawl our UTM URLs either. AdsBot was then unable to reach the destination URLs, so our ads got disapproved.
We need robots.txt to allow AdsBot to crawl the UTM URLs while still disallowing Googlebot.
This is not as simple as it sounds, because when we add a separate group calling out AdsBot specifically, e.g.
User-agent: AdsBot-Google
AdsBot then ignores all of the directives in the wildcard group, e.g.
User-agent: *
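For reference, the structure we tried looks roughly like this (the UTM patterns are illustrative). As I understand Google's documentation, a crawler obeys only the most specific group matching its user agent, so AdsBot skips the `*` group entirely once it has its own group:

```
# Generic crawlers (including Googlebot): block UTM-tagged URLs
User-agent: *
Disallow: /*?utm_
Disallow: /*&utm_

# AdsBot matches this group instead of the * group above,
# so it ignores those Disallow rules and can crawl the destination URLs
User-agent: AdsBot-Google
Allow: /
```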
Is there a way to allow AdsBot to crawl these URLs while still blocking Googlebot?
Thanks