
Block URLs with robots.txt

Test your robots.txt with the robots.txt Tester

The robots.txt Tester tool shows you whether your robots.txt file blocks Google web crawlers from specific URLs on your site. For example, you can use this tool to test whether the Googlebot-Image crawler can crawl the URL of an image you wish to block from Google Image Search.
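For instance, a robots.txt file that blocks only the Googlebot-Image crawler from an image directory might look like the following (the domain, directory, and paths here are hypothetical, for illustration only):

```
User-agent: Googlebot-Image
Disallow: /images/private/

User-agent: *
Allow: /
```

With these rules, Googlebot-Image would be blocked from URLs under /images/private/, while other crawlers, including the regular Googlebot, could still fetch them.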


Open robots.txt Tester


You can submit a URL to the robots.txt Tester tool. The tool operates as Googlebot would to check your robots.txt file and verifies that your URL has been blocked properly.

Test your robots.txt file

  1. Open the tester tool for your site, and scroll through the robots.txt code to locate the highlighted syntax warnings and logic errors. The number of syntax warnings and logic errors is shown immediately below the editor. 
  2. Type the URL of a page on your site into the text box at the bottom of the page.
  3. Select the user-agent you want to simulate in the dropdown list to the right of the text box.
  4. Click the TEST button to test access.
  5. Check whether the TEST button now reads ACCEPTED or BLOCKED to find out if the URL you entered is blocked from Google web crawlers.
  6. Edit the file on the page and retest as necessary. Note that changes made in the page are not saved to your site! See the next step.
  7. Copy your changes to your robots.txt file on your site. This tool does not make changes to the actual file on your site; it only tests against the copy hosted in the tool.
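The test-and-retest loop above can also be sketched locally with Python's standard `urllib.robotparser` module. This is a rough approximation, not the tool itself: the standard-library parser follows the original robots.txt conventions and may interpret edge cases differently from Googlebot. The robots.txt content, domain, and paths below are hypothetical examples.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content, for illustration only.
robots_txt = """\
User-agent: Googlebot-Image
Disallow: /images/private/

User-agent: *
Allow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Simulate an ACCEPTED / BLOCKED verdict for each user-agent and URL.
checks = [
    ("Googlebot-Image", "https://example.com/images/private/photo.jpg"),
    ("Googlebot", "https://example.com/images/private/photo.jpg"),
]
for agent, url in checks:
    verdict = "ACCEPTED" if parser.can_fetch(agent, url) else "BLOCKED"
    print(f"{agent}: {verdict}")
```

Here the Googlebot-Image rule group blocks the image URL, while the wildcard group leaves it open to other crawlers; editing the `robots_txt` string and re-running mimics the edit-and-retest cycle in the tool.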

Limitations of the robots.txt Tester tool:

  • Changes you make in the tool editor are not automatically saved to your web server. You need to copy and paste the content from the editor into the robots.txt file stored on your server.
  • The robots.txt Tester tool only tests your robots.txt with Google user-agents or web crawlers, like Googlebot. We cannot predict how other web crawlers interpret your robots.txt file.