If your page is blocked from Google by a robots.txt rule, it probably won't appear in Google Search results, and in the unlikely event that it does, the result won't include a description.
1. Confirm that a page is blocked by robots.txt
If you have verified your site ownership in Search Console:
- Open the URL Inspection tool.
- Inspect the URL shown for the page in the Google search result. Make sure that you've selected the Search Console property that contains this URL.
- In the inspection results, check the status of the Page indexing section. If it says Blocked by robots.txt, then you've confirmed the problem. Move to the next section to fix it.
If you have not verified your site in Search Console:
- Search for a robots.txt validator.
- In the validator, enter the URL of the page that is missing the description. Use the URL shown for the page in Google search results.
- If the validator says that the page is blocked from Google, you've confirmed the problem. Move to the next section to fix it.
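If you'd rather check from the command line than use a web validator, Python's standard library includes a robots.txt parser. This is a minimal sketch: the robots.txt content and the example.com URLs are placeholders, so substitute your site's actual robots.txt file and the URL that's missing its description.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content; in practice, fetch your site's
# file from https://your-site.example/robots.txt instead.
robots_txt = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# can_fetch() reports whether the named crawler may fetch the URL
# under the rules parsed above.
print(parser.can_fetch("Googlebot", "https://example.com/private/page.html"))
print(parser.can_fetch("Googlebot", "https://example.com/public/page.html"))
```

A `False` result for your page's URL confirms that a robots.txt rule is blocking it.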
2. Fix the rule
- Use a robots.txt validator to find out which rule is blocking your page, and where your robots.txt file is.
- Fix or remove the rule:
- If you are using a website hosting service—for example, if your site is on Wix, Joomla, or Drupal—we can't provide exact guidance on how to update your robots.txt file, because every hosting service has its own way of doing this. Search your hosting provider's documentation to learn how to unblock your page or site to Google. Suggested terms to search for: "robots.txt provider_name" or "unblock page to Google provider_name". Example search: robots.txt Wix
- If you're able to modify your robots.txt file directly, remove the rule, or else update it according to robots.txt syntax.
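As an illustration of what such a fix can look like, here is a hypothetical robots.txt rule that blocks a page, followed by one way to update it. The `/private/` path is an example only; your blocking rule may name a different path.

```
# Before: this rule blocks all crawlers, including Googlebot,
# from every URL under /private/
User-agent: *
Disallow: /private/

# After: the rule still blocks /private/ generally, but an Allow
# line re-opens the one page that should appear in Search
User-agent: *
Disallow: /private/
Allow: /private/page.html
```

Deleting the `Disallow` line entirely also works if nothing under that path needs to stay blocked.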