The importance of crawling & rendering

It’s important to make sure that your site can be crawled and rendered properly so you can get the most out of Google Search. The rest of this document describes why you should avoid blocking content on your web pages, so that Google can crawl and render your site fully.

Make sure your site can be crawled by Google

Verify that Googlebot and other Google web crawlers can reach your site at the network level. Before Google can display a URL in Search results, it must be able to find that URL. Site owners sometimes block URLs from Google on purpose; before you block any URLs, check that doing so doesn’t hide content you want to appear in Google Search results.
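One way to check whether a robots.txt rule would hide a URL from Googlebot is to test it with Python’s standard-library robots.txt parser. This is a minimal sketch; the robots.txt rules and URLs below are hypothetical examples, not taken from any real site.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules for a site; the paths are illustrative only.
ROBOTS_TXT = """\
User-agent: Googlebot
Disallow: /private/

User-agent: *
Disallow: /tmp/
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Pages you want in Search results should be fetchable by Googlebot.
print(parser.can_fetch("Googlebot", "https://example.com/products/page.html"))  # True
print(parser.can_fetch("Googlebot", "https://example.com/private/page.html"))   # False
```

Running a check like this against your own robots.txt rules, before you deploy them, helps confirm that you’re only blocking the URLs you actually intend to hide.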

To learn more about the reasons Google might not crawl your site, see the Make sure Googlebot is not blocked article.

Help Google get a better picture of your site

Help Googlebot fetch all the resources referenced on your web pages. Google takes your non-textual content and overall visual layout into account when deciding where your pages appear in Search results. The visual aspects of your site help us fully understand your web pages, and the better Google understands your site, the better we can match it to the people who are looking for your content.

When Googlebot retrieves your pages, it runs your code and assesses your content to understand the layout and structure of your site. Google then uses all the information it collects during rendering to rank the quality and value of your content against other sites and against what people are searching for on Google Search.

Dynamic content

If you have web pages that use code to arrange or display your content, Google must be able to properly render that content to get it into Google Search. Often, the core textual content of a dynamic website can be retrieved only by rendering the pages, so that Google sees your site just as anyone using a web browser would. If rendering fails on your site, Google might be unable to get any of your content.

For example, suppose a web page relies on the availability of my_script.js, which web browsers run to produce the core textual content of the page. If my_script.js is blocked from Google, we won’t be able to get the text content when Googlebot renders the web page.
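A robots.txt rule like the following would cause exactly this problem (the file path is hypothetical, matching the my_script.js example above):

```
User-agent: Googlebot
Disallow: /my_script.js
```

Removing the Disallow line, or adding an explicit Allow rule for the script, lets Googlebot fetch my_script.js and render the page’s text content.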
