The topics in this section describe how you can control Google's ability to find and parse
your content in order to show it in Search and other Google properties, as well as how to
prevent Google from crawling specific content on your site.
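For example, one common way to prevent crawling of specific content is a robots.txt file at the root of the site; a minimal sketch (the path shown is illustrative):

```
User-agent: Googlebot
Disallow: /private/
```

This tells Googlebot not to crawl URLs under the /private/ directory, while leaving the rest of the site crawlable.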
Here's a brief description of each page. To get an overview of crawling and indexing, read
our How Search works guide.
Learn what URL canonicalization is and how to tell Google about duplicate pages on
your site to avoid excessive crawling. Learn how Google auto-detects duplicate
content, how it treats that content, and how it assigns a canonical URL to
each group of duplicate pages it finds.
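One way to indicate your preferred canonical to Google is a link element in the page's head; a minimal sketch (the URL is illustrative):

```html
<link rel="canonical" href="https://example.com/dresses/green-dress">
```

Adding this element to every duplicate variant of the page signals which URL you would prefer to appear in search results, though Google may still choose a different canonical based on other signals.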
There are some differences and limitations that you need to account for when designing your
pages and applications to accommodate how crawlers access and render your content.
Last updated 2025-02-04 UTC.