Check your site's search performance
Google's goal is to return highly relevant results for every query. Search results are returned from our search index, which is constantly evolving as content is added and modified on the web. This changing content, along with updates to our ranking algorithms, can cause URLs to change position in search results and, less commonly, to be removed from them.
We understand that these changes can be confusing. If your site is well-linked from others on the web, it's likely that we'll add it again during our next crawl. While we can't guarantee that any page will consistently appear in our index or appear with a particular rank, our Webmaster Guidelines offer helpful tips for maintaining a crawler-friendly site. Following these recommendations should increase the likelihood that your site will show up consistently in our search results.
If your site isn't appearing in Google search results, or it's performing more poorly than it once did, check out these steps to identify and fix potential causes of the problem.
- Check that your site is in the Google index
- See if your site has been impacted by a manual spam action
- Make sure Google can find and crawl your site
- Make sure that Google can index your site
- Make sure your content is useful and relevant
Check that your site is in the Google index
- Do a site: search
When site owners tell us that their site has fallen out of our search results, we often find that it's still included. To quickly determine whether your site is still in our index, perform a Google site: search for its entire URL. For instance, a search for site:google.com returns the results at http://www.google.com/search?num=100&q=site:google.com. Note that you shouldn't include a space between the site: operator and your domain name.
If your site is displayed as a result when you perform a Google site search for your URL, then it's included in our index.
However, if your site used to be indexed and no longer is, it may have been removed for violations of our Webmaster Guidelines. Review the guidelines and then, once you've fixed any issues, submit a reconsideration request.
- Verify that your site ranks for your domain name
Do a Google search for www.[yourdomain].com. If your site doesn't appear in the results, or if it ranks poorly in the results, this is a sign that your site may have a manual spam action for violations of the Webmaster Guidelines. If we find certain problems with your site—for example, malware—we'll let you know via the Message Center. You should also review your site against the Webmaster Guidelines. Once you're sure that any problems have been addressed, submit a reconsideration request.
- Alert us to your new content
If your site is very new, we may not know about it yet. Tell Google about your site. One way to expedite Google's discovery of new pages is to submit a Sitemap. Even if your site is already in the index, Sitemaps are a great way to tell Google about the pages you consider most important.
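The Sitemaps format itself is simple: an XML file that lists your URLs, optionally with metadata about each one. A minimal example (the URL and date here are placeholders for your own):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2015-06-01</lastmod>
    <changefreq>weekly</changefreq>
  </url>
</urlset>
```

Add one url entry per page you want Google to know about, then submit the file in Search Console.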
See if your site has been impacted by a manual spam action
- Check the Manual Actions page
While Google relies on automated systems to crawl, index, and serve web pages, we are also willing to take manual action to protect the quality of our search results. If your site contains spam or otherwise violates our Webmaster Guidelines, we may take manual action on it, including demoting it or even removing it from our search results altogether. If your site's ranking is impacted by a manual spam action, we'll tell you about it on the Manual Actions page of Search Console. (To see this data, you must have added and verified your site.)
Make sure Google can find and crawl your site
Crawling is how Googlebot discovers new and updated pages to be added to the Google index. Our crawl process begins with a list of web page URLs, generated from previous crawl processes, and augmented with Sitemap data provided by webmasters. As Googlebot visits each of these websites, it detects links on each page and adds them to its list of pages to crawl. New sites, changes to existing sites, and dead links are noted and used to update the Google index.
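The crawl loop described above can be sketched in a few lines of Python. This is an illustrative breadth-first link-discovery loop over a made-up set of pages, not Googlebot's actual implementation:

```python
from collections import deque
from html.parser import HTMLParser

# A toy "web": page URL -> HTML content (a stand-in for real HTTP fetches).
PAGES = {
    "http://example.com/": '<a href="http://example.com/a">A</a>',
    "http://example.com/a": '<a href="http://example.com/">home</a>'
                            '<a href="http://example.com/b">B</a>',
    "http://example.com/b": "<p>No links here.</p>",
}

class LinkExtractor(HTMLParser):
    """Collects href attributes from <a> tags, as a crawler would."""
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links += [value for name, value in attrs if name == "href"]

def crawl(seed_urls):
    """Breadth-first discovery: start from known URLs, follow links on each page."""
    seen, queue = set(seed_urls), deque(seed_urls)
    while queue:
        url = queue.popleft()
        parser = LinkExtractor()
        parser.feed(PAGES.get(url, ""))
        for link in parser.links:
            if link not in seen:        # a newly discovered page
                seen.add(link)
                queue.append(link)
    return seen

print(sorted(crawl(["http://example.com/"])))
```

Starting from one seed URL, the loop discovers all three pages: this is why well-linked sites tend to be found and re-crawled, while pages with no inbound links may never be discovered at all.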
- Check for crawl errors. The Crawl errors page in Search Console provides details about the URLs in your site that we tried to crawl and couldn't access. Review these errors, and fix any you can. The next time Googlebot crawls your site, it will note the changes and use them to update the Google index.
- Review your robots.txt file. The Test robots.txt tool lets you analyze your robots.txt file to see if you're blocking Googlebot from any URLs or directories on your site.
- Make sure that the URLs haven't been blocked with meta tags (for example, a <meta name="robots" content="noindex"> tag in a page's head section).
- If you have recently restructured your site or moved to a new domain, pages that previously performed well may now rank poorly. To avoid this, use 301 redirects ("RedirectPermanent") to smartly redirect users, Googlebot, and other spiders. (In Apache, you can do this with an .htaccess file; in IIS, through the administrative console.) For more information about 301 HTTP redirects, see http://www.ietf.org/rfc/rfc2616.txt.
- Consider creating and submitting a Sitemap. Even if your site is already indexed, Sitemaps are a way to give Google information about your site and the URLs you consider most important. Sitemaps are particularly helpful if your site has dynamic content or other content not easily discoverable by Googlebot, or if your site is new or does not have many links to it.
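For the Apache case mentioned above, a 301 redirect in an .htaccess file can look like this (the paths and domain names are placeholders; the whole-domain rule assumes mod_rewrite is enabled):

```apache
# Permanently redirect a single moved page.
RedirectPermanent /old-page.html http://www.example.com/new-page.html

# Permanently redirect every request on an old domain to the new one.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.old-example\.com$ [NC]
RewriteRule ^(.*)$ http://www.new-example.com/$1 [R=301,L]
```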
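You can run the same robots.txt check locally with Python's standard urllib.robotparser module, which applies robots.txt rules to a given user agent and URL. The file content below is a made-up example; substitute your own site's rules:

```python
from urllib.robotparser import RobotFileParser

# Example robots.txt content: block Googlebot from /private/, allow everything else.
ROBOTS_TXT = """\
User-agent: Googlebot
Disallow: /private/

User-agent: *
Disallow:
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Googlebot may fetch the homepage...
print(parser.can_fetch("Googlebot", "http://example.com/"))
# ...but not anything under /private/.
print(parser.can_fetch("Googlebot", "http://example.com/private/page.html"))
```

If can_fetch returns False for a URL you expect to appear in search results, that rule is a likely cause.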
Googlebot processes each of the pages it crawls in order to compile a massive index of all the words it sees and their locations on each page. In addition, we process information included in key content tags and attributes, such as title tags and alt attributes. Google can process many types of content. However, while we can process HTML, PDF, and Flash files, we have a more difficult time understanding (that is, crawling and indexing) other rich media formats, such as Silverlight.
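The word-and-location index described above is commonly called an inverted index. A toy illustration in Python (the documents are invented):

```python
from collections import defaultdict

# Toy documents standing in for crawled pages.
DOCS = {
    "page1": "coffee widgets and espresso gadgets",
    "page2": "espresso machines brew coffee",
}

def build_index(docs):
    """Map each word to the (page, word position) pairs where it occurs."""
    index = defaultdict(list)
    for page, text in docs.items():
        for position, word in enumerate(text.split()):
            index[word].append((page, position))
    return index

index = build_index(DOCS)
print(index["coffee"])    # every occurrence of "coffee", with its position
print(index["espresso"])
```

With such an index, answering a query means looking up the query's words rather than scanning every page, which is why indexing matters: a page Google can't process contributes nothing to these lookups.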
- Check your site's index stats. These stats show how your site is represented in the Google index.
Make sure your content is useful and relevant
- Understand how users are reaching your site by reviewing the Search queries page. The first column shows the Google searches in which your site most often appears. The page also lists the number of impressions, the number of clicks, and the CTR (click-through rate) for each query. This information is particularly useful because it gives you insight into what users are searching for (the query) and the queries for which users often click on your site. For example, your site may often appear in Google searches for espresso gadgets and coffee widgets, but if it has a low CTR for those queries, it may not be clear to users that your site contains information about coffee widgets. In this case, consider revising your content to make it more compelling and relevant. Avoid keyword stuffing, though, because this can cause your site's ranking to suffer, as well as degrade the user experience for your readers.
- Understand how Google sees your site. The Content Keywords page shows the keywords and phrases other sites use when they link to yours. Understanding how other people see your site can help you figure out how best to target your audience.
- Check the HTML Improvements page in Search Console. Descriptive information in title tags and meta descriptions will give us good information about the content of your site. In addition, this text can appear in search results pages, and useful, descriptive text is more likely to be clicked on by users.
- Tell the world about your site. Incoming links to your site help Google determine your site's relevance to the user's query. Natural links to your site develop as part of the dynamic nature of the web when other sites find your content valuable and think it would be helpful for their visitors.
- Check to see if any of your content has been flagged as adult content by turning off SafeSearch. Google's SafeSearch filter eliminates sites that contain pornography and explicit sexual content from search results. While no filter is 100% accurate, SafeSearch uses advanced proprietary technology that checks keywords and phrases, URLs, and Open Directory categories.
- Great image content can be an excellent way to generate traffic. We recommend that when publishing images, you think carefully about creating the best user experience you can, and follow our image guidelines.
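The CTR figure mentioned above is simply clicks divided by impressions. A quick illustration with made-up numbers:

```python
def ctr(clicks, impressions):
    """Click-through rate as a percentage of impressions."""
    return 100.0 * clicks / impressions

# A page shown 2,000 times and clicked 30 times has a 1.5% CTR.
print(f"{ctr(30, 2000):.1f}%")
```

A high impression count with a low CTR is the pattern to look for: users see your page in results but aren't persuaded to click it.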
There's almost nothing a competitor can do to harm your ranking or have your site removed from our index. If you're concerned about another site linking to yours, we suggest contacting the webmaster of the site in question. Google aggregates and organizes information published on the web; we don't control the content of these pages.
Occasionally, fluctuation in search results is the result of differences in our data centers. When you perform a Google search, your query is sent to a Google data center in order to retrieve search results. There are numerous data centers, and many factors (such as geographic location and search traffic) determine where a query is sent. Because not all of our data centers are updated simultaneously, it's possible to see slightly different search results depending on which data center handles your query.