Video: Matt Cutts talks about manual action on webspam
While Google relies on algorithms to evaluate and constantly improve search quality, we're also willing to take manual action on sites that use spammy techniques, such as demoting them or even removing them from our search results altogether. The Manual Actions page lists any manual actions currently applied to your site and links to steps you can take to address the problem. (If your site's ranking is impacted by a manual spam action, we'll also notify you in the Message Center in Search Console.)
If your site isn't appearing in search results, or isn't performing as well as it once did, we recommend checking the Manual Actions report and taking steps to address the problem. Once you're satisfied that your site follows the Webmaster Guidelines, you can request a review of your site directly from the Manual Actions report. The web is an ever-changing ecosystem, and your site's performance in search almost certainly fluctuates over time. As a result, even if your reconsideration request is successful, your site may rank lower (or higher) than it used to.
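Manual actions are only surfaced through the Search Console UI and its notifications; there is no public API for them. As a rough proxy, you could monitor your own traffic data for the kind of sudden drop that often prompts a look at the Manual Actions report. Below is a minimal sketch of that idea; the helper name, window size, and threshold are all illustrative choices, not part of any Google tooling:

```python
def detect_performance_drop(daily_clicks, window=7, threshold=0.5):
    """Compare the most recent `window` days of search clicks against the
    previous `window` days; return True if recent traffic fell below
    `threshold` (a fraction) of the baseline.

    Hypothetical helper for illustration -- not a Google API.
    """
    if len(daily_clicks) < 2 * window:
        raise ValueError("need at least two full windows of data")
    recent = sum(daily_clicks[-window:])
    baseline = sum(daily_clicks[-2 * window:-window])
    if baseline == 0:
        return False  # no baseline traffic to compare against
    return recent < threshold * baseline

# Example: clicks fall sharply in the most recent week.
clicks = [120, 130, 125, 118, 122, 127, 131,   # baseline week
          40, 38, 45, 41, 39, 42, 44]          # recent week
print(detect_performance_drop(clicks))  # True: recent week is well under half the baseline
```

A flagged drop is only a signal to investigate; as the text notes, rankings fluctuate naturally, so a dip does not by itself mean a manual action was applied.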
Two types of actions are displayed on the Manual Actions page.
- The Site-wide matches section lists actions that impact an entire site.
- The Partial matches section lists actions that impact individual URLs or sections of a site. It's not uncommon for pages on a popular site to have manual actions, particularly if that site serves as a platform for other users or businesses to create and share content. If the issues appear to be isolated, only individual pages, sections, or incoming links will be impacted, not the entire site.
Each section includes the following information:
- Reason: The reason Google applied the manual action.
- Affects: The parts of the site affected by each manual action. If a manual action affects more than 1,000 URLs or sections, only the first 1,000 matches will be displayed.
Here are some common types of manual actions:
- Unnatural links to or from your site
- Hacked site
- Thin content with little or no added value
- Pure spam
- User-generated spam
- Cloaking and/or sneaky redirects
- Hidden text and/or keyword stuffing
About Google and spam
Google is constantly working to improve search. We take a data-driven approach and employ analysts, researchers, and statisticians to evaluate search quality on a full-time basis. Changes to our algorithms undergo extensive quality evaluation before being released. You can read more about how our algorithms work.
Ever since there have been search engines, there have been people dedicated to tricking their way to the top of the results page. This is bad for searchers because it means more relevant websites get buried under irrelevant results, and it’s bad for legitimate website owners because their sites become harder to find. For these reasons, we’ve been working since the earliest days of Google to fight spammers, helping people find the answers they’re looking for, and helping legitimate websites get traffic from search.
Our algorithms are extremely good at detecting spam, and in most cases we automatically discover it and remove it from our search results. However, to protect the quality of our index, we're also willing to take manual action to remove spam from our search results.