Dec 4, 2020
Does Google think we are Spam?
Context:
I work on Trustedreviews.com, and we have seen some very strange anomalies across our site. We are an independent business now, but we weren't two years ago, when we were first hit around the time of the Google E-A-T update.
Since then it has been one traffic loss after another with each Google update.
- It all started at the time of the E-A-T update. Since then we have made strides to improve the site to the best of our ability, which has required huge investment.
- Our current Trustpilot score, for example, is 4 out of 5, compared with the 1.5 and 2 stars of the sites Google is rewarding.
- At the end of November 2019 there was an unannounced update and we lost 35% of our traffic overnight.
- If I compare this year with the already lower period last year, we have lost a further 30% of traffic over the year, which feels more like a slow burn.
- Numerous negative SEO attacks.
- All in all, the site has lost a business-crippling 80% of its traffic in three years.
- The December 2020 update looks to be showing the same trend, which started three days before the update was actually announced.
Our editorial guidelines are very clear: unlike many of our competitors, we only publish reviews of products we have actually been able to use and test. As a brand that is over 15 years old, we have great access to tech brands and exclusives, which keeps us newsworthy and relevant to our audience.
The traffic we were previously getting has migrated to the benefit of one or two other sites in the US and UK. We sometimes even use the same freelancers as these brands, which is common in the tech niche, and staff have moved between the companies, so the content quality and the general training of the journalists is broadly the same.
What has been done:
Over the last year we have changed the way we work and also improved the fundamentals of the site.
Some of the largest names in SEO have audited the site and all of their suggestions have been implemented.
We have pruned a lot of the older and less relevant content and consolidated older content that sat across multiple pages. We have also streamlined our top navigation and simplified our pagination.
This has reduced the number of “low-quality” crawlable pages by around 60%, where low quality means pages that don't provide longer-term value to our users or any longer-term brand value.
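For illustration only (not from the post): consolidation like this is normally paired with permanent redirects from the retired URLs to the surviving page, so the pruned URLs drop out of the crawlable count. A minimal TypeScript/Express sketch with made-up paths; on a WordPress stack like ours the same thing would be done with server rules or a redirect plugin:

```ts
// Hypothetical redirect map from pruned/consolidated URLs to surviving pages.
import express from 'express';

const redirects: Record<string, string> = {
  '/reviews/example-phone-review-page-2': '/reviews/example-phone-review',
  '/news/old-example-roundup': '/news/example-roundup',
};

const app = express();

app.use((req, res, next) => {
  const target = redirects[req.path];
  if (target) {
    // 301 signals a permanent move, so search engines transfer the old
    // URL's signals to the target and stop crawling the retired page.
    res.redirect(301, target);
  } else {
    next();
  }
});

app.listen(3000);
```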
On top of this we have invested heavily in backend improvements to our CMS, including a lot of work on site performance. We took our mobile performance score, as measured by Lighthouse, from below 20 to over 90 in most cases. That work also improved CLS, LCP and FID, which we think are now best in class for our niche.
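For reference (my illustration, not code from the site), those field metrics can be collected with Google's web-vitals library. This TypeScript sketch uses the v1 getCLS/getFID/getLCP API that was current at the time; /analytics is a placeholder endpoint:

```ts
// Minimal field measurement of the Core Web Vitals named above,
// using the v1 API of the web-vitals library.
import { getCLS, getFID, getLCP, Metric } from 'web-vitals';

function sendToAnalytics(metric: Metric): void {
  // sendBeacon survives page unload, unlike a plain fetch.
  const body = JSON.stringify({
    name: metric.name, // 'CLS' | 'FID' | 'LCP'
    value: metric.value,
    id: metric.id,
  });
  navigator.sendBeacon('/analytics', body); // placeholder endpoint
}

getCLS(sendToAnalytics);
getFID(sendToAnalytics);
getLCP(sendToAnalytics);
```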
What we are currently seeing:
The site is built on WordPress and I am seeing some odd things reported in GSC. I don't believe they are helping our site, and they don't seem to be something we can control. One of them is Googlebot picking up core WordPress URLs that aren't used in our site structure -
Sample from November 15:
In itself, this isn't a problem. The redirects are handled properly, but I keep seeing these as the referring URLs in the URL Inspection tool. I have no idea where Google is finding these links; they aren't part of our site structure.
Another issue is that Google is picking up non-live subdomains as referring pages. If I fetch one of the ?p= URLs mentioned above, I am correctly redirected to https://www.trustedreviews.com/how-to/how-to-screen-record-on-iphone-3294586. But if I inspect that page in GSC, the referring page is given as https://live.trustedreviews.com/how-to/how-to-screen-record-on-iphone-3294586, which was last crawled on 18/11/2020.
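To illustrate the check I am describing (a hypothetical script, not GSC output): the sketch below inspects the first redirect hop for the ?p= URL and for the live. subdomain copy. It assumes the trailing number in the article slug is the WordPress post ID, and needs Node 18+ for built-in fetch:

```ts
// Inspect the first redirect hop for a URL without following it.
async function checkHop(url: string): Promise<void> {
  const res = await fetch(url, { redirect: 'manual' });
  console.log(url, '->', res.status, res.headers.get('location'));
}

// Core WordPress "?p=" URL; 3294586 is assumed to be the post ID since it
// matches the trailing number of the canonical article slug.
checkHop('https://www.trustedreviews.com/?p=3294586').catch(console.error);

// Non-live subdomain that GSC reports as a referring page; ideally this
// answers with a 301 to the www host (or a 404/410) rather than a 200.
checkHop(
  'https://live.trustedreviews.com/how-to/how-to-screen-record-on-iphone-3294586'
).catch(console.error);
```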
I am also seeing Google hang onto our AMP pages even though they have been removed and the AMP cache has been cleared.
We have a history of negative SEO attacks in which someone creates internal search URLs on our site and links to them. We have tried to disavow as many of these links as possible. The URLs often contain Japanese or Chinese characters and are being indexed by Google. Looking at this, it seems clear that Google is spending more time crawling junk URLs that other people are creating than the good content we create.
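One common mitigation for this kind of attack (a sketch of the general technique, not necessarily what we have deployed) is to serve internal search URLs with an X-Robots-Tag noindex header, shown here as illustrative TypeScript/Express middleware; on WordPress the equivalent would be a header rule at the server or plugin level. The /search path is an assumption about where such pages live:

```ts
import express from 'express';

const app = express();

// Keep internal search result pages out of the index while still letting
// them resolve. Blocking /search in robots.txt instead would stop Googlebot
// from ever seeing the noindex, so spammed URLs could stay indexed purely
// from the external links pointing at them.
app.use('/search', (req, res, next) => {
  res.setHeader('X-Robots-Tag', 'noindex, nofollow');
  next();
});

app.listen(3000);
```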
As a benchmark, I have been using Bing traffic to keep track of our search fundamentals. Bing traffic has remained stable, so to me this is clearly something that Google specifically dislikes about our site.
Given the traffic losses we have experienced over the past two years, I don't believe this is a situation where “there is nothing you can do”. We have done a lot of work on the site, at great cost, and have focused on improvements centred on the user. There is clearly something the Google algorithm doesn't like about our site specifically, and at this point I would see even a manual action as a positive outcome, since it would at least tell us what the problem is.