Does Google think we are Spam? 1 Recommended Answer 17 Replies 70 Upvotes
Context:

I work on Trustedreviews.com and we have seen some very strange anomalies across our site. We're an independent business now, but we weren't two years ago when we first got hit around the time of the Google E-A-T update.

Since then it has just been traffic loss after traffic loss with each Google update.

  • It all started at the time of the E-A-T update. Since then we have made strides to improve the site to the best of our ability, which required huge investments.
  • Our current Trustpilot score, for example, is 4 out of 5, compared to 1.5 and 2 stars for the sites Google is rewarding.
  • At the end of November 2019 there was an unannounced update and we lost 35% of our traffic overnight.
  • Comparing this year to the already lower period last year, we have lost a further 30% of traffic over the year, which feels more like a slow burn.
  • Numerous negative SEO attacks.
  • All in all, the site has lost a business-crippling 80% of its traffic in three years.
  • The December 2020 update looks like it is showing the same trend, which actually started three days before the update was announced.

Our editorial guidelines are very clear: we only publish reviews of products that we have actually been able to use and test, unlike a lot of our competitors. As a brand that is over 15 years old, we have great access to tech brands and exclusives, which makes us newsworthy and relevant to our audience.

All the traffic we were previously getting has moved to the benefit of one or two other sites in the US and UK. Sometimes we even use the same freelancers as these brands, which is common in the tech niche. Staff have also moved between the companies, so the content quality and the general training of the journalists are broadly the same.

What has been done: 

Over the last year, we have changed the way we work but also improved the fundamentals on the site. 

Some of the largest names in SEO have audited the site and all of their suggestions have been implemented. 

We have pruned a lot of the older and less relevant content, and consolidated a lot of older content that sat across multiple pages. We have also streamlined our top navigation and simplified our pagination.

This has reduced the number of "low-quality" crawlable pages by around 60%, where low quality means pages that provide no longer-term value to our users or to the brand.

On top of this we have invested heavily in backend improvements to our CMS, including a lot of work on site performance. We took our mobile performance score, as measured by Lighthouse, from below 20 to over 90 in most cases. This also improved CLS, LCP and FID, which we believe are now best in class for our niche.
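For reference, category scores in a Lighthouse JSON report are on a 0-1 scale, so "over 90" corresponds to a score above 0.9. A minimal sketch of pulling the headline numbers out of a report produced by `lighthouse --output=json` (field names follow the Lighthouse report schema; the sample data below is made up, not from the real site):

```python
import json

def summarise_lighthouse(report: dict) -> dict:
    """Extract the performance score and the lab metrics we track."""
    audits = report["audits"]
    return {
        # Category score is 0-1 in the report; scale it to the familiar 0-100.
        "performance": round(report["categories"]["performance"]["score"] * 100),
        "lcp_ms": audits["largest-contentful-paint"]["numericValue"],
        "cls": audits["cumulative-layout-shift"]["numericValue"],
    }

# Tiny inline sample standing in for a real report file.
sample = {
    "categories": {"performance": {"score": 0.92}},
    "audits": {
        "largest-contentful-paint": {"numericValue": 2100.0},
        "cumulative-layout-shift": {"numericValue": 0.05},
    },
}
print(summarise_lighthouse(sample))
```

In practice you would `json.load()` the report file instead of building the dict inline.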

What are we currently seeing: 

The site is built on WordPress and I am seeing some odd things reported in GSC. I don't believe these are helping our site, and they don't seem to be something we can control. One of them is Googlebot picking up core WordPress URLs that aren't used in our site structure:

Sample from November 15:

In itself, this isn't a problem. The redirects are handled properly, but I keep seeing these as the referring URLs in the URL Inspection tool. I have no clue where Google is finding these links; they aren't part of our site structure.
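To quantify how much crawl activity goes to these URLs, one option is to tally Googlebot requests for `?p=` paths in the server access logs. A hedged sketch assuming a combined-format log (the log lines and regex here are illustrative, not from the real site):

```python
import re

# Matches the request path and the user-agent field of a combined-format
# access log line. Illustrative only; adjust to your actual log format.
LINE = re.compile(r'"GET (?P<path>\S+) HTTP/[\d.]+" .* "(?P<agent>[^"]*)"$')

def googlebot_p_hits(lines):
    """Return the ?p= paths that Googlebot requested."""
    hits = []
    for line in lines:
        m = LINE.search(line)
        if m and "Googlebot" in m.group("agent") and "?p=" in m.group("path"):
            hits.append(m.group("path"))
    return hits

# Made-up sample lines: one Googlebot hit on a raw ?p= URL, one normal visit.
sample_log = [
    '66.249.66.1 - - [15/Nov/2020:10:00:00 +0000] "GET /?p=3294586 HTTP/1.1" 301 0 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '10.0.0.5 - - [15/Nov/2020:10:00:01 +0000] "GET /how-to/how-to-screen-record-on-iphone-3294586 HTTP/1.1" 200 5120 "-" "Mozilla/5.0"',
]
print(googlebot_p_hits(sample_log))  # → ['/?p=3294586']
```

Note that Googlebot user-agent strings can be spoofed; for a rigorous count you would also verify the requesting IPs via reverse DNS.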

Another issue is that Google is picking up non-live subdomains as referring pages. If I inspect one of the ?p= URLs mentioned above, I get correctly redirected to https://www.trustedreviews.com/how-to/how-to-screen-record-on-iphone-3294586. But if I inspect that page in GSC, the referring page is https://live.trustedreviews.com/how-to/how-to-screen-record-on-iphone-3294586, which was last crawled on 18/11/2020.
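If that live. subdomain still resolves, one option is a blanket 301 at the web server so every crawled URL on it points at the canonical host. A minimal nginx sketch, assuming live.trustedreviews.com points at infrastructure you control (certificate directives omitted for brevity):

```nginx
# Hypothetical sketch: permanently redirect anything requested on the
# retired staging subdomain to the canonical www host, preserving the path.
server {
    listen 443 ssl;
    server_name live.trustedreviews.com;
    # ssl_certificate / ssl_certificate_key lines go here
    return 301 https://www.trustedreviews.com$request_uri;
}
```

A site-wide 301 like this tends to consolidate signals onto the canonical URLs faster than leaving the stale host serving duplicate content.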

I am also seeing Google hanging onto our AMP pages even though they have been removed and the cache cleared.

We have a history of negative SEO attacks where someone creates internal search pages and links to them. We have tried to disavow as many of these links as possible. The URLs often contain Japanese or Chinese characters and are being indexed by Google. It seems clear that Google is spending more time crawling junk URLs that other people are creating than the good content we create.
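For context, the disavow file Google accepts is plain text: one URL or one `domain:` rule per line, with `#` comments ignored. A short illustrative example (all hosts here are made up):

```text
# Disavow file sketch: "domain:" drops every link from that host,
# a bare URL drops just that one page.
domain:spammy-search-scraper.example
domain:junk-links.example
https://another-bad-site.example/search?q=trustedreviews
```

`domain:` rules are usually the practical choice for attacks like this, since the spam pages churn faster than individual URLs can be collected.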

As a benchmark I have been using Bing traffic to keep track of search fundamentals. Bing traffic has remained stable, so for me this is clearly something that Google specifically dislikes about our site.

Given the traffic losses we have experienced over the past two years, I don't believe this is a situation where "there is nothing you can do". We have done a lot of work on the site, at great cost, and focused on improvements centred on the user. There is obviously something the Google algorithm doesn't like about our site specifically, and at this point I would see a manual action as a positive outcome.
All Replies (17)
Relevant Answer
Doesn’t sound right to me. One for Danny I think
Relevant Answer
The referring links sound like an issue you really want to look into more. Have you set up those subdomains as GSC properties to see what the stats tell you? Could you, or perhaps a CDN, be selectively serving something you don't want served?

Unrelated to referrers, do you publish any syndicated content? Or do your writers provide similar articles to multiple publishers?
Relevant Answer
Hi Richard,

Thanks for your reply. 

I think the links will be a focus for the next quarter. If that gets any traction, then I find the advice from John Mueller highly irresponsible; he has consistently tried to keep people away from using disavows, saying "we can ignore those spammy links". On that point, if links are doing this much damage (two years of traffic losses), there should be a manual action, which I would welcome! I have been at the company for a year and we haven't built any links, but I would like to know if something historical is holding us back.

I have thought about setting up GSC for those subdomains, and I may have to now. As Google takes more traffic away, we have less money to look into these specific issues. Over the last year I was trying to focus on improving the site for users, not on figuring out why Google is crawling something we had behind a password and then redirected. It's almost as if Googlebot has zero trust in our site and basically ignores everything we try.

I have taken a look at that link you shared and I can see our content on top, although you can see all the scraped content!

Thanks again,
Eamon
Relevant Answer
"edit: sincerely trying to be helpful in my criticism." - Please don't worry the critique is welcomed.

I think I have caught up, that's a great analysis. We are running on WordPress so naturally have a very flat structure. 

Our competitors tend to have the same type of pagination, and the majority even share the same CMS. My personal belief is that this wouldn't cause substantial losses, but it is certainly not optimal and is probably contributing to some losses, or rather holding us back. This is certainly something I am going to look at fixing by expanding the taxonomy in a slow and steady manner.

I do have a question if you don't mind. Essentially, we're not blessed to be running a multimillion-pound ecommerce CMS that naturally creates structure through the way it merchandises product catalogues. Within WordPress our default is flat, so, for example, if I were looking to better structure our best lists, I could have our best wireless earbuds piece in both the best section and maybe a headphones sub-category. In this example, the structure could look like:
  • /best
  • /best/headphones
  • /best/phones
The issue we run into is that WordPress naturally wants to create a flat structure, so the same example with no development work would be:
  • /best
  • /best-headphones
  • /best-etc
I know we could probably force it without development, but it would be a mess, and we're trying to keep things as clean as possible.

My preference would be the first option. What are your thoughts?
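If the nested structure wins, the existing flat slugs would need 301s to the new paths so none of the current equity is lost. A minimal sketch of that mapping in Python (the slug patterns are illustrative, not the real URL scheme):

```python
# Hypothetical sketch: derive the nested URL from a flat "best-<category>"
# slug, so old URLs can be 301-redirected to the new structure.
NESTED_PREFIX = "/best/"

def flat_to_nested(path: str) -> str:
    """Map e.g. '/best-headphones' to '/best/headphones'."""
    if path.startswith("/best-"):
        return NESTED_PREFIX + path[len("/best-"):]
    return path  # anything already canonical passes through unchanged

print(flat_to_nested("/best-headphones"))  # → /best/headphones
print(flat_to_nested("/best"))             # → /best
```

In WordPress terms the same mapping could live in a redirect plugin's rules or in the web server config; the key point is that every old flat URL gets exactly one 301 hop to its new home.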

Thanks for flagging the .co.uk and 404s which I'll also look into :)

Again thanks.
This question is locked and replying has been disabled.