6/23/14
Original Poster
PamS1234

Google Is Blocking Their JS in robots.txt, but Tells Us Not To

I've read the FAQs and searched the help center. Yes
My URL is: 


It seems we dropped a little further in rankings around the release of Panda 4.0, and the guidance at the time indicated that we needed to unblock our JS and CSS used for layout, etc.

In Webmaster Tools, the new Fetch as Google feature shows a "Partial" render status. I fixed all our errors, because our robots.txt was blocking the JS/CSS, but now when I check again, Google's AdWords remarketing JS is now causing the Partial status.

What concerns me is that I have read about people who have actually tested partial robots.txt blocking and found a definite connection between some ranking drops and Panda 4.0.

So I was just curious: why would Google put in place a tool that flat out says "do not block any CSS and JS," and yet Google is doing exactly that? And what's worse, it is via a paid Google service, which we spend a fair amount of money on, only to be knocked down in organic rankings under Google's own guidelines. If that were truly the case, then Google is saying, 'Spend money, but don't worry, we will keep your organic rankings down so you will have to keep spending more money to be seen.'

It's pretty sad that I think this way, but can someone enlighten me on this and whether it is hurting us? :)

Community content may not be verified or up-to-date. Learn more.
Recommended Answers (2)
All Replies (12)
KORPG Kevin
6/23/14
So I was just curious: why would Google put in place a tool that flat out says "do not block any CSS and JS," and yet Google is doing exactly that?

I'm sure from Google's viewpoint they'd prefer to crawl everything regardless of what and where it is.
But that doesn't mean you have to let the bots go anywhere.

In short, Google advises you not to block those files. You're welcome to do so regardless.
All the Partial response is telling you is that there is content on your pages Google can't crawl. That's all.

It's a warning or a notification for you to check deeper and make sure you really want to block the bots from crawling those files.
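A concrete sketch may help here. Many e-commerce platforms ship a default robots.txt that disallows whole asset directories; removing those Disallow lines, or adding explicit Allow lines, is one way to let Googlebot fetch the rendering assets. The paths below are hypothetical, not taken from the OP's site:

```text
# Before (hypothetical default shipped by some cart software):
#   User-agent: *
#   Disallow: /js/
#   Disallow: /css/

# After: let crawlers fetch the rendering assets
User-agent: *
Allow: /js/
Allow: /css/
```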

What concerns me is that I have read about people who have actually tested partial robots.txt blocking and found a definite connection between some ranking drops and Panda 4.0.

Citation please. 
ets
6/23/14
For decent help, you'll need to post your URL, please. You can disguise it with a URL shortener such as goo.gl or bit.ly if you don't want the URL (or your site name) to appear here in full and be visible through Google searches.
6/24/14
Original Poster
PamS1234
Hi Kevin,

This was just one of the articles I read on the topic. https://yoast.com/google-panda-robots-css-js/

ets, I wasn't asking for site help, just the question about robots.txt blocking JS and the new tool that is showing Partial. I unblocked everything on my end; just the Google AdWords remarketing script remains blocked by robots.txt.

Google did actually change their guidelines to add advice about not blocking JS/CSS, and also added the tool in Webmaster Tools. I would think they added this tool for a reason, most likely as a way for everyone to fix their issues.
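One point worth noting about the remaining "blocked" item: the AdWords remarketing tag is loaded from Google's own ad-serving domains, and whether a crawler may fetch a file is decided by the robots.txt on the host serving that file, not by yours. So a third-party script blocked on its own domain will keep showing as blocked no matter what you change locally. Python's standard library can illustrate how such rules are evaluated; the rules and URLs below are made up for illustration only:

```python
from urllib import robotparser

# Hypothetical robots.txt rules, like a third party might publish.
# Order matters for Python's parser: the first matching rule wins,
# so the specific Allow line is listed before the broad Disallow.
rules = """\
User-agent: *
Allow: /js/site.js
Disallow: /js/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())  # parse() takes an iterable of lines

# A script under the disallowed path is blocked for Googlebot...
print(rp.can_fetch("Googlebot", "https://example.com/js/tracker.js"))  # False
# ...while the explicitly allowed file is crawlable.
print(rp.can_fetch("Googlebot", "https://example.com/js/site.js"))    # True
```

The same check run against the real robots.txt of an ad-serving host would show the blocking is on their side, where site owners have no control.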
ets
6/24/14
Indeed, but your question seems to be phrased along the lines of "We've suffered from Panda 4.0 and I think these specific factors might be responsible". Whereas my instinct tells me there are probably far more significant things wrong with your site that are causing the algorithmic demotion - and if you share the URL, we can suggest what they are :)
6/24/14
Original Poster
PamS1234
ets, yes, I do feel we have suffered as well; however, I know our site is suffering from a lot more than just Panda 4.0. We hired an SEO company a while back, and they built a lot of spammy back links, and we ended up with a manual penalty. Needless to say, we are in the process of trying to clean that up, but are still working on it. So I know that the spammy links are a main issue for us.

You are more than welcome to take a look at the site http://goo.gl/wEovtD and I would love nothing more than for you to take a look. I always welcome any insight or information that may help us get this site back on track.

We had our JS and CSS blocked in our robots.txt like I mentioned, but unblocked them just the other day. The software cart we are using ships with a robots.txt that blocks them by default.

Thanks for taking the time to look. 
ets
6/24/14
So does what ets was saying in his evaluation hold true, or do you feel it is mostly the back links?

I think you can be 1,000,000% confident that if John tells you it's the links, it's the links. So that's definitely your priority. I didn't examine the backlink issue because you'd already flagged it as a known problem. John has given a fairly clear steer that Panda is not your problem right now - so you can take what I said with a large pinch of salt. Even so, cleaning and tightening up your content/indexing is never a bad idea... and I think addressing some of the issues I raised (like the blocked files in robots.txt, the PDFs, and the URL parameters) is a good idea in the long run. But don't let it distract you from link cleaning.

Where the site is hosted doesn't play a role beyond any performance impact. You refer there to possible crawl issues - and that's where looking at the indexing can help. Because you clearly have a lot of URLs indexed twice (some blocked with query strings in robots.txt), which means you probably have a lot of unnecessary crawling as well. If the URL parameters were specified correctly, your crawling would be far more efficient.
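On the duplicate-URL point: blocking parameterized URLs in robots.txt stops Google from crawling them, but not from indexing them if they are linked, and a blocked page can't pass any signals to the clean URL. A common alternative (a sketch with a hypothetical URL and parameter, not the OP's actual site structure) is to leave the variants crawlable and point them at the canonical version:

```text
<!-- Served on the parameterized variant, e.g.
     https://example.com/widgets?sort=price -->
<link rel="canonical" href="https://example.com/widgets">
```

For this to work the parameterized URLs must not be disallowed in robots.txt, since Googlebot has to fetch the page to see the canonical hint.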
 
This question is locked and replying has been disabled.
