I’m obsessed with checking my traffic stats, and three days ago I woke up to find that my traffic from Google had dropped drastically. I thought Google Panda had struck once again, and I had almost resigned myself to my fate until unease pushed me to do some research. I googled my most visited pages and discovered that the page descriptions were no longer there.
As shown in the screenshot below, instead of the page description Google was showing “A description for this result is not available because of this site’s robots.txt – learn more”.
I hadn’t made any changes to my robots.txt in months, and my page descriptions had been working just fine until now, so I started reading up on Google-compatible robots.txt rules. Luckily I stumbled on this piece: Better Robots.txt Rules for WordPress. After reading it, I discovered that I had blocked access to my WordPress installation’s wp-includes and wp-content folders, and this was against Google’s guidelines.
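For illustration, a robots.txt along these lines (the exact paths in my file may have differed) is enough to trigger the problem, because it stops Googlebot from fetching the theme’s CSS and JS:

```
# Problematic: blocks crawlers from WordPress asset folders
User-agent: *
Disallow: /wp-includes/
Disallow: /wp-content/

# Safer alternative: only block the admin area,
# leaving CSS and JS crawlable
# User-agent: *
# Disallow: /wp-admin/
```

With the first two Disallow rules removed (or replaced by the wp-admin rule), Google can render the pages properly and the descriptions reappear.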
I quickly edited my robots.txt as advised by the author of that article and then submitted my sitemap to Google via the Fetch as Google tool in Google Search Console.
Three hours later, most of the pages that had lost traffic due to this issue had their search engine rankings restored, and everything was back to normal.
So if you’re having this issue and you’re using self-hosted WordPress, check your robots.txt and make sure Google has access to the wp-includes and wp-content folders. If you’re not using WordPress, just make sure that your CSS and JS files are easily accessible to Google’s bots.
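If you want to verify this yourself, Python’s standard-library robots.txt parser can tell you whether a given crawler may fetch a URL. This is a minimal sketch; the domain, paths, and robots.txt content are hypothetical stand-ins for your own site:

```python
import urllib.robotparser

# Hypothetical robots.txt content for illustration. Against a live site,
# you would instead call parser.set_url("https://example.com/robots.txt")
# followed by parser.read().
ROBOTS_TXT = """\
User-agent: *
Disallow: /wp-admin/
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Check whether Googlebot may fetch a stylesheet and a script
# (example theme/plugin paths, not real files).
for url in (
    "https://example.com/wp-content/themes/mytheme/style.css",
    "https://example.com/wp-includes/js/jquery/jquery.js",
):
    status = "allowed" if parser.can_fetch("Googlebot", url) else "blocked"
    print(url, "->", status)
```

If either asset prints “blocked”, your robots.txt is hiding resources Google needs to render the page.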
Hopefully this solution will work for you.
UPDATE: Apparently this CSS and JS issue is responsible for some Google Panda shake-downs too: Google Panda 4, and blocking your CSS & JS.