Google Warned Webmaster Users That Its Googlebot Can’t Crawl Blocked CSS & JavaScript


Google sent out special notifications through Search Console (formerly Webmaster Tools) last week, telling users about blocked CSS and JavaScript assets.

The warning indicated that blocking those assets can lead to “suboptimal rankings” in Google, since the search engine analyses fully rendered pages and cannot properly understand a site whose CSS and JavaScript it cannot fetch. It is important for those receiving the notification to note, however, that this is not a penalty.

Google actually informed webmasters back in October 2014 about the harm that can come from blocking CSS and JS, but notifications of this kind had never been sent before.

The notification itself provides some guidance on fixing the issue: use the “Fetch as Google” feature to identify resources blocked by robots.txt directives, revise robots.txt to remove those restrictions, and resubmit the updated file through Search Console.
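For context, a robots.txt file like the following (the directory names are hypothetical examples) is the kind of configuration that triggers the warning, because it prevents Googlebot from fetching the site’s scripts and stylesheets:

User-agent: *
Disallow: /js/
Disallow: /css/

Removing (or overriding) such Disallow rules for asset directories is what the guidance above amounts to in practice.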

Is there any solution?

Fortunately, getting around the problem is straightforward, as Google only checks for locally hosted, embedded JS and CSS that is blocked: simply add the following directives to your robots.txt file.

Here’s the code for your robots.txt file if you received the warning:

User-agent: Googlebot
Allow: /*.js
Allow: /*.css
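To see why these patterns work, it helps to know how Googlebot interprets them: `*` matches any run of characters in the URL path, and a trailing `$` anchors the pattern to the end of the path. The sketch below (a hypothetical helper, not a Google tool) implements that documented matching behaviour so you can sanity-check a rule against your own asset URLs:

```python
import re

def google_rule_matches(pattern: str, path: str) -> bool:
    """Check whether a robots.txt path pattern matches a URL path,
    using Google-style wildcards: '*' matches any run of characters,
    and a trailing '$' anchors the match to the end of the path."""
    anchored = pattern.endswith("$")
    if anchored:
        pattern = pattern[:-1]
    # Escape regex metacharacters, then restore each '*' as '.*'
    regex = ".*".join(re.escape(part) for part in pattern.split("*"))
    if anchored:
        regex += "$"
    # re.match anchors at the start of the path, as robots rules do
    return re.match(regex, path) is not None

print(google_rule_matches("/*.js", "/assets/app.js"))     # True
print(google_rule_matches("/*.css", "/static/site.css"))  # True
print(google_rule_matches("/*.js$", "/app.js?v=2"))       # False: '$' excludes query strings
```

Note the last case: adding `$` would stop the rule from matching script URLs that carry cache-busting query strings, which is why the unanchored `Allow: /*.js` form is the safer choice here.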