SEO: Google Webmaster Guidelines Updated, Says Don't Block JavaScript or CSS


Google has updated its technical Webmaster Guidelines to reflect its evolving indexing system, warning webmasters that they could face 'suboptimal rankings' if they prevent Google's bots from crawling CSS or JavaScript files.

Recently, the search giant has taken several big steps towards treating user experience (UX) as a ranking factor, including strongly hinting that it will start using mobile UX when calculating rankings.

This latest Webmaster Guidelines change, announced in a Webmaster Central blog post on Monday, appears to be driven by similar aims. From the blog post:

"We recently announced that our indexing system has been rendering web pages more like a typical modern browser, with CSS and JavaScript turned on.

"For optimal rendering and indexing, our new guideline specifies that you should allow Googlebot access to the JavaScript, CSS and image files your pages use.

"Disallowing crawling of JavaScript or CSS files in your site's robots.txt directly harms how well our algorithms render and index your content."
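In practice, this means auditing your robots.txt file for rules that block asset directories. A minimal sketch of the idea (the directory paths here are hypothetical examples, not Google-recommended values):

```
# Hypothetical robots.txt sketch - path names are illustrative only.
#
# A blanket rule like the following would hide CSS and JavaScript
# from Googlebot and, per the updated guidelines, harm rendering:
#
#   User-agent: *
#   Disallow: /assets/
#
# Instead, keep script, stylesheet and image paths crawlable while
# still blocking genuinely private areas:

User-agent: *
Allow: /assets/css/
Allow: /assets/js/
Allow: /assets/images/
Disallow: /admin/
```

Google's Fetch as Google tool (in Webmaster Tools) can be used to check how Googlebot renders a page with the current robots.txt rules applied.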

The post makes several more recommendations. These are presented in the context of making it easier for Googlebot to crawl sites, but could equally be considered best-practice guidance for UX:

  • Use progressive enhancement principles to ensure Googlebot's limitations don't get in the way of it 'seeing' usable content.
  • "Optimise the serving of your CSS and JavaScript files" and "eliminate unnecessary downloads" to improve page load speed.
  • Ensure your server is able to handle the extra load necessary to serve CSS and JavaScript files to Googlebot.

Internet Marketing News from ClickThrough - the conversion rate optimisation experts.
