Let Googlebot crawl JavaScript and CSS

by Yves Dagenais on April 26 2012

Allowing Googlebot to crawl your JavaScript and CSS content is a good thing, explains Matt Cutts, head of Google's Webspam team.

Matt's explanation is that by blocking JavaScript and CSS in your robots.txt, you effectively limit Googlebot's ability to understand and navigate the content on your site. This is especially true for today's complex web sites, which use JavaScript and CSS ingenuity to create wonderful user experiences. Here is a link to Matt's PSA:
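As a sketch of what such blocking looks like (the directory paths here are illustrative, not from Matt's video), a robots.txt with rules like these keeps Googlebot from fetching the assets it needs to render pages the way a browser does; removing the Disallow lines lets it see the page as users do:

```
# Rules like these prevent Googlebot from fetching scripts and styles,
# so it cannot render the page as a visitor would -- avoid them:
User-agent: Googlebot
Disallow: /js/
Disallow: /css/
```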

While Googlebot wants to understand your JavaScript and CSS, it may not always succeed. So keep an eye on your Google Webmaster Tools dashboard for any crawl errors about broken links that Googlebot might have misinterpreted. This is especially important if you generate links dynamically with JavaScript.
