Googlebot Cannot Access CSS and JS Files [Fix]


Do not panic if you have received a warning from Google in your email today.

Google is sending out a new notification through Google Search Console to websites whose robots.txt files appear to block its access to their CSS and JavaScript assets.

This is what their email says:

Google Search Console

Googlebot cannot access CSS and JS files on http://www.seolinkbuilding.org/

To: Webmaster of http://www.seolinkbuilding.org/,

Google systems have recently detected an issue with your homepage that affects how well our algorithms render and index your content. Specifically, Googlebot cannot access your JavaScript and/or CSS files because of restrictions in your robots.txt file. These files help Google understand that your website works properly, so blocking access to these assets can result in sub-optimal rankings.

This warning is based on Google's technical Webmaster Guidelines, which were updated on their blog on October 27, 2014. Here is an excerpt:

We recently announced that our indexing system has been rendering web pages more like a typical modern browser, with CSS and JavaScript turned on. Today, we’re updating one of our technical Webmaster Guidelines in light of this announcement.

For optimal rendering and indexing, our new guideline specifies that you should allow Googlebot access to the JavaScript, CSS, and image files that your pages use. This provides you optimal rendering and indexing for your site. Disallowing crawling of Javascript or CSS files in your site’s robots.txt directly harms how well our algorithms render and index your content and can result in suboptimal rankings.

Read the full blog post, "Updating our technical Webmaster Guidelines", on the Google Webmaster Central Blog.

WordPress: This is what a sample WordPress robots.txt file looks like:

User-agent: *
Disallow: /cgi-bin/
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /recommended/
Disallow: /comments/feed/
Disallow: /trackback/
Disallow: /index.php
Disallow: /xmlrpc.php
Disallow: /wp-content/plugins/
Disallow: /wp-content/cache/
Disallow: /wp-content/themes/

As you can see, this file blocks Googlebot from accessing the folders that contain JavaScript and CSS files, so you need to delete those lines.

Delete these lines from your robots.txt file:

Disallow: /wp-includes/
Disallow: /wp-content/plugins/
Disallow: /wp-content/cache/
Disallow: /wp-content/themes/

Your new robots.txt file should look like this:

User-agent: *
Disallow: /cgi-bin/
Disallow: /wp-admin/
Disallow: /recommended/
Disallow: /comments/feed/
Disallow: /trackback/
Disallow: /index.php
Disallow: /xmlrpc.php

This should solve the issue.
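Before uploading the edited file, you can sanity-check it locally with Python's built-in urllib.robotparser module. The robots.txt contents below are abbreviated versions of the files above, and the URL is a hypothetical example:

```python
from urllib.robotparser import RobotFileParser

# Abbreviated robots.txt contents, before and after the fix
blocked = """\
User-agent: *
Disallow: /wp-includes/
Disallow: /wp-content/plugins/
Disallow: /wp-content/themes/
"""
fixed = """\
User-agent: *
Disallow: /wp-admin/
"""

def allows(robots_txt, url, agent="Googlebot"):
    """Return True if the given robots.txt text lets `agent` fetch `url`."""
    rp = RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return rp.can_fetch(agent, url)

css = "https://example.com/wp-content/themes/mytheme/style.css"
print(allows(blocked, css))  # False: the theme's CSS is blocked
print(allows(fixed, css))    # True once the Disallow lines are removed
```

Note that urllib.robotparser does simple prefix matching, which is enough for plain directory rules like these; it does not understand Google's `*` and `$` wildcards.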

Joomla: Joomla users can remove the /images/, /media/ and /templates/ lines from their robots.txt file. However, this issue only arises on older versions of Joomla; as of Joomla 3.3 the default robots.txt no longer blocks these folders.

Joomla Robots.txt File

User-agent: *
Disallow: /administrator/
Disallow: /bin/
Disallow: /cache/
Disallow: /cli/
Disallow: /components/
Disallow: /includes/
Disallow: /installation/
Disallow: /language/
Disallow: /layouts/
Disallow: /libraries/
Disallow: /logs/
Disallow: /modules/
Disallow: /plugins/
Disallow: /tmp/
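You can run the same kind of local check against the Joomla file. The snippet below parses an abbreviated copy of the default file above (the example URLs are hypothetical) and confirms that asset folders such as /media/ and /templates/ are crawlable because no Disallow rule matches them:

```python
from urllib.robotparser import RobotFileParser

# Abbreviated Joomla 3.3 default robots.txt; note that /images/,
# /media/ and /templates/ are absent from the Disallow rules.
joomla_robots = """\
User-agent: *
Disallow: /administrator/
Disallow: /cache/
Disallow: /includes/
""".splitlines()

rp = RobotFileParser()
rp.parse(joomla_robots)

print(rp.can_fetch("Googlebot", "https://example.com/media/jui/js/jquery.min.js"))            # True
print(rp.can_fetch("Googlebot", "https://example.com/templates/protostar/css/template.css"))  # True
print(rp.can_fetch("Googlebot", "https://example.com/administrator/index.php"))               # False
```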

Drupal: Drupal users can add the following lines to their robots.txt file:

User-agent: *
# JS/CSS
Allow: /core/*.css
Allow: /core/*.js
Allow: /profiles/*.css
Allow: /profiles/*.js
# Images
Allow: /core/*.gif
Allow: /core/*.jpg
Allow: /core/*.png
Allow: /core/*.svg
Allow: /profiles/*.gif
Allow: /profiles/*.jpeg
Allow: /profiles/*.jpg
Allow: /profiles/*.png
Allow: /profiles/*.svg
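Why does adding Allow lines help when broader Disallow rules stay in the file? Googlebot applies the most specific (longest) matching rule to each URL, and Allow wins ties. Here is a simplified Python sketch of that precedence logic, assuming a hypothetical `Disallow: /core/` rule like the one in Drupal's default file:

```python
import re

def matches(pattern, path):
    """Match a robots.txt pattern: '*' is a wildcard, '$' anchors the end."""
    regex = re.escape(pattern).replace(r"\*", ".*").replace(r"\$", "$")
    return re.match(regex, path) is not None

def is_allowed(rules, path):
    """Pick the longest matching rule; Allow wins ties; no match means allowed."""
    best = None
    for kind, pattern in rules:
        if matches(pattern, path):
            if (best is None or len(pattern) > len(best[1])
                    or (len(pattern) == len(best[1]) and kind == "allow")):
                best = (kind, pattern)
    return best is None or best[0] == "allow"

rules = [
    ("disallow", "/core/"),        # broad block (hypothetical default rule)
    ("allow", "/core/*.css"),      # longer, more specific rules win
    ("allow", "/core/*.js"),
]
print(is_allowed(rules, "/core/misc/drupal.js"))  # True: Allow rule is longer
print(is_allowed(rules, "/core/install.php"))     # False: only the Disallow matches
```

This is a sketch of the matching behavior, not a full robots.txt parser; for authoritative answers, use Google's own tester described below.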

If you are still not sure which lines to remove from your robots.txt file, test it with the robots.txt Tester tool in Google Search Console and remove the lines that are causing errors.

Robots.txt Tester
