

Googlebot cannot access CSS and JS files – How to resolve this crawl error?

In Google Search Console, you may see the message 'Googlebot cannot access CSS and JS files'.

In order to render a page the way a visitor sees it, Googlebot needs to view it along with the accompanying CSS and JavaScript files.

However, if Google cannot load them, it will cause errors in Google Search Console’s coverage report.

Here at Ibmi Media, as part of our Server Management Services, we regularly help our Customers to resolve related WordPress queries.

In this context, we shall look into how to fix the "Googlebot cannot access CSS and JS files" error on the WordPress site.


Nature of the error, 'Googlebot cannot access CSS and JS files'

Google focuses on providing better rankings to user-friendly websites.

To evaluate the user experience, Google needs to be able to access the site's CSS and JavaScript files.

By default, WordPress does not block search bots from accessing any CSS or JS files.

However, we may accidentally block them while adding extra security measures or when we use a WordPress security plugin.

The major cause of this error is the accidental blocking of these resources using a .htaccess file or robots.txt.

This restricts Googlebot from crawling CSS and JS files, which may affect the site's SEO performance.
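
For illustration only, a hypothetical over-aggressive hardening rule like the following in .htaccess (Apache 2.4 syntax) would deny everyone, including Googlebot, direct access to stylesheets and scripts:

# Hypothetical hardening rule: blocks ALL visitors, Googlebot included,
# from requesting any .css or .js file directly.
<FilesMatch "\.(css|js)$">
    Require all denied
</FilesMatch>

A rule of this kind is sometimes added to "protect" theme files, but it makes the page impossible for search bots to render.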


How to resolve the error, 'Googlebot cannot access CSS and JS files'?

To fix this issue, we start by locating the website's root folder and making sure that static resources are not blocked there.

If they are, we need to find the files Google is unable to access on our website.

To see how Googlebot sees our website, click on Crawl » Fetch as Google in Google Search Console.

Then, click on the 'Fetch and Render' button.

Once done, the results will appear side by side, showing what a user sees and what Googlebot sees when it loads our site.

Any difference between the two means that Googlebot is unable to access some CSS/JS files. In addition, we can see the links to the CSS and JS files it could not access.

To find a list of blocked resources, go to Google Index » Blocked Resources.

Each of them will show the links to actual resources that Googlebot cannot access.

Mostly, it will be the CSS styles and JS files added by our WordPress plugins or theme.
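
To double-check those URLs from our side, we can test them against the live robots.txt using Python's standard library. This is a minimal sketch; example.com, mytheme, and the asset paths are placeholders for your own site:

from urllib.robotparser import RobotFileParser

# Download and parse the site's live robots.txt
rp = RobotFileParser("https://example.com/robots.txt")
rp.read()

# Test the asset URLs that Search Console reported as blocked
for url in (
    "https://example.com/wp-includes/js/jquery/jquery.js",
    "https://example.com/wp-content/themes/mytheme/style.css",
):
    allowed = rp.can_fetch("Googlebot", url)
    print("allowed" if allowed else "BLOCKED", url)

Any URL printed as BLOCKED is one that robots.txt forbids Googlebot from fetching.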

In such a case, we need to edit the site’s robots.txt file which controls what Googlebot sees.

The robots.txt file lives in the site's root directory; to edit it, we connect to our site using an FTP client.

If we use the Yoast SEO plugin, we can edit it from the SEO » Tools page by clicking on File Editor.


For instance, below we see the site has disallowed access to some WordPress directories:

User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/plugins/
Disallow: /wp-content/themes/

Now we need to remove the lines that block Google’s access to CSS or JS files on our site’s front-end.

Typically, these will be in the plugins or themes folders. In addition, we need to remove the Disallow rule for wp-includes, which blocks core files such as jQuery.
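
After removing those lines, the example robots.txt above would be reduced to something like this:

User-agent: *
Disallow: /wp-admin/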

Sometimes the robots.txt file is either empty or does not exist at all. If Googlebot does not find a robots.txt file, it crawls and indexes all files.
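
An explicitly permissive robots.txt has the same effect; a Disallow directive with no value blocks nothing:

User-agent: *
Disallow: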

At times, a few WordPress hosting providers may proactively block bot access to default WordPress folders.

We can override this via:

User-agent: *
Allow: /wp-includes/js/
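
If the host blocks the plugin and theme folders as well (an assumption about that particular setup), the same approach extends to those paths:

User-agent: *
Allow: /wp-includes/js/
Allow: /wp-content/plugins/
Allow: /wp-content/themes/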

Once done, save the robots.txt file.


Finally, go back to the Fetch as Google tool and click on the 'Fetch and Render' button again. Then compare the fetch results to confirm that the issue is resolved.


[Need help with fixing WordPress errors? We are here for you.]


Conclusion

This guide covers the website crawl error, 'Googlebot cannot access CSS and JS files'. Googlebot and other search spiders fetch your website's robots.txt file before they crawl anything else, and every request they make is also filtered by the rules in your .htaccess file.

The .htaccess file contains server rules to block IP addresses, redirect URLs, enable gzip compression, and so on. The robots.txt file contains a set of rules for the search engines.
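
For illustration, typical .htaccess rules of that kind look like this (a sketch in Apache 2.4 syntax; the IP address and URLs are placeholders):

# Block a single IP address
<RequireAll>
    Require all granted
    Require not ip 203.0.113.10
</RequireAll>

# Redirect an old URL to a new one
Redirect 301 /old-page/ /new-page/

# Enable gzip compression for text assets
<IfModule mod_deflate.c>
    AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>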

These two files are usually the reason you received the "Googlebot cannot access CSS and JS files" warning.

robots.txt contains a few lines that either block or allow the crawling of files and directories. Google has started penalizing websites that block the crawling of JS and CSS files.

JavaScript and Cascading Style Sheets are responsible for rendering your website; they handle forms, fire events, and so on.

If the JS is blocked, Googlebot will not be able to crawl the code and may treat it as spam or a violation of its link scheme guidelines.

The same logic applies to the CSS files.


To resolve "Googlebot Cannot Access CSS And JS Files" Warning:

1. Remove the following line from robots.txt: Disallow: /wp-includes/

Depending on how your robots.txt file is configured, this will fix most of the warnings.

You will most likely see that your site has disallowed access to some WordPress directories like this:

User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/plugins/
Disallow: /wp-content/themes/

2. You can override this in robots.txt by allowing access to blocked folders:

User-agent: *
Allow: /wp-includes/js/
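
Once the robots.txt changes are saved, we can also verify that the server itself, for example an .htaccess rule, no longer blocks Googlebot's user agent from a stylesheet or script. Below is a minimal sketch using Python's standard library; the domain and asset path are placeholders:

import urllib.error
import urllib.request

# Placeholder URL: substitute a real CSS or JS file from your site
req = urllib.request.Request(
    "https://example.com/wp-content/themes/mytheme/style.css",
    headers={"User-Agent": "Googlebot"},
)
try:
    with urllib.request.urlopen(req) as resp:
        print("HTTP", resp.status)  # 200 means the asset is reachable
except urllib.error.HTTPError as err:
    print("Blocked with HTTP", err.code)

An HTTP 200 response means the asset is reachable; a 403 points to a server-level block rather than robots.txt.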