In Google Search Console, you may see the message 'Googlebot cannot access CSS and JS files'. When Google cannot load these resources, it causes errors in Search Console's coverage report.
Here at Ibmi Media, as part of our Server Management Services, we regularly help our Customers to resolve related WordPress queries.
In this context, we shall look into how to fix the "Googlebot cannot access CSS and JS files" error on the WordPress site.
Google focuses on providing better rankings to user-friendly websites.
By default, WordPress does not block search bots from accessing any CSS or JS files.
However, we may accidentally block them while adding extra security measures or when using a WordPress security plugin.
The major cause of this error is the accidental blocking of these resources using a .htaccess file or robots.txt.
This will restrict Googlebot from indexing CSS and JS files which may affect the site’s SEO performance.
To fix this issue, start by locating the website's root folder and make sure static resources are not blocked there. If they are, identify the files Google is unable to access.
To see how Googlebot sees our website, click on Crawl » Fetch as Google in Google Search Console, then click the 'Fetch and Render' button. Once done, the result appears as a row; it shows side by side what a user sees and what Googlebot sees when it loads our site.
Any difference in the data means that Googlebot is not able to access CSS/JS files. In addition, we can see the links of CSS and JS files it was unable to access.
To find a list of blocked resources, go to Google Index » Blocked Resources.
Each of them will show the links to actual resources that Googlebot cannot access.
Mostly, it will be the CSS styles and JS files added by our WordPress plugins or theme.
In such a case, we need to edit the site’s robots.txt file which controls what Googlebot sees.
The robots.txt file lives in our site's root directory; to edit it, we connect to the site using an FTP client. If we use the Yoast SEO plugin, we can instead edit it from the SEO » Tools page by clicking on File Editor.
For instance, below we see the site has disallowed access to some WordPress directories:
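A robots.txt file that blocks these resources typically looks something like this (the exact paths are illustrative and depend on the site's configuration):

```
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/plugins/
Disallow: /wp-content/themes/
```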
Now we need to remove the lines that block Google’s access to CSS or JS files on our site’s front-end.
Typically, they will be in the plugins or themes folders. We also need to remove any line blocking wp-includes, which contains scripts such as jQuery.
Sometimes the robots.txt file is empty or does not exist at all. If Googlebot does not find a robots.txt file, it crawls and indexes all files.
At times, a few WordPress hosting providers may proactively block access to default WordPress folders for bots.
We can override this via:
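One way to do this, sketched below with commonly blocked paths (adjust the folders to match what your host actually blocks), is to add explicit Allow rules to robots.txt:

```
User-agent: *
Allow: /wp-includes/js/
Allow: /wp-content/plugins/
Allow: /wp-content/themes/
Disallow: /wp-admin/
```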
Once done, save the robots.txt file.
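Before waiting on Google to re-fetch the site, we can sanity-check the saved rules locally with Python's standard-library `urllib.robotparser`. The rules string below is a hypothetical example; paste in your own robots.txt content:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content -- replace with your site's actual rules.
rules = """\
User-agent: *
Allow: /wp-includes/js/
Allow: /wp-content/plugins/
Disallow: /wp-admin/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Check whether Googlebot may fetch typical CSS/JS assets.
for path in ("/wp-includes/js/jquery/jquery.min.js", "/wp-admin/admin.php"):
    allowed = parser.can_fetch("Googlebot", path)
    print(f"{path}: {'allowed' if allowed else 'blocked'}")
```

If a script or stylesheet path reports "blocked", the corresponding Disallow line still needs to be removed or overridden with an Allow rule.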
Finally, return to the Fetch as Google tool and click the 'Fetch and Render' button again, then compare the new results to verify that the issue is resolved.
This guide covers the website crawl error "Googlebot cannot access CSS and JS files". Googlebot and other search spiders request your website's robots.txt file before crawling, while the server applies its .htaccess rules to every request it serves. .htaccess holds rules to block IP addresses, redirect URLs, enable gzip compression, and so on; robots.txt holds a set of rules for search engines. Rules in these files are the reason you received "Googlebot cannot access CSS and JS files".
robots.txt contains a few lines that either block or allow crawling of files and directories. Google has started penalizing websites that block the crawling of JS and CSS files. If JS is blocked, Googlebot cannot crawl the code and cannot render the page as users see it, so hidden scripts may be treated as spam or a violation of its guidelines. The same logic applies to CSS files.
To resolve the "Googlebot Cannot Access CSS and JS Files" warning:
1. Remove the following line from robots.txt: Disallow: /wp-includes/
Depending on how your robots.txt file is configured, this will fix most of the warnings.
You will most likely see that your site has disallowed access to some WordPress directories like this:
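For example (an illustrative robots.txt; your actual paths may differ):

```
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/plugins/
```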
2. You can override this in robots.txt by allowing access to blocked folders:
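A minimal sketch of such override rules, assuming the commonly blocked WordPress folders (adjust to the folders your file actually disallows):

```
User-agent: *
Allow: /wp-includes/js/
Allow: /wp-content/plugins/
Allow: /wp-content/themes/
```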