In Google Search Console, you may see the message 'Googlebot cannot access CSS and JS files'. Google needs these files to render pages the way visitors see them, so if it cannot load them, errors appear in Search Console's coverage report. In this article, we will look at how to fix the "Googlebot cannot access CSS and JS files" error on a WordPress site.
Nature of the error: Googlebot cannot access CSS and JS files
Google focuses on providing better rankings to user-friendly websites.
By default, WordPress does not block search bots from accessing any CSS or JS files.
However, we may accidentally block them while adding extra security measures or when using a WordPress security plugin.
The major cause of this error is the accidental blocking of these resources using a .htaccess file or robots.txt.
This restricts Googlebot from crawling CSS and JS files, which may hurt the site's SEO performance.
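For instance, an overzealous .htaccess rule like the one below (a hypothetical example; the exact rules vary by setup) would return a 403 Forbidden to Googlebot for every CSS and JS request:

```apache
# Hypothetical .htaccess snippet that accidentally blocks Googlebot
# from all static assets -- rules like this should be removed.
<IfModule mod_rewrite.c>
  RewriteEngine On
  RewriteCond %{HTTP_USER_AGENT} Googlebot [NC]
  RewriteRule \.(css|js)$ - [F,L]
</IfModule>
```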
How to resolve the error: Googlebot cannot access CSS and JS files
To fix this issue, start by checking the website's root folder and make sure no static resources are blocked. If they are, identify the files Googlebot is unable to access on the website.
To see how Googlebot sees our website, click on Crawl » Fetch as Google in Google Search Console.
Then, click on the fetch and render button.
Once done, the results appear side by side, showing what a user sees and what Googlebot sees when it loads the site.
Any difference between the two means Googlebot could not access some CSS/JS files. In addition, the tool lists the URLs of the CSS and JS files it was unable to access.
To find a list of blocked resources, go to Google Index » Blocked Resources.
Each entry shows links to the actual resources that Googlebot cannot access.
Mostly, it will be the CSS styles and JS files added by our WordPress plugins or theme.
In such a case, we need to edit the site's robots.txt file, which controls what Googlebot can crawl.
To edit it, connect to the site using an FTP client; the file is in the site's root directory.
If we use the Yoast SEO plugin, we can edit it from the SEO » Tools page by clicking on File Editor.
For instance, below we see the site has disallowed access to some WordPress directories:
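A typical robots.txt of this kind (an illustrative example, not the actual file) looks like:

```text
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/plugins/
Disallow: /wp-content/themes/
```

The last three Disallow lines block the very folders that hold the theme's and plugins' CSS and JS files.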
Now we need to remove the lines that block Google’s access to CSS or JS files on our site’s front-end.
Typically, they will be in the plugins or themes folders. We also need to unblock wp-includes, which contains core scripts such as jQuery.
Sometimes the robots.txt file is empty or does not exist at all; if Googlebot finds no robots.txt file, it simply crawls all files.
At times, a few WordPress hosting providers may proactively block access to default WordPress folders for bots.
We can override this via:
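A typical override (an illustrative sketch; adjust the paths to match your site) explicitly allows the asset folders while keeping the admin area blocked:

```text
User-agent: *
Allow: /wp-includes/js/
Allow: /wp-content/plugins/
Allow: /wp-content/themes/
Disallow: /wp-admin/
```

For Googlebot, the most specific (longest) matching rule wins, so these Allow lines take precedence over any broader Disallow rules covering the same folders.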
Once done, save the robots.txt file.
Finally, revisit the Fetch as Google tool and click the fetch and render button again. Compare the new results to confirm the error is resolved.