This guide covers the website crawl warning "Googlebot cannot access CSS and JS files." Googlebot and other search spiders visit your website's robots.txt file right after the server processes the .htaccess file.
The .htaccess file contains rules to block IP addresses, redirect URLs, enable gzip compression, and so on. The robots.txt file likewise holds a set of rules for search engines.
Those rules are the reason you received the "Googlebot Cannot Access CSS and JS files" warning.
Robots.txt contains a few lines that either block or allow crawling of specific files and directories. Google has started penalizing websites that block the crawling of JS and CSS files.
If the JS is blocked, Googlebot cannot crawl the code, and it may treat the page as spam or as a violation of its link-scheme guidelines.
The same logic applies to CSS files.
To resolve the "Googlebot Cannot Access CSS And JS Files" warning:
1. Remove the following line from robots.txt: Disallow: /wp-includes/
Depending on how your robots.txt file is configured, this will fix most of the warnings.
You will most likely find that your site has disallowed access to some WordPress directories, like this:
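As a sketch of what such a file often looks like, a typical default WordPress robots.txt that triggers this warning might contain entries along these lines (the exact paths depend on your setup):

```
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
```

The Disallow: /wp-includes/ line is the usual culprit, since WordPress core serves many JS and CSS files from that directory.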
2. Alternatively, you can override this in robots.txt by explicitly allowing access to the blocked folders:
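One commonly used approach, sketched below, is to add wildcard Allow rules for Googlebot so that style and script files remain crawlable even while the directories themselves stay disallowed (the directory paths shown are the usual WordPress defaults; adjust them to match your site):

```
User-agent: Googlebot
Allow: /wp-includes/*.js
Allow: /wp-includes/*.css
Allow: /wp-content/*.js
Allow: /wp-content/*.css
```

Google's crawler supports the * wildcard in robots.txt rules. After editing the file, you can verify the result with the robots.txt testing tool in Google Search Console.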