This Might Be the Reason Your Website Ranks Poorly on Search Engines

31 August 2015

Dublin: Do you have a Google Webmaster Tools account (recently renamed Search Console)? If so, check your inbox and don't be surprised if you find an email from Google saying that Googlebot is unable to access your CSS and JS files. And you know what? This might also be a reason your site ranks poorly on search engines. Googlebot is Google's main web-crawling bot, also known as a 'spider'. Essentially, Googlebot collects documents from across the web and builds the searchable index behind the Google search engine.

So why is Googlebot unable to access your CSS and JavaScript files? What happens if it can't access them? And how do you make those files accessible? Here is the solution.

Firstly, Googlebot cannot access your CSS and JS files because of restrictions in your robots.txt file. If you block these assets, Google cannot render and understand your site fully, which ultimately results in sub-optimal rankings on the search engine. Some webmasters see this as part of Google's new focus on responsive design, but that might not be the only reason.
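For example, a robots.txt of the kind many WordPress sites used at the time blocks the very directories that hold theme and plugin scripts and stylesheets. The paths below are purely illustrative; check your own file for similar Disallow rules:

    User-agent: *
    Disallow: /wp-includes/
    Disallow: /wp-content/plugins/
    Disallow: /wp-content/themes/

Any CSS or JS file living under those directories becomes invisible to Googlebot, even though visitors' browsers load it normally.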

Google is sending out a wave of warnings

Google has found that many sites restrict access to these assets through their robots.txt files, and it is now sending out a wave of warnings via Google Search Console (formerly known as Google Webmaster Tools) to notify webmasters that Googlebot cannot access the CSS and JS files on their sites because of those robots.txt restrictions.

Understandably, many webmasters are concerned about their sites after receiving the warning. Whether you use Joomla, WordPress or any other content management system, you need to make sure Googlebot has access to your CSS and JS files.

How to make CSS and JS files accessible?

  • Log in to your Google Search Console account and navigate to your site's dashboard. Click on Google Index and then on Blocked Resources, and check whether Search Console is reporting any blocked resources for your site.
     
  • Under the Host column, click on your domain name to see all the files that are blocked for search engine bots. If those entries are resources you have deliberately blocked to steer crawling, you don't have to worry about them.
     
  • But if you find .js and .css files that are essential for displaying the site, you need to edit the robots.txt file of your website or blog, as sketched after this list. This applies to WordPress blogs and other popular CMSs alike.
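One widely used fix is to explicitly allow stylesheets and scripts for Googlebot, either by removing the offending Disallow lines or by adding Allow rules such as the ones below. This is only a sketch; the exact rules depend on your CMS and directory layout, so test any change with the robots.txt Tester in Search Console:

    User-agent: Googlebot
    Allow: /*.css
    Allow: /*.js

Googlebot understands Allow directives and wildcard patterns, and when rules conflict it generally follows the less restrictive one, so these lines let it fetch stylesheets and scripts even from otherwise blocked directories.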

As soon as you unblock the CSS and JavaScript files, you can submit the URL to be crawled again or simply wait for Googlebot to recrawl it on its own.
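Before resubmitting, you may want to double-check locally that the updated robots.txt no longer blocks Googlebot from a given asset. Here is a minimal sketch using Python's standard library; the domain and file path are hypothetical placeholders, and note that Python's parser follows the classic robots.txt rules and ignores Googlebot's wildcard extensions, so treat the result as a rough check only:

    from urllib.robotparser import RobotFileParser

    # Hypothetical URLs; replace with your own domain and a real CSS/JS asset.
    robots_url = "https://www.example.com/robots.txt"
    asset_url = "https://www.example.com/wp-content/themes/mytheme/style.css"

    parser = RobotFileParser()
    parser.set_url(robots_url)
    parser.read()  # download and parse the live robots.txt

    if parser.can_fetch("Googlebot", asset_url):
        print("Googlebot is allowed to fetch", asset_url)
    else:
        print("Googlebot is still blocked from", asset_url)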

Are you planning to hire web developers in Dublin? Fortune Innovations is one of the leading web development companies in Dublin, with considerable experience in the field. Our skilled developers keep up to date with current trends in web development. Join hands with us if you have any requirements in SEO, mobile apps, web design or web development.
