Google Made Testing Robots.txt Files Easier


Google, in a recent post on its Webmaster Central blog, has announced an updated tool for testing robots.txt files, making it easier for webmasters to handle the chore of deciding "to crawl, or not to crawl". Finding the directives within a large robots.txt file that are or were blocking individual URLs can be quite tricky, so the search engine giant has shipped an update that makes the task a lot easier. The tool can be found under the Crawl section of Google Webmaster Tools (GWT).

Asaph Arnon, a member of the Webmaster Tools team, writes that one just needs to enter a list of URLs and the tool will show whether each of them is allowed or disallowed for crawling. "If Googlebot sees a 500 server error for the robots.txt file, we'll generally pause further crawling of the website." The tool also lets webmasters edit the file and review older versions of it.
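The check the tester performs is conceptually the same allow/disallow matching that any robots.txt parser carries out. As a rough illustration only, not Google's implementation, the short Python sketch below uses the standard-library urllib.robotparser to test a few hypothetical URLs against a sample robots.txt; the rules, domain, and URLs are made up for the example, and Googlebot's actual matching behavior differs in some details.

```python
# Illustrative sketch only: checks hypothetical URLs against a sample
# robots.txt using Python's standard-library parser. This is not Google's
# tool, and Googlebot applies extra matching rules this parser does not model.
from urllib.robotparser import RobotFileParser

SAMPLE_ROBOTS_TXT = """\
User-agent: *
Disallow: /private/
Disallow: /tmp/
"""

parser = RobotFileParser()
parser.parse(SAMPLE_ROBOTS_TXT.splitlines())

# Example URLs (hypothetical), each reported as allowed or disallowed,
# similar to the verdict the tester shows for every URL entered.
for url in (
    "http://example.com/index.html",
    "http://example.com/private/report.pdf",
    "http://example.com/tmp/cache.html",
):
    verdict = "allowed" if parser.can_fetch("Googlebot", url) else "disallowed"
    print(verdict, url)
```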

After testing changes, webmasters can upload the revised robots.txt file to their server. Google also recommends double-checking the robots.txt files of existing sites and combining the tester with other Webmaster Tools features, such as the updated Fetch as Google tool, to render important pages on a website.
