The robots.txt file tells search engine crawlers which parts of your site they can and cannot access. It sits at the root of the site (e.g. example.com/robots.txt) and helps to regulate bots and crawlers, both to stop them overloading your site with requests and to keep them away from content you do not want crawled. It is worth noting that it controls crawling rather than indexing: a page blocked in robots.txt can still end up indexed if other sites link to it.
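As a quick illustration, here is a minimal robots.txt sketch. The User-agent, Disallow and Allow directives are standard parts of the Robots Exclusion Protocol, but the paths and bot name shown here are hypothetical examples, not recommendations for any particular site:

```
# Rules that apply to all crawlers
User-agent: *
# Hypothetical example paths: block internal search and checkout pages
Disallow: /search
Disallow: /checkout/
# A more specific Allow overrides a broader Disallow
Allow: /checkout/help.html

# Stricter rules for one named (hypothetical) crawler
User-agent: ExampleBot
Disallow: /
```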

We at PushON like to describe this file as the highway code for a site, with the sitemap as the map used to find your way around. Used together, these two files keep search engines focused on the content that matters.
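The two files can even be linked directly: the widely supported Sitemap directive declares the sitemap's location inside robots.txt, and can appear anywhere in the file (the URL below is a placeholder):

```
Sitemap: https://www.example.com/sitemap.xml
```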

One important thing to remember is that the robots.txt file is advisory rather than mandatory: it is up to each crawler to obey its rules. Reputable crawlers such as Googlebot honour it, but malicious bots are free to ignore it, so it should never be relied on to protect sensitive content.