The robots.txt file tells search engine bots whether a webpage or directory on your site may be crawled and indexed. A simple robots.txt file looks like this:

User-agent: *
Disallow: /abc/

In detail:
User-agent: * means this section applies to all robots.
Disallow: /abc/ means that robots should not visit any pages in the /abc/ folder.
If you have one or more XML sitemaps, you can also list them in the robots.txt file. Below is an example of how multiple XML sitemaps can be added to robots.txt:
User-agent: *
Disallow: /includes/
Disallow: /scripts/
Sitemap: http://www.yoursite.com/sitemap1.xml
Sitemap: http://www.yoursite.com/sitemap2.xml
Sitemap: http://www.yoursite.com/sitemap3.xml
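To see these rules in action, here is a minimal sketch using Python's built-in urllib.robotparser to check whether the rules above would allow a given URL to be crawled. The site URL and paths are placeholders from the example, not a real site:

```python
from urllib.robotparser import RobotFileParser

# The example rules from above, as a string (a real crawler would
# fetch http://www.yoursite.com/robots.txt instead).
robots_txt = """\
User-agent: *
Disallow: /includes/
Disallow: /scripts/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# "*" stands for any user agent; real bots pass their own name,
# e.g. "Googlebot". can_fetch() returns True if crawling is allowed.
blocked = parser.can_fetch("*", "http://www.yoursite.com/includes/config.php")
allowed = parser.can_fetch("*", "http://www.yoursite.com/about.html")
print(blocked)  # False: /includes/ is disallowed
print(allowed)  # True: /about.html matches no Disallow rule
```

This is how well-behaved crawlers interpret your robots.txt: any path that begins with a Disallow prefix is skipped, and everything else remains crawlable.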