A robots.txt file helps inform search engine bots whether a file or a directory on your website should be allowed or disallowed for the search index.
A typical robots.txt file will look like this:
User-agent: *
Disallow: /includes/
Disallow: /scripts/

User-agent: * - Means this section applies to all robots.
Disallow: - Means that the robot should not visit any pages in the mentioned folder.
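To sanity-check these rules before they go live, you can feed them to Python's built-in urllib.robotparser module. Below is a minimal sketch; the yoursite.com URLs are placeholders, not part of the original example.

import urllib.robotparser

# The example rules from above
robots_txt = """\
User-agent: *
Disallow: /includes/
Disallow: /scripts/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# /scripts/ is disallowed for every user agent, so this prints False
print(rp.can_fetch("*", "http://www.yoursite.com/scripts/menu.js"))

# No rule matches this path, so this prints True
print(rp.can_fetch("*", "http://www.yoursite.com/blog/post.html"))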
If you have one or more XML sitemaps, do not miss the opportunity to add them to your robots.txt file. Below is an example of how multiple XML sitemaps can be added to robots.txt:
User-agent: *
Sitemap: http://www.yoursite.com/sitemap1.xml
Sitemap: http://www.yoursite.com/sitemap2.xml
Sitemap: http://www.yoursite.com/sitemap3.xml
Disallow: /includes/
Disallow: /scripts/
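If you want to confirm that a live robots.txt actually exposes these Sitemap entries, Python 3.8+ can read them back with the site_maps() method of the same urllib.robotparser module. A hedged sketch, again using the placeholder domain from the example:

import urllib.robotparser

# Placeholder URL; point this at your real robots.txt
rp = urllib.robotparser.RobotFileParser()
rp.set_url("http://www.yoursite.com/robots.txt")
rp.read()  # fetches and parses the live file over HTTP

# site_maps() returns the Sitemap URLs as a list, or None if there are none
print(rp.site_maps())
# Expected for the example above:
# ['http://www.yoursite.com/sitemap1.xml', 'http://www.yoursite.com/sitemap2.xml', 'http://www.yoursite.com/sitemap3.xml']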
Related Post: Everything about Meta Robots and robots.txt