A lot of people look for example robots.txt files. Rightfully so, because robots.txt files are easy to mess up.
In this article we’ll cover a simple example robots.txt file and a slightly more advanced one.
A simple robots.txt file
Here’s an example of a simple robots.txt file that:
- allows all crawlers access
- lists the XML sitemap
User-agent: *
Disallow:
Sitemap: https://www.example.com/sitemap.xml

A slightly more advanced robots.txt file
Here’s an example of a slightly more advanced robots.txt file that:
- prevents Googlebot from accessing /shoes/
- prevents Bingbot from accessing /socks/
- lists the XML sitemap
User-agent: Googlebot
Disallow: /shoes/
User-agent: Bingbot
Disallow: /socks/
Sitemap: https://www.example.com/sitemap.xml
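If you want to sanity-check rules like these before deploying them, Python’s standard-library `urllib.robotparser` can parse a robots.txt file and answer “may this crawler fetch this URL?” questions. Here’s a small sketch that inlines the advanced example above (the example.com URLs are the article’s placeholders, not a real site):

```python
from urllib import robotparser

# The advanced robots.txt from this article, inlined as a string for testing.
robots_txt = """\
User-agent: Googlebot
Disallow: /shoes/

User-agent: Bingbot
Disallow: /socks/

Sitemap: https://www.example.com/sitemap.xml
"""

rp = robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# Googlebot is blocked from /shoes/ but not from /socks/.
print(rp.can_fetch("Googlebot", "https://www.example.com/shoes/"))  # False
print(rp.can_fetch("Googlebot", "https://www.example.com/socks/"))  # True

# Bingbot is blocked from /socks/ only.
print(rp.can_fetch("Bingbot", "https://www.example.com/socks/"))    # False
```

Note that because this file has no `User-agent: *` group, crawlers other than Googlebot and Bingbot match no rules at all and are allowed everywhere, which is usually what you want for targeted blocks like these.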
