Robots.txt Generator
Generate robots.txt files with user-agent rules, allow/disallow paths, and sitemap directives.
Quick Presets
Rule Group 1 (default)
No Disallow rules (everything is allowed).
Host: used by Yandex to specify the preferred domain.
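If the Host field is filled in, the generator would presumably emit a directive of this form (www.example.com is a placeholder domain):
# Placeholder value for illustration
Host: www.example.com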
Generated robots.txt
User-agent: *
Allow: /
What is Robots.txt?
The robots.txt file tells search engine crawlers which pages they can or cannot access on your site.
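For example, a minimal file that keeps all crawlers out of a hypothetical /admin/ directory while leaving the rest of the site open looks like this:
# /admin/ is a placeholder path used for illustration
User-agent: *
Disallow: /admin/
Allow: /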
How to Use
- Select user-agent
- Add Allow/Disallow rules
- Enter sitemap URL
- Use presets for quick setup
- Copy or download (sample output below)
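For instance, selecting Googlebot, disallowing a hypothetical /private/ path, and entering a sitemap URL would produce output along these lines:
# Sample output; the path and sitemap URL are placeholders
User-agent: Googlebot
Disallow: /private/
Sitemap: https://www.example.com/sitemap.xml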
Use Cases
- Control crawling
- Block private pages
- Set crawl limits
- Reference sitemaps (example below)
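A single file can cover all four cases. A sketch, assuming a hypothetical /private/ section and a crawler that honors the non-standard Crawl-delay directive (Bing does; Google ignores it):
# Illustrative values only
User-agent: *
Disallow: /private/   # block private pages
Crawl-delay: 10       # ask for 10 seconds between requests (non-standard)
Sitemap: https://www.example.com/sitemap.xml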