Robots.txt Generator
Generate robots.txt files with user-agent rules, allow/disallow paths, and sitemap directives.
Quick Presets
Rule Group 1
The default rule group has no Disallow rules, so everything is allowed.
The Host field is used by Yandex to specify the preferred domain.
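For illustration, a Yandex-specific group with a Host directive might look like this; the domain is a placeholder, and Host is a non-standard directive that only Yandex has honored:

User-agent: Yandex
Disallow:
Host: https://www.example.com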
Generated robots.txt
User-agent: *
Allow: /
What is Robots.txt?
The robots.txt file tells search engine crawlers which pages they can or cannot access on your site. It must be served from the root of your domain (for example, https://example.com/robots.txt), and compliance is voluntary: well-behaved crawlers respect it, but it is not an access-control mechanism.
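As a minimal illustration (the path is a placeholder), the following file asks every crawler to stay out of an /admin/ area while leaving the rest of the site open:

User-agent: *
Disallow: /admin/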
How to Use
- Select the user-agent the rules should apply to
- Add Allow and Disallow path rules
- Enter your sitemap URL
- Use the presets for a quick setup
- Copy or download the result (a sample of the generated file follows below)
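Following those steps for a single crawler might produce a file like this; the crawler name, paths, and domain are placeholders:

User-agent: Googlebot
Disallow: /private/
Allow: /private/press/
Sitemap: https://www.example.com/sitemap.xml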
Use Cases
- Control how crawlers traverse your site
- Block private pages from being crawled
- Set crawl limits with the Crawl-delay directive
- Reference sitemaps so crawlers can discover your URLs (all four are combined in the example below)
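A file covering all four use cases could look like this; the paths and domain are placeholders, and note that Crawl-delay is non-standard: Bing and Yandex honor it, but Google ignores it:

User-agent: *
Disallow: /admin/
Disallow: /cart/
Crawl-delay: 10
Sitemap: https://www.example.com/sitemap.xml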