
Robots.txt Generator

Generate robots.txt files with user-agent rules, allow/disallow paths, and sitemap directives.

Generated robots.txt

The default configuration adds no disallow rules, so everything is allowed:

User-agent: *
Allow: /

What is Robots.txt?

The robots.txt file, placed at your site's root, tells search engine crawlers which URLs they can and cannot access. It is part of the Robots Exclusion Protocol, standardized as RFC 9309.
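As a minimal sketch of how these rules behave in practice, Python's standard `urllib.robotparser` can evaluate a robots.txt file against candidate URLs (the rules and URLs below are hypothetical examples):

```python
from urllib import robotparser

# A hypothetical robots.txt: block /private/, allow everything else.
rules = """\
User-agent: *
Disallow: /private/
Allow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# can_fetch(user_agent, url) answers: may this crawler access this URL?
print(rp.can_fetch("*", "https://example.com/public/page.html"))   # True
print(rp.can_fetch("*", "https://example.com/private/data.html"))  # False
```

In production you would call `rp.set_url("https://example.com/robots.txt")` followed by `rp.read()` to fetch the live file instead of parsing a string.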


How to Use

  1. Select user-agent
  2. Add Allow/Disallow rules
  3. Enter sitemap URL
  4. Use presets for quick setup
  5. Copy or download
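Following the steps above, a typical generated file might look like this (the paths and sitemap URL are placeholders):

```text
User-agent: *
Disallow: /admin/
Allow: /

Sitemap: https://example.com/sitemap.xml
```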

Use Cases

  • Control crawling
  • Block private pages
  • Set crawl limits
  • Reference sitemaps
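Each use case maps to a directive. A hypothetical file combining them (note that Crawl-delay is a non-standard extension honored by some crawlers, such as Bing, but ignored by Google):

```text
User-agent: *
Disallow: /private/   # block private pages
Crawl-delay: 10       # set crawl limits (non-standard)

Sitemap: https://example.com/sitemap.xml   # reference sitemaps
```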

FAQ

What is robots.txt?
A text file at your website root that instructs crawlers which URLs they can access.
Does it block indexing?
No. robots.txt controls crawling, not indexing: a disallowed URL can still appear in search results if other sites link to it. To keep a page out of the index, use noindex instead, and make sure the page is not blocked in robots.txt, or crawlers will never see the directive.
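The noindex directive can be delivered either as an HTML meta tag or as an HTTP response header:

```text
<!-- In the page's <head> -->
<meta name="robots" content="noindex">

# Or as an HTTP response header
X-Robots-Tag: noindex
```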
Where to place it?
At your domain root: https://example.com/robots.txt