
Robots.txt Generator

Generate a robots.txt file with user-agent rules and sitemap directives.

Generated robots.txt (default: everything allowed)
User-agent: *
Allow: /

What is robots.txt?

The robots.txt file tells search engine crawlers which pages they can or cannot access on your site.
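How a compliant crawler interprets these rules can be sketched with Python's standard-library parser; the domain and paths below are placeholders, not part of any real site:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rule set: block /admin/, allow everything else
rules = [
    "User-agent: *",
    "Disallow: /admin/",
    "Allow: /",
]

rp = RobotFileParser()
rp.parse(rules)

# A compliant crawler skips blocked paths but fetches public pages
print(rp.can_fetch("*", "https://example.com/admin/login"))  # False
print(rp.can_fetch("*", "https://example.com/blog/post"))    # True
```

The same parser is what tools like Scrapy and many crawlers use under the hood, so it is a reasonable way to sanity-check a generated file before deploying it.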





How to Use

  1. Select a user-agent
  2. Add Allow/Disallow rules
  3. Enter your sitemap URL
  4. Use presets for quick setup
  5. Copy or download the result
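Following the steps above produces a file along these lines; the paths and sitemap URL are placeholders:

```text
User-agent: *
Disallow: /admin/
Allow: /

Sitemap: https://example.com/sitemap.xml
```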

Use Cases

  • Control which pages crawlers visit
  • Block private or admin pages
  • Set crawl-delay limits
  • Point crawlers to your sitemaps

FAQ

What is robots.txt?
A text file at your website root that instructs crawlers which URLs they can access.
Does it block indexing?
No, robots.txt controls crawling, not indexing. Use noindex for that.
Where to place it?
At your domain root: https://example.com/robots.txt
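As the FAQ notes, robots.txt controls crawling, not indexing. To keep a page out of search results, a noindex signal is the usual approach; a minimal sketch (the markup is illustrative):

```html
<!-- In the page's <head>: ask crawlers not to index this page -->
<meta name="robots" content="noindex">
```

Note that for a crawler to see the noindex directive, the page must not be blocked in robots.txt; a blocked page is never fetched, so the meta tag is never read.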