robots.txt Generator

Generate a robots.txt file to control how search engine crawlers access your website. Manage crawling permissions and sitemap locations.


Custom Configuration

  • Path patterns: use * as a wildcard, e.g., *.pdf or /temp/*
  • Crawl-delay: time to wait between requests, in seconds (not supported by all crawlers)
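
For example, a custom configuration that blocks PDF files and a temporary directory for all crawlers, with a 10-second delay (the paths here are placeholders), would generate:

    User-agent: *
    Disallow: /*.pdf
    Disallow: /temp/*
    Crawl-delay: 10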

About robots.txt

The robots.txt file tells search engine crawlers which pages or sections of your site they can access. It's placed in the root directory of your website (e.g., https://example.com/robots.txt).
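
A minimal robots.txt that lets every crawler access the whole site looks like this (an empty Disallow value blocks nothing):

    User-agent: *
    Disallow: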

Common Directives

  • User-agent: Specifies which crawler the rules apply to (* matches all)
  • Disallow: URL paths that should not be crawled
  • Allow: Exceptions to Disallow rules
  • Sitemap: Absolute URL of an XML sitemap
  • Crawl-delay: Seconds to wait between requests (not honored by all crawlers)
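
Putting these directives together, a file for a hypothetical site with placeholder paths might read:

    User-agent: *
    Disallow: /admin/
    Allow: /admin/help.html
    Crawl-delay: 5

    User-agent: Googlebot
    Disallow: /drafts/

    Sitemap: https://example.com/sitemap.xml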

Best Practices

  • Place robots.txt in your site's root directory
  • Use specific user-agents when needed
  • Don't use robots.txt for security (use proper authentication)
  • Include sitemap references
  • Test your rules with Google Search Console (see the local testing sketch below)

Important: robots.txt is not a security measure. Malicious crawlers can ignore it. Use proper authentication and permissions to protect sensitive content.
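
Beyond Search Console, one quick way to sanity-check rules locally is Python's standard-library urllib.robotparser. This sketch tests a few hypothetical URLs against inline rules:

    from urllib.robotparser import RobotFileParser

    # Parse rules from an inline string rather than fetching a live robots.txt.
    rules = """
    User-agent: *
    Allow: /private/help.html
    Disallow: /private/
    """

    parser = RobotFileParser()
    parser.parse(rules.splitlines())

    # can_fetch(user_agent, url) returns True if the URL may be crawled.
    print(parser.can_fetch("*", "https://example.com/private/help.html"))    # True
    print(parser.can_fetch("*", "https://example.com/private/secret.html"))  # False
    print(parser.can_fetch("*", "https://example.com/index.html"))           # True

Note that Python's parser applies rules in file order (first match wins), while Google uses the most specific matching rule, so place Allow lines before broader Disallow lines when testing this way.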
