Robots.txt Generator

Create robots.txt files to control search engine crawling and indexing.

The generator accepts the following options:
  • Sitemap URL (optional)
  • Paths to disallow (e.g. /admin/, /private/, /temp/)
  • Paths to allow (e.g. /public/, /images/, /css/)
  • Crawl delay: time between crawler requests, in seconds (0 for no delay)

About Robots.txt

The robots.txt file tells search engine crawlers which pages or files they can or cannot request from your site.
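For illustration, a simple robots.txt combining these directives might look like the sketch below (the paths and the example.com sitemap URL are placeholders, not output from this tool):

```
User-agent: *
Disallow: /admin/
Disallow: /private/
Allow: /public/
Crawl-delay: 10

Sitemap: https://example.com/sitemap.xml
```

Here `User-agent: *` applies the rules to all crawlers, `Disallow`/`Allow` grant or deny access by path prefix, and `Crawl-delay` asks compliant bots to wait between requests.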

Best Practices:
  • Place in root directory
  • Use clear directives
  • Test with Google Search Console
  • Keep it simple and specific
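Beyond Google Search Console, a robots.txt file can be sanity-checked locally with Python's standard `urllib.robotparser` module. The sketch below assumes hypothetical example.com URLs and rules for illustration:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules for illustration.
RULES = """\
User-agent: *
Disallow: /admin/
Disallow: /private/
Allow: /public/
Crawl-delay: 10
"""

parser = RobotFileParser()
parser.parse(RULES.splitlines())

# can_fetch() reports whether a given user agent may request a URL.
print(parser.can_fetch("*", "https://example.com/admin/settings"))   # disallowed
print(parser.can_fetch("*", "https://example.com/public/logo.png"))  # allowed
print(parser.crawl_delay("*"))                                       # 10
```

This catches typos in directives (an unmatched path silently defaults to "allowed") before the file goes live at the site root.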