Create robots.txt files to control how search engines crawl your website. Set user-agent rules, allow/disallow paths, crawl delays, and sitemap locations.
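For reference, a generated file might look like the following minimal sketch; the paths and sitemap URL are placeholders, not recommendations:

    User-agent: *
    Disallow: /admin/
    Allow: /admin/public/
    Crawl-delay: 10

    User-agent: Googlebot
    Disallow: /private/

    Sitemap: https://example.com/sitemap.xml

Each User-agent line starts a group of rules for that crawler; the Sitemap directive stands on its own and applies to the whole file.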
Enter your user-agent rules and allow/disallow paths.
Customize optional settings such as crawl delay and sitemap location as needed.
Click Generate, then copy the generated code or download the file.
Upload the file to your website's root directory as robots.txt.
Robots.txt is a text file that tells search engine crawlers which pages they can or cannot access on your website.
Place it in your website's root directory (e.g., https://example.com/robots.txt).
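To confirm the file is reachable, you can fetch it directly; this assumes a standard web server and uses example.com as a placeholder domain:

    curl https://example.com/robots.txt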
Robots.txt blocks crawling but not indexing: a disallowed page can still be indexed if other sites link to it. To keep a page out of search results, leave it crawlable and add a noindex meta tag (or an X-Robots-Tag response header) instead.
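For example, a page can opt out of indexing with a meta tag in its head element; this is a generic snippet, not output of the generator:

    <head>
      <meta name="robots" content="noindex">
    </head>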
Crawl-delay tells bots to wait a specified number of seconds between requests to avoid overloading your server. Support varies by crawler; Googlebot, for example, ignores this directive.
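As a sketch, this rule asks a crawler that honors the directive, such as Bingbot, to wait ten seconds between requests; the value is illustrative:

    User-agent: Bingbot
    Crawl-delay: 10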
Looking for more tools? Browse our complete collection of free conversion tools.