Robots.txt Generator

Create a robots.txt file to guide search engines on how to crawl your website.

Default Access (All Robots)
"Allow All" lets every search engine crawler access the whole site. "Disallow All" blocks all crawlers (useful for development or staging sites you don't want indexed).
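For reference, the two presets correspond to these standard robots.txt directives (an empty Disallow value permits everything; a "/" value blocks the entire site):

```
# Allow All: every crawler may access the whole site
User-agent: *
Disallow:

# Disallow All: block every crawler (e.g. on a dev or staging site)
User-agent: *
Disallow: /
```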
Sitemap (XML)
Block Specific Folders
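A generated file that combines an XML sitemap reference with a few blocked folders might look like the sketch below; the folder paths and sitemap URL are placeholders, not values the tool prescribes:

```
User-agent: *
Disallow: /admin/
Disallow: /tmp/

Sitemap: https://www.example.com/sitemap.xml
```

Each Disallow line blocks one path prefix for the crawlers matched by the preceding User-agent line, while the Sitemap line points crawlers at the full list of URLs you do want indexed.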
File Preview
robots.txt