ZeroUtil

Robots.txt Generator

Generate robots.txt files with user-agent rules, allow/disallow paths, and a sitemap URL.

Default generated robots.txt (allows all crawlers full access):

User-agent: *
Disallow:

How to Use the Robots.txt Generator

Add user-agent rules with allow and disallow paths to control how search engine crawlers access your site. Add your sitemap URL and generate a properly formatted robots.txt file.
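A generated file with one rule group and a sitemap might look like this (the domain and paths below are placeholders):

```
User-agent: *
Allow: /blog/
Disallow: /admin/

Sitemap: https://example.com/sitemap.xml
```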

Common Configurations

  • Allow all — Set User-agent to * with an empty Disallow to allow full access.
  • Block a folder — Add /private/ or /admin/ to the Disallow paths.
  • Block specific bots — Set User-agent to a specific crawler name like Googlebot or Bingbot.
  • Sitemap — Always include your sitemap URL for better crawl discovery.
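The configurations above can be combined in a single file. This is a sketch with illustrative paths, a placeholder domain, and an arbitrarily chosen crawler name:

```
# Block one specific bot from the whole site
User-agent: Bingbot
Disallow: /

# Default rules for all other crawlers
User-agent: *
Disallow: /private/
Disallow: /admin/

Sitemap: https://example.com/sitemap.xml
```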

Frequently Asked Questions

What is robots.txt?

Robots.txt is a text file placed in your site root that tells search engine crawlers which pages or sections they can or cannot access. It follows the Robots Exclusion Protocol.
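Python's standard library ships a parser for this protocol, which is a quick way to check how crawlers that honor it will interpret your rules. A minimal sketch, using illustrative rules and URLs rather than a fetched file:

```python
from urllib.robotparser import RobotFileParser

# Illustrative rules; in practice a crawler fetches yourdomain.com/robots.txt
rules = [
    "User-agent: *",
    "Disallow: /private/",
]

rp = RobotFileParser()
rp.parse(rules)

# Paths under /private/ are blocked for all user agents; everything else is allowed
print(rp.can_fetch("*", "https://example.com/private/page.html"))  # False
print(rp.can_fetch("*", "https://example.com/blog/post.html"))     # True
```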

Where do I place robots.txt?

Place it in the root directory of your website, accessible at yourdomain.com/robots.txt. It must be at the root level to be recognized by crawlers.

Does robots.txt block pages from appearing in Google?

No. Robots.txt prevents crawling, not indexing. Google may still index a blocked URL if other pages link to it. To keep a page out of the index, use a noindex meta tag, and do not block that page in robots.txt, since crawlers must be able to fetch the page to see the tag.
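The noindex directive goes in the page's HTML head (or, equivalently, in an X-Robots-Tag HTTP response header):

```
<meta name="robots" content="noindex">
```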

Can I have multiple user-agent rules?

Yes. You can define different rules for different crawlers. A crawler follows the most specific user-agent group that matches it; the wildcard (*) group applies only to crawlers with no matching specific group.
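Precedence can be sketched with two groups (the crawler name and paths are illustrative). Here Googlebot follows only its own group, so it may still crawl /private/, while all other crawlers are blocked from it:

```
User-agent: Googlebot
Disallow: /tmp/

User-agent: *
Disallow: /private/
```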
