Robots.txt Generator

Define rules that tell web crawlers which parts of your site they may access. Fill in the fields below to instantly generate your robots.txt file. (Note: robots.txt controls crawling, not indexing; a blocked page can still appear in search results if other sites link to it.)

`*` applies a rule group to all crawlers. Enter a specific bot name (such as `Googlebot`) to target only that crawler. To add rules for another crawler, type a different user-agent to start a new group.
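For example, a generated file might contain one group for all crawlers and a separate group for a single bot (the paths here are placeholders):

```
User-agent: *
Disallow: /cgi-bin/

User-agent: Googlebot
Disallow: /drafts/
```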

Paths or directories that crawlers should NOT access. Each path must start with `/`, e.g. `/wp-includes/`, `/cgi-bin/`.
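A group with several disallowed paths lists one `Disallow` line per path, for instance:

```
User-agent: *
Disallow: /wp-includes/
Disallow: /cgi-bin/
```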

Paths or files within a disallowed directory that crawlers CAN access. Use this to create exceptions.
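As a sketch of how such an exception behaves, Python's standard `urllib.robotparser` can evaluate a rule set. The file contents and URLs below are hypothetical; note that Python's parser applies rules in file order (first match wins), so the `Allow` exception is listed before the broader `Disallow`:

```python
from urllib import robotparser

# Hypothetical rules: block /private/ except one page.
# Python's parser applies rules in file order (first match wins),
# so the Allow exception precedes the broader Disallow.
rules = """\
User-agent: *
Allow: /private/public-page.html
Disallow: /private/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("*", "https://example.com/private/secret.html"))       # False
print(rp.can_fetch("*", "https://example.com/private/public-page.html"))  # True
```

Major crawlers such as Googlebot instead resolve `Allow`/`Disallow` conflicts by longest matching path, so the exception wins regardless of line order there.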

Inform crawlers about the location of your XML sitemap(s). Enter the full URL, e.g., `https://www.yourdomain.com/sitemap.xml`.
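You can list more than one sitemap; each gets its own `Sitemap` line, and the directive sits outside any user-agent group (the second URL here is a placeholder):

```
Sitemap: https://www.yourdomain.com/sitemap.xml
Sitemap: https://www.yourdomain.com/sitemap-news.xml
```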

The minimum delay, in seconds, between consecutive requests to your server. (Note: Googlebot ignores this directive; it is honored primarily by other crawlers, such as Bingbot.)
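For crawlers that honor it, the directive goes inside a user-agent group and the value is interpreted as seconds, for example:

```
User-agent: Bingbot
Crawl-delay: 10
```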

Generated robots.txt Preview:
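A complete generated file combining the directives above might look like this (all paths and URLs are placeholders):

```
User-agent: *
Disallow: /wp-includes/
Disallow: /cgi-bin/
Allow: /wp-includes/images/

User-agent: Bingbot
Crawl-delay: 10

Sitemap: https://www.yourdomain.com/sitemap.xml
```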