Web & SEO

Robots.txt Generator

Generate a basic robots.txt file with common block rules and sitemap output.

Browser-only · Instant output · Local data

Why use this tool


Assemble a starter robots.txt file for standard admin, search, and preview exclusions.

Use this when you need a clean robots.txt starting point instead of editing directives from scratch.

It keeps the common allow/disallow structure visible while you build.
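A generated starter file typically looks like the sketch below (the exact paths are illustrative; the tool's default block rules may differ):

```
User-agent: *
Disallow: /admin/
Disallow: /search
Disallow: /preview/
Allow: /

Sitemap: https://example.com/sitemap.xml
```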

How to use it

1. Enter the sitemap URL.
2. Turn the block rules on or off.
3. Copy the generated robots.txt output.
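The assembly behind those steps can be sketched as a small function (a hypothetical implementation, not the tool's actual code; rule names and default paths are assumptions):

```python
def build_robots_txt(sitemap_url=None, block_admin=True, block_search=True):
    """Assemble a starter robots.txt from a few toggles and an optional sitemap URL."""
    lines = ["User-agent: *"]
    if block_admin:
        lines.append("Disallow: /admin/")   # illustrative admin path
    if block_search:
        lines.append("Disallow: /search")   # illustrative search path
    lines.append("Allow: /")
    if sitemap_url:
        # The Sitemap directive goes on its own line, after a blank separator.
        lines.append("")
        lines.append(f"Sitemap: {sitemap_url}")
    return "\n".join(lines) + "\n"

print(build_robots_txt("https://example.com/sitemap.xml"))
```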

Quick checks before copying


Confirm the input is the format you intended.

Scan the result before using it in a document, URL, config, or message.

Copy only the output you need.

FAQ

Is this enough for every site?

No. It is a starter file that should be reviewed against your routing and crawl policy.

Can I block specific bots by name?

Yes. Set the User-agent field to a specific bot name, or use * to apply rules to all crawlers.
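For example, to block one crawler by name while leaving everything else open, give it its own User-agent group (GPTBot is shown only as an example bot name):

```
User-agent: GPTBot
Disallow: /

User-agent: *
Allow: /
```

Rules in a group apply only to the agent(s) named in that group.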

Can I add a sitemap URL to the output?

Yes. The sitemap field adds a Sitemap: directive at the end of the generated file.
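The directive is independent of any User-agent group and should use an absolute URL, for example:

```
Sitemap: https://example.com/sitemap.xml
```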

Related tools


SEO Preflight Workspace

Run page launch checks for title, description, canonical, robots, sitemap, and hreflang in one workspace.

Meta & Social Preview

Plan meta title, description, Open Graph, and social card copy together, then hand off one clean tag bundle.