Web & SEO

Robots.txt Generator

Generate a basic robots.txt file with common block rules and sitemap output.

Browser-only · Instant output · Local data

Why use this tool


Assemble a starter robots.txt file for standard admin, search, and preview exclusions.

Use this when you need a clean robots.txt starting point instead of editing directives from scratch.

It keeps the common allow/disallow structure visible while you build.
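The generated starter file typically follows this shape. The paths and sitemap URL below are illustrative assumptions, not the tool's fixed output:

```
User-agent: *
Disallow: /admin/
Disallow: /search/
Disallow: /preview/
Allow: /

Sitemap: https://example.com/sitemap.xml
```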

How to use it

1. Enter the sitemap URL.
2. Turn the block rules on or off.
3. Copy the generated robots.txt output.

Quick checks before copying


Confirm the input is in the format you intended.

Scan the result before using it in a document, URL, config, or message.

Copy only the output you need.

Frequently asked questions

Is this enough for every site?

No. It is a starter file that should be reviewed against your routing and crawl policy.

Can I block specific bots by name?

Yes. Set the User-agent field to a specific bot name, or use * to apply rules to all crawlers.
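For example, a group that names one crawler can sit alongside the wildcard group; the bot name below is just one illustrative example:

```
# Applies only to the named crawler
User-agent: GPTBot
Disallow: /

# Applies to every other crawler
User-agent: *
Allow: /
```

Crawlers that honor robots.txt use the most specific matching User-agent group, so the wildcard rules do not also apply to the named bot.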

Can I add a sitemap URL to the output?

Yes. The sitemap field adds a Sitemap: directive at the end of the generated file.
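One quick way to sanity-check the generated output before deploying it is Python's standard-library robots.txt parser. The file contents below are an illustrative sketch, not this tool's exact output:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical generated output; paths and URLs are illustrative.
robots_txt = """\
User-agent: *
Disallow: /admin/
Allow: /

Sitemap: https://example.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Check how the rules apply to specific URLs.
print(parser.can_fetch("*", "https://example.com/admin/login"))  # False
print(parser.can_fetch("*", "https://example.com/blog/post"))    # True

# The Sitemap: directive is exposed too (Python 3.8+).
print(parser.site_maps())  # ['https://example.com/sitemap.xml']
```

Running a check like this against your real routes is a lightweight way to confirm the block rules do what you expect.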

Related tools


SEO Preflight Workspace

Run page launch checks for title, description, canonical, robots, sitemap, and hreflang in one workspace.

Meta & Social Preview

Plan meta title, description, Open Graph, and social card copy together, then hand off one clean tag bundle.