Web & SEO
Robots.txt Generator
Generate a basic robots.txt file with common block rules and sitemap output.
Why use this tool
Assemble a starter robots.txt file with standard admin, search, and preview exclusions.
Use this when you need a clean robots.txt starting point instead of editing directives from scratch.
It keeps the common allow/disallow structure visible while you build.
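A starter file of the kind described above might look like the following. The paths and structure are illustrative, not the tool's exact output:

```
# Illustrative starter robots.txt — paths are examples only
User-agent: *
Disallow: /admin/
Disallow: /search
Disallow: /preview/
Allow: /
```

Review each Disallow line against your actual routing before deploying; a path that does not exist on your site is harmless, but a missing one can leave private sections crawlable.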
How to use it
Quick checks before copying
Confirm the input matches the format you intended.
Scan the result before using it in a document, URL, config, or message.
Copy only the output you need.
Frequently asked questions
Is this enough for every site?
No. It is a starter file that should be reviewed against your routing and crawl policy.
Can I block specific bots by name?
Yes. Set the User-agent field to a specific bot name, or use * to apply rules to all crawlers.
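As a sketch, a file that blocks one named crawler while leaving the site open to everyone else could look like this (GPTBot is used here only as an example of a named bot):

```
# Example: block one named crawler, allow all others
User-agent: GPTBot
Disallow: /

User-agent: *
Allow: /
```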
Can I add a sitemap URL to the output?
Yes. The sitemap field adds a Sitemap: directive at the end of the generated file.
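For instance, a generated file with the sitemap field filled in might end like this (the URL is a placeholder, not a real sitemap):

```
# Illustrative output — example.com is a placeholder host
User-agent: *
Disallow: /admin/

Sitemap: https://example.com/sitemap.xml
```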
Related tools
Crawl & Sitemap Manager→
Generate robots.txt, sitemap inventory, canonical host rules, and hreflang output in one workspace.
SEO Preflight Workspace→
Run page launch checks for title, description, canonical, robots, sitemap, and hreflang in one workspace.
Meta & Social Preview→
Plan meta title, description, Open Graph, and social card copy together, then hand off one clean tag bundle.