Web & SEO

Robots.txt Generator

Generate a basic robots.txt file with common block rules and sitemap output.

Processed in the browser. Instant output. Data stays local.

Why use this tool


Assemble a starter robots.txt file for standard admin, search, and preview exclusions.

Use this when you need a clean robots.txt starting point instead of writing directives from scratch.

It keeps the common allow/disallow structure visible while you build.

How to use

1. Enter the sitemap URL.
2. Turn the block rules on or off.
3. Copy the generated robots.txt output.
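Under the hood, a generator like this boils down to joining the enabled block rules under a User-agent line and appending the sitemap directive. A minimal sketch in Python; the rule names, paths, and function name are illustrative assumptions, not this tool's actual implementation:

```python
# Hypothetical sketch of a robots.txt assembler; rule names and
# paths are assumptions, not this tool's real block list.

COMMON_RULES = {
    "admin": "/admin/",
    "search": "/search",
    "preview": "/preview/",
}

def build_robots_txt(enabled_rules, sitemap_url=None):
    """Assemble a robots.txt string from toggled block rules."""
    lines = ["User-agent: *"]
    for name in enabled_rules:
        lines.append(f"Disallow: {COMMON_RULES[name]}")
    if not enabled_rules:
        lines.append("Disallow:")  # empty Disallow means allow everything
    if sitemap_url:
        lines.append("")  # blank line before the sitemap directive
        lines.append(f"Sitemap: {sitemap_url}")
    return "\n".join(lines) + "\n"

print(build_robots_txt(["admin", "search"], "https://example.com/sitemap.xml"))
```

Toggling a rule off simply drops its Disallow line; the sitemap directive, when present, always lands at the end of the file.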

Quick checks before copying


First confirm the input format matches what you expect.

Give the result a quick scan before using it in docs, links, configs, or messages.

Copy only the output you actually need.

Frequently asked questions

Is this enough for every site?

No. It is a starter file that should be reviewed against your routing and crawl policy.

Can I block specific bots by name?

Yes. Set the User-agent field to a specific bot name, or use * to apply rules to all crawlers.
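For example, a file that blocks one named crawler entirely while leaving all others unrestricted could look like this (GPTBot is just an illustrative bot name):

```
User-agent: GPTBot
Disallow: /

User-agent: *
Disallow:
```

Rules apply per User-agent group, so each named bot needs its own block.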

Can I add a sitemap URL to the output?

Yes. The sitemap field adds a Sitemap: directive at the end of the generated file.
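A generated file with the sitemap field filled in ends like this (the URL and path are placeholders):

```
User-agent: *
Disallow: /admin/

Sitemap: https://example.com/sitemap.xml
```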

Related tools


SEO Preflight Workspace

Run page launch checks for title, description, canonical, robots, sitemap, and hreflang in one workspace.

Meta & Social Preview

Plan meta title, description, Open Graph, and social card copy together, then hand off one clean tag bundle.