robots.txt & Meta Robots Builder + Tester
Prevent catastrophic noindex/disallow mistakes. Build valid robots.txt, test paths against Googlebot/Bingbot, and generate per-page meta robots tags.
robots.txt Builder
Add UA Block
Governance Presets
Wildcards (*) and end anchors ($) are supported in the tester. Without wildcards, paths use prefix matching.
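To give a sense of what the builder assembles, here is a minimal TypeScript sketch that turns user-agent blocks plus an optional Sitemap line into robots.txt text. The UaBlock shape and the buildRobotsTxt name are illustrative assumptions, not the tool's internal API.

```typescript
// Minimal sketch of assembling robots.txt text from user-agent blocks.
// Types and names here are illustrative, not the tool's actual code.
interface UaBlock {
  userAgents: string[]; // e.g. ["*"] or ["Googlebot"]
  allow: string[];      // path patterns
  disallow: string[];   // path patterns
}

function buildRobotsTxt(blocks: UaBlock[], sitemapUrl?: string): string {
  const lines: string[] = [];
  for (const block of blocks) {
    block.userAgents.forEach(ua => lines.push(`User-agent: ${ua}`));
    block.allow.forEach(p => lines.push(`Allow: ${p}`));
    block.disallow.forEach(p => lines.push(`Disallow: ${p}`));
    lines.push(""); // blank line between groups
  }
  if (sitemapUrl) lines.push(`Sitemap: ${sitemapUrl}`);
  return lines.join("\n");
}

// Example: a default group that blocks /admin/, plus a Sitemap line.
console.log(buildRobotsTxt(
  [{ userAgents: ["*"], allow: [], disallow: ["/admin/"] }],
  "https://example.com/sitemap.xml"
));
```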
Rule Tester
Wildcards (*) and end anchors ($) supported. Otherwise prefix match.
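The test itself can be sketched as below, assuming the behaviour described under Notes & Acceptance: the longest matching pattern wins, ties favor Allow, and * plus a trailing $ are the only special characters. The Rule type and effectiveRule name are assumptions for illustration, not the tool's exact code.

```typescript
// Sketch of the longest-match test described on this page.
type Rule = { type: "allow" | "disallow"; pattern: string };

function patternToRegex(pattern: string): RegExp {
  // Escape regex metacharacters, then honour * (any chars) and a trailing $.
  const anchored = pattern.endsWith("$");
  const body = (anchored ? pattern.slice(0, -1) : pattern)
    .replace(/[.+?^${}()|[\]\\]/g, "\\$&")
    .replace(/\*/g, ".*");
  return new RegExp("^" + body + (anchored ? "$" : "")); // prefix match unless $-anchored
}

function effectiveRule(rules: Rule[], path: string): Rule | null {
  // Longest matching pattern wins; on a tie, Allow beats Disallow.
  let best: Rule | null = null;
  for (const rule of rules) {
    if (!patternToRegex(rule.pattern).test(path)) continue;
    if (
      best === null ||
      rule.pattern.length > best.pattern.length ||
      (rule.pattern.length === best.pattern.length && rule.type === "allow")
    ) {
      best = rule;
    }
  }
  return best; // null means no rule matches, i.e. the path is not blocked
}

// "/blog/page.html" matches Disallow: /blog/*.html$ (longer than Allow: /blog/) -> blocked.
effectiveRule(
  [{ type: "allow", pattern: "/blog/" }, { type: "disallow", pattern: "/blog/*.html$" }],
  "/blog/page.html"
);
```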
Meta Robots Builder
Use <meta name="robots"> for page-level control. For per-bot control (e.g., Googlebot), you can also use name="googlebot" with the same content.
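Generating the tag itself is small enough to sketch in a few lines; metaRobotsTag is a hypothetical helper for illustration, not the tool's API.

```typescript
// Illustrative helper for the per-page tag; the name is an assumption.
function metaRobotsTag(directives: string[], botName = "robots"): string {
  // e.g. metaRobotsTag(["noindex", "nofollow"]) ->
  // <meta name="robots" content="noindex, nofollow">
  return `<meta name="${botName}" content="${directives.join(", ")}">`;
}

metaRobotsTag(["index", "follow"]);                  // page-level default
metaRobotsTag(["noindex", "nofollow"], "googlebot"); // per-bot override
```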
Notes & Acceptance
- Tester shows the first effective rule using a longest-match strategy; ties favor Allow.
- Supports wildcard (*) and end-anchor ($) patterns (Google extensions).
- Warns if Disallow: / appears under User-agent: *.
- Optional Sitemap: https://example.com/sitemap.xml line included at the end.
- Everything runs locally in your browser.
Most people will use these tools for free and never think about how much it costs to keep them online. That’s okay — we built Postly to help. But if our tools have saved you hours of work, helped you get better reach, or made client work easier, please pause for a moment and consider supporting us. Even a small contribution genuinely helps keep Postly alive, maintained, and improving. We are a tiny team, not a big company. We pay real infrastructure bills every month to keep this running. If you can, please be one of the few people who chooses to help.
If you’re not in a position to support us, that’s okay. Postly will still be here for you. But if you are one of the few who can chip in, your support directly keeps these tools online for everyone else.