How do I check my robots.txt for AI bots?

Use /tools/robots-txt-checker. It reports each AI crawler as allowed or blocked and gives you a paste-ready replacement file.

Last updated 2 May 2026

Many sites block AI crawlers by accident - default robots.txt templates often ship a blanket "User-agent: *" group with "Disallow: /", and because GPTBot has no group of its own, that rule catches it too.
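You can see the effect with Python's standard-library robots.txt parser, which applies the rules the same way most crawlers do. This is a minimal sketch - the rules and URLs below are an illustrative template default, not your site's actual file:

```python
# Sketch: why a blanket disallow blocks AI crawlers.
# The rules here are a hypothetical template default, not a real site's file.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# GPTBot has no User-agent group of its own, so it falls under
# "User-agent: *" and is blocked from the whole site.
print(parser.can_fetch("GPTBot", "https://example.com/"))     # False
print(parser.can_fetch("Googlebot", "https://example.com/"))  # False
```

Adding an explicit "User-agent: GPTBot" group with "Allow: /" above the wildcard group is what flips that first check to True.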

  1. Open /tools/robots-txt-checker.
  2. Enter your domain - we fetch /robots.txt and analyse it.
  3. The result shows every named AI crawler (14 total) as ALLOWED or BLOCKED.
  4. Below the audit, copy the recommended robots.txt - it has an explicit User-agent group for every AI crawler, sitemap and llms.txt references, and a tight Disallow for admin paths.
  5. Replace your existing /robots.txt and re-run the audit to confirm.
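The recommended file follows this general shape (a hedged sketch - the checker's actual output may differ, and the crawler names, paths, and URLs here are illustrative):

```
# Explicit groups so AI crawlers are not caught by the wildcard rule
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

# Everyone else: allow the site, keep admin paths private
User-agent: *
Disallow: /admin/

Sitemap: https://example.com/sitemap.xml
# llms.txt: https://example.com/llms.txt
```

Order matters less than specificity: crawlers pick the most specific User-agent group that matches them, so the named groups take precedence over the wildcard.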

