Many sites block AI crawlers by accident - default robots.txt templates often ship a blanket `User-agent: *` group with `Disallow: /`, which catches GPTBot along with everything else.
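For illustration, this is the shape of the template rule that causes the problem (the exact file varies by CMS, so treat this as a representative sketch):

```
# A common default template. This single group applies to ALL bots,
# including GPTBot, ClaudeBot, and every other AI crawler.
User-agent: *
Disallow: /
```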
- Open /tools/robots-txt-checker.
- Enter your domain - we fetch your /robots.txt and analyse it (a sketch of the same check appears after this list).
- The result shows every named AI crawler (14 total) as ALLOWED or BLOCKED.
- Below the audit, copy the recommended robots.txt - explicit User-agent groups for every AI crawler, sitemap and llms.txt references, and a tight admin Disallow (see the example block after this list).
- Replace your existing /robots.txt and re-run the audit to confirm.
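If you want to reproduce the allowed/blocked check locally, here is a minimal sketch using Python's standard `urllib.robotparser`. The crawler list below is an illustrative sample, not the tool's full set of 14, and the output format is an assumption:

```python
from urllib import robotparser

# Illustrative sample of AI crawler user-agents. The checker audits 14;
# this short list is an assumption for the sketch, not the tool's exact set.
AI_CRAWLERS = ["GPTBot", "ClaudeBot", "Google-Extended", "PerplexityBot", "CCBot"]

def audit(domain: str) -> None:
    """Fetch https://<domain>/robots.txt and report each crawler's status."""
    rp = robotparser.RobotFileParser()
    rp.set_url(f"https://{domain}/robots.txt")
    rp.read()  # fetches and parses the file
    for agent in AI_CRAWLERS:
        allowed = rp.can_fetch(agent, f"https://{domain}/")
        print(f"{agent:20} {'ALLOWED' if allowed else 'BLOCKED'}")

audit("example.com")
```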
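The recommended file follows this general shape. This is an abbreviated sketch with placeholder URLs - copy the actual file from the tool, which covers all 14 crawlers. Note that robots.txt has no standard llms.txt directive, so the reference here is expressed as a comment, which is an assumption about how the tool emits it:

```
# Explicit groups for AI crawlers (abbreviated - the tool lists all 14)
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

# Everyone else: crawl the site, but keep admin paths off-limits
User-agent: *
Disallow: /admin/
Disallow: /wp-admin/

Sitemap: https://example.com/sitemap.xml
# llms.txt: https://example.com/llms.txt (non-standard, informational reference)
```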
Open the robots.txt checker at /tools/robots-txt-checker to run the audit and grab the recommended block.