Three layers decide whether an AI bot actually reaches your content: (1) robots.txt / ai.txt directives, (2) per-page meta robots tags and X-Robots-Tag headers, (3) CDN bot-protection rules that silently return 403 challenge pages to non-browser user agents. This tool audits all three for 16 major AI crawlers. Related: AI Posture Audit (a broader discovery-surface audit), llms.txt Validator.
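The first two layers can be checked offline. A minimal sketch, assuming stdlib Python only: layer 1 uses `urllib.robotparser` against a robots.txt body, and layer 2 inspects an `X-Robots-Tag` response header. GPTBot and ClaudeBot are real AI crawler user agents, but the robots.txt content and the URL here are illustrative examples, not output of the tool itself. Layer 3 (CDN challenge pages) cannot be detected this way; it requires a live fetch with a bot user agent and a check for an unexpected 403.

```python
from urllib.robotparser import RobotFileParser

# Illustrative robots.txt: blocks one AI crawler, allows everyone else.
ROBOTS_TXT = """\
User-agent: GPTBot
Disallow: /

User-agent: *
Allow: /
"""

def allowed(ua: str, url: str, robots: str = ROBOTS_TXT) -> bool:
    """Layer 1: does robots.txt permit this user agent to fetch the URL?"""
    rp = RobotFileParser()
    rp.parse(robots.splitlines())
    return rp.can_fetch(ua, url)

def header_blocked(headers: dict) -> bool:
    """Layer 2: does an X-Robots-Tag header forbid indexing?"""
    tag = headers.get("X-Robots-Tag", "").lower()
    return "noindex" in tag or "none" in tag

print(allowed("GPTBot", "https://example.com/post"))     # blocked by robots.txt
print(allowed("ClaudeBot", "https://example.com/post"))  # falls through to *
print(header_blocked({"X-Robots-Tag": "noindex, nofollow"}))
```

Note that a page can pass layer 1 yet still be invisible: a `noindex` in `X-Robots-Tag` or a CDN 403 overrides an `Allow` rule in practice, which is why all three layers have to be audited together.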