Googlebot renders JavaScript. AI crawlers mostly don't.
GPTBot (OpenAI), ClaudeBot (Anthropic), PerplexityBot, and CCBot (Common Crawl) fetch your raw HTML, parse it as a string, and move on. If you're running a client-side React or Vue app without server-side rendering, your raw HTML is a 1-2 KB shell that says "loading…" — and that's all the AI crawler sees.
The Render-Block Audit fetches your URL's raw HTML and scores it for what a non-JS crawler would actually extract.
What it checks
- Word count in raw `<body>`. Under 200 = critical (SPA shell only). 200-500 = thin. 500+ = healthy.
- H1 in raw HTML. Missing = crawlers have no title anchor. Multiple = pick one canonical.
- JS framework fingerprint. React / Next / Nuxt / Vue / Angular / Svelte / Gatsby / Remix / Astro signatures in the head + SPA containers (`#__next`, `#app`, `[data-reactroot]`).
- Render-blocking `<script>` in `<head>`. Any non-async, non-defer script is a bottleneck.
- Stylesheet `<link>` count in `<head>`. 1-2 is fine; 3-4 is acceptable; 5+ delays first paint.
- Inline CSS + JS in `<head>`. Under 20 KB is clean; 20-60 KB is heavy; 60+ KB is pushing rendering downstream.
- Title + meta description present in raw HTML. Both should be server-rendered; client-injected versions are invisible to non-JS crawlers.
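The core raw-HTML checks above can be sketched in a few lines of Python. This is a minimal illustration, not the audit's real implementation: the function name, return shape, and regex-based parsing are assumptions, and the thresholds are the ones stated in the list.

```python
# Illustrative sketch of the raw-HTML checks, stdlib only. Operates on a
# fetched HTML string the way a non-JS crawler would: no rendering.
import re

def audit_raw_html(html):
    """Score a raw HTML string for what a non-JS crawler would extract."""
    body = re.search(r"<body[^>]*>(.*?)</body>", html, re.S | re.I)
    # Strip scripts, styles, and tags before counting visible words.
    body_text = re.sub(r"<script.*?</script>|<style.*?</style>|<[^>]+>", " ",
                       body.group(1) if body else "", flags=re.S | re.I)
    words = len(body_text.split())
    return {
        "word_count": words,
        # Under 200 = SPA shell, 200-500 = thin, 500+ = healthy
        "body_verdict": "critical" if words < 200 else
                        "thin" if words < 500 else "healthy",
        "h1_count": len(re.findall(r"<h1[\s>]", html, re.I)),
        "has_title": bool(re.search(r"<title>\s*\S", html, re.I)),
        "has_meta_description": bool(
            re.search(r'<meta[^>]+name=["\']description["\']', html, re.I)),
    }

shell = "<html><head></head><body><div id='root'>Loading…</div></body></html>"
print(audit_raw_html(shell)["body_verdict"])  # an SPA shell scores "critical"
```

A real audit would use an HTML parser rather than regexes, but the scoring logic is the same: everything is judged on the bytes the crawler fetched, not on a rendered DOM.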
What to do if the audit fails
If a JS framework is detected and the raw HTML is thin:
- Static export where possible. Next.js `output: 'export'`, Astro static mode, Gatsby build.
- SSR for critical routes. Next.js `getServerSideProps`, Remix loaders, Nuxt `asyncData`. Non-critical marketing pages can stay static.
- Pre-rendering as a middle ground. Tools like `react-snap`, Rendertron, or Cloudflare's HTML Rewriter inject server HTML for bot UAs.
- Hybrid. Render critical content (title, meta, headings, first paragraph) server-side; hydrate the rest client-side.
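The pre-rendering option hinges on one decision: does this request get server HTML or the SPA shell? A minimal sketch of that gate, assuming a simple user-agent substring match (the UA tokens are real crawler names; the routing function itself is illustrative):

```python
# Serve pre-rendered HTML to known crawler UAs; browsers get the SPA.
# Token list is illustrative and incomplete; real middleware should also
# verify crawler IPs where the vendor publishes ranges.
BOT_UA_TOKENS = ("GPTBot", "ClaudeBot", "PerplexityBot", "CCBot", "Googlebot")

def wants_prerendered(user_agent):
    """True if the request should get server HTML instead of the JS shell."""
    ua = user_agent.lower()
    return any(token.lower() in ua for token in BOT_UA_TOKENS)

print(wants_prerendered("Mozilla/5.0 (compatible; GPTBot/1.2)"))   # True
print(wants_prerendered("Mozilla/5.0 (Macintosh) Chrome/120.0"))   # False
```

Whatever serves the pre-rendered copy (Rendertron, an edge worker, a static snapshot) sits behind this check; the browser path is untouched.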
For render-blocking scripts:
- Add `async` for analytics (doesn't depend on the DOM).
- Add `defer` for scripts that need the DOM but don't block render.
- Move non-critical scripts to just before `</body>`.
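The script rule reduces to a mechanical check: any `<script>` with a `src` in `<head>` that carries neither `async` nor `defer` is render-blocking. A rough sketch, assuming regex parsing for brevity (a real audit would use an HTML parser):

```python
# Flag render-blocking scripts in <head>: external (src=) scripts with
# no async/defer attribute. Illustrative regex version.
import re

def blocking_head_scripts(html):
    head = re.search(r"<head[^>]*>(.*?)</head>", html, re.S | re.I)
    if not head:
        return []
    scripts = re.findall(r"<script\b[^>]*>", head.group(1), re.I)
    return [s for s in scripts
            if "src=" in s.lower()
            and not re.search(r"\basync\b|\bdefer\b", s, re.I)]

sample_head = """<head>
  <script src="/analytics.js" async></script>
  <script src="/app.js"></script>
  <script src="/widget.js" defer></script>
</head>"""
print(blocking_head_scripts(sample_head))  # only the /app.js tag is flagged
```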
For head-heavy CSS:
- Inline critical CSS (above-the-fold rules only).
- Defer the rest via `<link rel="stylesheet" media="print" onload="this.media='all'">` or equivalent.
- Combine stylesheets for fewer round trips.
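The stylesheet-count rule from the checklist (1-2 fine, 3-4 acceptable, 5+ delays first paint) is equally mechanical. A small sketch, with the function name and verdict labels as assumptions:

```python
# Count stylesheet <link>s in head HTML and map the count to a verdict.
# Thresholds are the ones stated in the audit checklist.
import re

def stylesheet_verdict(head_html):
    links = re.findall(r'<link[^>]+rel=["\']stylesheet["\']', head_html, re.I)
    n = len(links)
    return "fine" if n <= 2 else "acceptable" if n <= 4 else "heavy"

print(stylesheet_verdict('<link rel="stylesheet" href="a.css">' * 5))  # heavy
```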
The AI fix prompt
The fix prompt emits a prioritized migration plan ordered by CTR impact: title/meta first (easiest to fix, biggest SERP effect), content next, performance last.
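That ordering is just a stable sort over the audit's findings by category weight. A sketch of the idea, where the category labels, weights, and finding strings are all hypothetical, not the tool's actual output:

```python
# Order findings by CTR impact: title/meta, then content, then performance.
# Category names and weights are illustrative assumptions.
IMPACT = {"title_meta": 0, "content": 1, "performance": 2}

def prioritize(findings):
    """findings: (category, description) pairs -> descriptions by impact."""
    return [desc for cat, desc in sorted(findings, key=lambda f: IMPACT[f[0]])]

plan = prioritize([
    ("performance", "defer render-blocking app.js"),
    ("title_meta", "server-render <title> and meta description"),
    ("content", "SSR the H1 and first paragraph"),
])
print(plan[0])  # the title/meta fix comes first
```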
Related reading
- Core Web Vitals Audit — page-weight + LCP / CLS / INP
- CWV Fix Generator — emits patches
- Mobile Parity — desktop vs Android Chrome UA diff
- AI Crawler Access Auditor — checks allow/block before worrying about render
Fact-check notes and sources
- Googlebot renders JavaScript: developers.google.com/search/docs/crawling-indexing/javascript/javascript-seo-basics
- OpenAI GPTBot technical details: platform.openai.com/docs/bots
- Anthropic ClaudeBot: docs.anthropic.com/en/docs/agents-and-tools
- Core Web Vitals: web.dev/vitals
The $100 Network covers server-rendered static sites as the default — the AI-crawler-readiness baseline. The audit is how you verify it ships.