You run Lighthouse, get a 95, and move on. Then three months later your organic traffic dips and you can't figure out why.
The problem is that Lighthouse is a lab test. It runs on your machine, with your hardware, on your network, with a clean cache. It measures what your site could do under ideal conditions. Google doesn't rank on ideal conditions.
Google ranks on CrUX data. The Chrome User Experience Report collects real performance metrics from actual Chrome users who visit your site. Their phones. Their cell connections. Their crowded browser tabs. That's the data that feeds the Core Web Vitals ranking signal.
The metrics that matter
CrUX reports five numbers at the 75th percentile, meaning 75 percent of real visits did at least this well:
LCP (Largest Contentful Paint) measures when the main visual content finishes loading. Google wants this under 2.5 seconds. On a 2024 mid-range Android phone on a 4G connection, that's tight. Your hero image, your web font, your render-blocking CSS all contribute.
INP (Interaction to Next Paint) replaced FID in March 2024 and measures responsiveness across the entire page session, not just the first input. Every tap, every click, every keystroke that triggers a handler counts; scrolling does not. Under 200ms is good. Over 500ms is poor. Most sites have never measured this.
CLS (Cumulative Layout Shift) tracks visual stability. Every time an element jumps because an ad loaded or a font swapped or an image rendered without dimensions, that's a layout shift. Under 0.1 is good. Sites with lazy-loaded ads above the fold routinely fail this.
FCP (First Contentful Paint) is when the first text or image pixel appears. Under 1.8 seconds is good. This one catches render-blocking resources that delay the entire paint pipeline.
TTFB (Time to First Byte) measures server response time. Under 800ms is good. This is where slow hosting, missing edge caching, and unoptimized server-side rendering show up.
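The thresholds above can be sketched as a small classifier. This is an illustrative helper, not part of any official tooling; the "good" and "poor" cutoffs come from web.dev/vitals, with the band between them rated "needs improvement":

```javascript
// Core Web Vitals thresholds per web.dev/vitals.
// Values at or below `good` are good; above `poor` is poor;
// anything in between is "needs improvement".
const THRESHOLDS = {
  lcp:  { good: 2500, poor: 4000 },  // milliseconds
  inp:  { good: 200,  poor: 500  },  // milliseconds
  cls:  { good: 0.1,  poor: 0.25 },  // unitless score
  fcp:  { good: 1800, poor: 3000 },  // milliseconds
  ttfb: { good: 800,  poor: 1800 },  // milliseconds
};

function rate(metric, p75) {
  const t = THRESHOLDS[metric];
  if (!t) throw new Error(`unknown metric: ${metric}`);
  if (p75 <= t.good) return "good";
  if (p75 <= t.poor) return "needs improvement";
  return "poor";
}
```

So a field LCP of 3.4s rates "needs improvement", and a CLS of 0.35 rates "poor".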
Why lab and field diverge
A Lighthouse run on a MacBook Pro with a wired connection tells you almost nothing about what a user on a Pixel 6a on T-Mobile in rural Texas experiences. Common divergence patterns:
Your lab LCP is 1.2s because your machine has the font cached. Field LCP is 3.4s because real users hit cold caches and your Google Fonts stylesheet blocks rendering.
Your lab INP is unmeasurable because Lighthouse doesn't interact with your page at all; it approximates responsiveness with Total Blocking Time. Field INP is 380ms because your click handler triggers a re-render that blocks the main thread for 200ms.
Your lab CLS is 0.02 because ads don't load in Lighthouse. Field CLS is 0.35 because your ad network injects a 250px banner after the page paints.
What the tool does
The CrUX Field Data Probe queries the PageSpeed Insights API for your URL's real-user performance data. No API key needed. You get the actual p75 numbers Google uses, color-coded against the Core Web Vitals thresholds, alongside the Lighthouse lab scores for direct comparison.
When there's a gap between lab and field, the tool flags it. That gap is where your ranking signal lives.
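A minimal version of that probe can be sketched against the PSI v5 endpoint. The metric key names and the CLS-times-100 convention below follow the PSI API's documented response shape, but verify them against the current docs before relying on them; note that `probe` makes a live network call:

```javascript
// Sketch of a field-data probe against the PageSpeed Insights v5 API.
// No API key is required for light usage.
const PSI = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed";

// Pull the p75 field metrics out of a PSI response.
// Returns null for any metric CrUX has no data for.
function extractFieldP75(psiResponse) {
  const m = (psiResponse.loadingExperience || {}).metrics || {};
  const p = (key) => (m[key] ? m[key].percentile : null);
  const clsRaw = p("CUMULATIVE_LAYOUT_SHIFT_SCORE");
  return {
    lcpMs: p("LARGEST_CONTENTFUL_PAINT_MS"),
    inpMs: p("INTERACTION_TO_NEXT_PAINT"),
    // PSI reports CLS multiplied by 100, so 35 means a score of 0.35.
    cls: clsRaw != null ? clsRaw / 100 : null,
    fcpMs: p("FIRST_CONTENTFUL_PAINT_MS"),
    ttfbMs: p("EXPERIMENTAL_TIME_TO_FIRST_BYTE"),
  };
}

async function probe(url) {
  const res = await fetch(`${PSI}?url=${encodeURIComponent(url)}`);
  return extractFieldP75(await res.json());
}
```

Comparing those numbers against a Lighthouse run of the same URL is what surfaces the lab-versus-field gap.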
What to do with the numbers
If your field LCP is failing but lab passes, look at font loading strategy, image optimization, and server response time under real load. If INP is failing, profile your JavaScript event handlers for main-thread blocking. If CLS is failing, check for dynamically injected content without reserved space.
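For the failing-INP case, the usual fix after profiling is to break long handlers into chunks that yield back to the main thread so pending input and paint can run. A sketch of that pattern (the chunk size of 50 is an arbitrary illustration; `setTimeout(0)` is the portable yield point, and newer browsers offer `scheduler.yield()` for the same purpose):

```javascript
// Resolve after handing control back to the event loop.
const yieldToMain = () => new Promise((resolve) => setTimeout(resolve, 0));

// Process items in chunks, yielding between chunks so a tap or
// keystroke that arrives mid-work can be handled and painted.
async function processInChunks(items, handleItem, chunkSize = 50) {
  const results = [];
  for (let i = 0; i < items.length; i += chunkSize) {
    for (const item of items.slice(i, i + chunkSize)) {
      results.push(handleItem(item));
    }
    await yieldToMain();
  }
  return results;
}
```

The work still completes, but no single task monopolizes the main thread long enough to push an interaction past the 200ms budget.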
The point isn't to chase a number. It's to know which number you're chasing. Lighthouse gives you a synthetic benchmark. CrUX gives you the measurement Google actually uses to decide whether your page is fast enough to rank.
If you're building a site from scratch and want to get these fundamentals right from day one, The $97 Launch covers the performance baseline alongside the business fundamentals.
Fact-check notes and sources
- CrUX data collection methodology: Chrome User Experience Report documentation
- INP replaced FID as a Core Web Vital in March 2024: web.dev announcement
- Core Web Vitals thresholds (LCP 2.5s, INP 200ms, CLS 0.1): web.dev/vitals
- PageSpeed Insights API serves CrUX origin and URL-level data: PSI API documentation
Related reading
- Lab vs field: closing the Core Web Vitals gap
- Why the INP attribution tool exists
- Image LCP candidate pitfalls
- Font loading strategy and its LCP impact
This post is informational, not SEO-consulting advice. Mentions of Google, Chrome, Lighthouse, and PageSpeed Insights are nominative fair use. No affiliation is implied.