Lab metrics lie. CrUX is ground truth, but it reflects a trailing 28-day window. The only layer between lab and CrUX is RUM — Real User Monitoring — where every pageview sends its measured vitals to your own collector.
RUM tools cost money. Sentry Performance is $29+/mo. Raygun is $89+/mo. Calibre is $125+/mo. Even Cloudflare's RUM requires their paid tier.
There's a cheap alternative nobody builds: just don't send the data. Store it in the user's browser. If you want aggregates later, export from localStorage.
The RUM Beacon Snippet Generator produces exactly that — a self-contained JS snippet you paste into your site's `<head>`. It captures Core Web Vitals on every pageview, writes them to localStorage, and comes paired with an import-JSON dashboard.
What the snippet does
On every pageview:
- Checks sample rate (default 1.0 = every view; set 0.1 for 10% sample on high-traffic sites)
- Attaches `PerformanceObserver` listeners for:
  - `largest-contentful-paint` (LCP)
  - `first-input` (FID — deprecated, but still sent by some browsers)
  - `event` (INP — the input responsiveness metric that replaced FID)
  - `layout-shift` (CLS — cumulatively summed, excluding input-caused shifts)
  - `paint` (FCP — first contentful paint)
  - `navigation` timing (TTFB — time to first byte)
- When the page is about to unload (`visibilitychange` to `hidden`, plus `pagehide`), writes the collected entry to `localStorage[key]` as the first item in a capped array (default max 500).
- Exposes `window.jwRumRead()` and `window.jwRumClear()` for debugging.
Size: about 2 KB, unminified, no dependencies.
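The behavior above can be sketched in a few lines. This is an assumed structure, not the generated snippet verbatim — the `pushCapped` helper is hypothetical, and only two of the observers are shown; browser APIs are feature-checked so the sketch degrades gracefully:

```javascript
// Sketch of the beacon's core loop, assuming the generator's defaults.
const KEY = 'jw-rum-data'; // localStorage key from the generator
const MAX = 500;           // capped-array size
const vitals = {
  lcp: null,
  cls: 0,
  url: typeof location !== 'undefined' ? location.pathname : '/',
};

// Pure helper: prepend the new entry, trim to the cap (newest first).
function pushCapped(arr, entry, max) {
  return [entry, ...arr].slice(0, max);
}

if (typeof PerformanceObserver !== 'undefined') {
  // LCP: keep the latest candidate; the final value is the last one seen.
  new PerformanceObserver((list) => {
    const entries = list.getEntries();
    vitals.lcp = entries[entries.length - 1].startTime;
  }).observe({ type: 'largest-contentful-paint', buffered: true });

  // CLS: sum shifts that weren't caused by recent user input.
  new PerformanceObserver((list) => {
    for (const e of list.getEntries()) {
      if (!e.hadRecentInput) vitals.cls += e.value;
    }
  }).observe({ type: 'layout-shift', buffered: true });
}

// Flush on hide/unload: read the array, prepend, write back.
function flush() {
  const prev = JSON.parse(localStorage.getItem(KEY) || '[]');
  const next = pushCapped(prev, { ...vitals, ts: Date.now() }, MAX);
  localStorage.setItem(KEY, JSON.stringify(next));
}
if (typeof document !== 'undefined') {
  document.addEventListener('visibilitychange', () => {
    if (document.visibilityState === 'hidden') flush();
  });
  addEventListener('pagehide', flush);
}
```

`visibilitychange` fires reliably on mobile where `beforeunload` does not, which is why the flush hooks both it and `pagehide`.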
Why localStorage instead of a server endpoint
The obvious architecture for RUM is: collect in browser, send to server, aggregate in a database, show a dashboard. That works but requires:
- A server endpoint (Netlify Function, etc.)
- A datastore (Postgres, DynamoDB, KV, something)
- A dashboard UI
- Infrastructure cost + maintenance
For a solo site or a small team, this is overkill. Instead: the user's own browser stores the user's own session data. No data leaves their machine. No cookies, no fingerprinting, no GDPR exposure.
The trade-off: each user sees only THEIR own history. You as the site owner don't see aggregated data unless users export and share.
Solution: make it easy to share. Add an "Export my RUM" button to a hidden diagnostic page on your site. Users paste the JSON into the dashboard. You aggregate across manually-submitted samples from power users.
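One way to wire that export button on the diagnostic page — the element ids are assumptions, and `jwRumRead()` is the debug hook the snippet exposes on `window`:

```javascript
// Sketch: "Export my RUM" button for a hidden diagnostics page (assumed markup:
// a <button id="export-rum"> and a <textarea id="export-rum-out">).

// Pure helper, so the payload is easy to test and easy to paste into the dashboard.
function exportPayload(entries) {
  return JSON.stringify(entries ?? [], null, 2);
}

if (typeof document !== 'undefined') {
  const btn = document.getElementById('export-rum');
  const out = document.getElementById('export-rum-out');
  btn?.addEventListener('click', () => {
    out.value = exportPayload(window.jwRumRead());
    out.select(); // ready for Ctrl+C; or use navigator.clipboard.writeText(out.value)
  });
}
```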
For most sites, 10-30 power-user samples per week is enough to spot trends. You don't need a million samples to know your LCP p75 is 3.2s on mobile.
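The p75 figure quoted above is just the 75th-percentile sample. A sketch of how a dashboard might compute it — nearest-rank method, which is an assumption; the actual tool may interpolate:

```javascript
// 75th percentile by nearest-rank: sort ascending, take the value at
// index ceil(0.75 * n) - 1. Null/undefined samples (e.g. missing INP) are dropped.
function p75(values) {
  const xs = values.filter((v) => v != null).sort((a, b) => a - b);
  if (xs.length === 0) return null;
  return xs[Math.ceil(0.75 * xs.length) - 1];
}
```

With only a handful of samples, p75 is lumpy — it jumps between observed values — but it still moves in the right direction when the site regresses.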
How to use it
- Go to /tools/rum-beacon-generator/
- Customize the localStorage key (default `jw-rum-data` — change it to something site-specific to avoid collisions)
- Set max entries (default 500 — older entries drop off to stay under the localStorage quota)
- Set the sample rate (default 1.0 — every pageview; set 0.1 or lower for high-traffic sites)
- Click Run. The tool emits a self-contained `<script>` block.
- Copy the snippet and paste it into your site's `<head>`, right after the opening tag.
- Deploy. Pageviews start populating localStorage immediately.
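The sample-rate check is the first thing the snippet runs on each pageview. A sketch, with the gate split into a pure function so it is testable with a fixed random draw (the `SAMPLE_RATE` constant mirrors the generator setting):

```javascript
// Decide once per pageview whether this view is sampled.
const SAMPLE_RATE = 0.1; // 10% of pageviews; 1.0 records every view

function shouldSample(rate, draw = Math.random()) {
  return draw < rate;
}

// If not sampled, the real snippet returns early:
// no observers attached, no localStorage writes.
const sampled = shouldSample(SAMPLE_RATE);
```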
The companion dashboard
The same tool page includes an import-JSON dashboard:
- Paste in the JSON array from `window.jwRumRead()` (or from an export button you add to your site)
- The tool aggregates p75 LCP / INP / CLS / FCP / TTFB, color-coded against Google's thresholds
- Shows top 15 URLs by sample count with per-URL p75
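The per-URL rollup can be sketched like this, assuming each stored entry carries a `url` and an `lcp` field (nearest-rank p75; the real dashboard's field names may differ):

```javascript
// Group entries by URL, then report sample count and LCP p75 per URL,
// sorted by sample count descending (the dashboard shows the top 15).
function perUrlP75(entries, topN = 15) {
  const byUrl = new Map();
  for (const e of entries) {
    const g = byUrl.get(e.url) ?? { count: 0, lcps: [] };
    g.count += 1;
    if (e.lcp != null) g.lcps.push(e.lcp); // skip entries with no LCP sample
    byUrl.set(e.url, g);
  }
  const rows = [...byUrl].map(([url, g]) => {
    g.lcps.sort((a, b) => a - b);
    const p75 = g.lcps.length ? g.lcps[Math.ceil(0.75 * g.lcps.length) - 1] : null;
    return { url, count: g.count, p75 };
  });
  return rows.sort((a, b) => b.count - a.count).slice(0, topN);
}
```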
Typical workflow:
- Instrument your site (paste snippet)
- Wait a few days for data to accumulate
- On one of your own browsers that has the data, open the DevTools console and run `copy(JSON.stringify(jwRumRead()))` — this copies the entire history to the clipboard
- Open the dashboard, paste into the text area, and click View dashboard
You see your actual LCP p75 from your own browsing. If you browse like a typical user, this is the most accurate RUM number you can get.
Privacy notes
The snippet collects:
- Pathname + query string (not full URL — avoids PII in URL params)
- User-Agent (first 120 chars — identifies browser + device, helpful for per-device breakdown)
- Timestamp
- Core Web Vitals numbers
It does NOT collect:
- User identity, IP, cookies
- Referrer
- Page content, form values, or click paths
- Any cross-domain data
All data stays in the visitor's own localStorage. No fetch, no beacon, no network request is made by the RUM snippet. You can verify this by watching the Network panel — the snippet never hits an endpoint.
For GDPR / CCPA — this is effectively "necessary for analytics" data stored locally and never transmitted. Document it in your privacy policy as local-only telemetry.
How it pairs with the rest of the stack
- CrUX Field Data Probe — Google's CrUX data. Ground truth, but it trails by its 28-day window and requires enough site traffic to be included.
- Mega SEO Analyzer v2 — lab metrics (what Lighthouse would show).
- RUM (this tool) — real-user current data from your own browsing / opted-in users.
The three complementary data sources: CrUX tells you what Google sees, RUM tells you what you see, Mega SEO tells you what's theoretically possible. Fix the worst gap between them.
When to upgrade to paid RUM
The localStorage RUM works great at small scale. Upgrade to paid when:
- You need aggregated cross-user data without asking users to export
- You want alerting on RUM regressions
- You need INP attribution at the element level (which user action caused the slow INP?)
- You need session replay on slow pageviews
- You're serving enough traffic that opt-in export becomes a bottleneck
Sentry, Raygun, DebugBear, SpeedCurve, Calibre — all solid paid RUM options. Pick based on feature fit.
For the 90% case — "am I getting better or worse over time" — this tool is enough.
Limitations
- No aggregation by default. Each user sees only their own data.
- localStorage cap ~5-10 MB. 500 entries with full payload is well under cap. Drop max if you see issues.
- INP support varies. Chrome + Edge report INP. Safari + Firefox are catching up. Metric will be null on older browsers.
- No attribution. The tool records the metric value, not what caused it. For element-level attribution, use INP Attribution or a paid tool.
- User must visit the export page on your site to extract data. Or you check their browser yourself.
Related reading
- Rendered DOM Paste Audit — paired release
- CrUX Field Data Probe walkthrough
- Every new performance audit tool
Fact-check notes and sources
- Web Vitals metric definitions: web.dev/vitals.
- PerformanceObserver API: MDN PerformanceObserver.
- INP replaced FID: Google March 2024 transition — Chrome INP docs.
- Page Visibility API for beacon flush: MDN Page Visibility.
- RUM vendor pricing references: Sentry Performance, Raygun, Calibre as of 2026-04.
This post is informational, not engineering or privacy advice. Mentions of Sentry, Raygun, SpeedCurve, Calibre, DebugBear, Cloudflare, Google, and similar products are nominative fair use. No affiliation is implied. Consult a qualified privacy officer for jurisdiction-specific compliance decisions when deploying any analytics snippet.