
Real User Monitoring without a server — RUM that stores to localStorage

Lab metrics lie. CrUX is ground truth, but it's a rolling 28-day window, so it lags. The only thing between lab and CrUX is RUM — Real User Monitoring — where every pageview sends its measured vitals to your own collector.

RUM tools cost money. Sentry Performance is $29+/mo. Raygun is $89+/mo. Calibre is $125+/mo. Even Cloudflare's RUM requires their paid tier.

There's a cheap alternative nobody builds: just don't send the data. Store it in the user's browser. If you want aggregates later, export from localStorage.

The RUM Beacon Snippet Generator produces exactly that: a self-contained JS snippet you paste into your site's <head>. It captures Core Web Vitals on every pageview, writes them to localStorage, and comes paired with an import-JSON dashboard.

What the snippet does

On every pageview:

  1. Checks sample rate (default 1.0 = every view; set 0.1 for 10% sample on high-traffic sites)
  2. Attaches PerformanceObserver listeners for:
    • largest-contentful-paint (LCP)
    • first-input (FID — deprecated in favor of INP, but still reported by some browsers)
    • event (INP — the input responsiveness metric that replaced FID)
    • layout-shift (CLS — cumulatively summed, excluding input-caused shifts)
    • paint (FCP — first contentful paint)
    • navigation (TTFB — time to first byte)
  3. When the page is about to unload (visibilitychange to hidden, with pagehide as a fallback), writes the collected entry to localStorage[key] as the first item in a capped array (default max 500 entries).
  4. Exposes window.jwRumRead() and window.jwRumClear() for debugging.
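The write-once flush in step 3 can be sketched roughly like this. This is a minimal sketch, not the generator's exact output; `flushOnce` and `writeEntry` are illustrative names:

```javascript
// Sketch of the step-3 flush: write exactly once per pageview, on
// whichever of visibilitychange(hidden) or pagehide fires first.
// `flushOnce` is an illustrative helper name, not the tool's API.
let flushed = false;

function flushOnce(write) {
  if (flushed) return false; // already written this pageview
  flushed = true;
  write();
  return true;
}

// In a browser the snippet would wire it up roughly like:
//   document.addEventListener('visibilitychange', () => {
//     if (document.visibilityState === 'hidden') flushOnce(writeEntry);
//   });
//   window.addEventListener('pagehide', () => flushOnce(writeEntry));
```

Listening to both events matters because mobile browsers often skip classic unload events; the guard flag keeps the double subscription from writing twice.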

Size: about 2 KB, unminified, no dependencies.
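The storage format itself is simple: a newest-first JSON array, truncated at the cap. A sketch of the sampling gate and the capped write — assumed behavior, with `writeCapped`/`shouldSample` as illustrative names; the storage object only needs `getItem`/`setItem`, so it runs against a stub outside a browser too:

```javascript
// Capped, newest-first write — the shape of the snippet's step 3.
function writeCapped(storage, key, entry, maxEntries) {
  let list = [];
  try {
    list = JSON.parse(storage.getItem(key)) || [];
  } catch (e) {
    list = []; // corrupt or missing data: start fresh rather than throw
  }
  list.unshift(entry);                             // newest entry first
  list.length = Math.min(list.length, maxEntries); // drop oldest beyond cap
  storage.setItem(key, JSON.stringify(list));
  return list.length;
}

// Sample-rate gate: a rate of 0.1 keeps roughly 10% of pageviews.
function shouldSample(rate) {
  return Math.random() < rate;
}
```

Truncating by assigning `list.length` keeps the write O(cap) and guarantees the array never outgrows the configured maximum.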

Why localStorage instead of a server endpoint

The obvious architecture for RUM is: collect in browser, send to server, aggregate in a database, show a dashboard. That works but requires:

  • A server endpoint (Netlify Function, etc.)
  • A datastore (Postgres, DynamoDB, KV, something)
  • A dashboard UI
  • Infrastructure cost + maintenance

For a solo site or a small team, this is overkill. Instead: the user's own browser stores the user's own session data. No data leaves their machine. No cookies, no fingerprinting, no GDPR exposure.

The trade-off: each user sees only THEIR own history. You as the site owner don't see aggregated data unless users export and share.

Solution: make it easy to share. Add an "Export my RUM" button to a hidden diagnostic page on your site. Users paste the JSON into the dashboard, and you aggregate across manually submitted samples from power users.

For most sites, 10-30 power-user samples per week is enough to spot trends. You don't need a million samples to know your LCP p75 is 3.2s on mobile.
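A p75 is just the value at the 75th percentile of your sorted samples, which is why a few dozen entries already give a usable signal. A sketch using the nearest-rank method — the dashboard's exact percentile method isn't specified, so treat this as an assumption:

```javascript
// p75 of a metric across collected entries, nearest-rank method.
// Null entries (e.g. INP on browsers that don't report it) are skipped.
function p75(values) {
  const sorted = values.filter(v => v != null).sort((a, b) => a - b);
  if (sorted.length === 0) return null;
  const idx = Math.ceil(0.75 * sorted.length) - 1; // nearest-rank index
  return sorted[idx];
}
```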

How to use it

  1. Go to /tools/rum-beacon-generator/
  2. Customize the localStorage key (default jw-rum-data — change to something site-specific to avoid collisions)
  3. Set max entries (default 500 — older entries drop off to stay under localStorage quota)
  4. Set sample rate (default 1.0 — every pageview; set 0.1 or lower for high-traffic sites)
  5. Click Run. The tool emits a self-contained <script> block.
  6. Copy the snippet, paste into your site's <head> right after the opening tag.
  7. Deploy. Pageviews start populating localStorage immediately.

The companion dashboard

The same tool page includes an import-JSON dashboard:

  • Paste the JSON array from window.jwRumRead() (or an export button you add to your site)
  • The tool aggregates p75 for LCP / INP / CLS / FCP / TTFB, color-coded against Google's thresholds
  • Shows top 15 URLs by sample count with per-URL p75
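The per-URL view boils down to grouping entries by URL and counting samples. A sketch, assuming each stored entry carries a `url` field (the field name is illustrative, not the tool's documented schema):

```javascript
// Group entries by URL, count samples, return the busiest URLs first.
function topUrls(entries, limit) {
  const counts = new Map();
  for (const e of entries) {
    counts.set(e.url, (counts.get(e.url) || 0) + 1);
  }
  return [...counts.entries()]
    .sort((a, b) => b[1] - a[1]) // most samples first
    .slice(0, limit)
    .map(([url, samples]) => ({ url, samples }));
}
```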

Typical workflow:

  1. Instrument your site (paste snippet)
  2. Wait a few days for data to accumulate
  3. On one of your own browsers that has the data, open DevTools console, run copy(JSON.stringify(jwRumRead())) — copies entire history to clipboard
  4. Open the dashboard, paste into the text area, click View dashboard

You see your actual LCP p75 from your own browsing. If you browse like a typical user, this is the most accurate RUM number you can get.

Privacy notes

The snippet collects:

  • Pathname + query string (not the full absolute URL)
  • User-Agent (first 120 chars — identifies browser + device, helpful for per-device breakdown)
  • Timestamp
  • Core Web Vitals numbers

It does NOT collect:

  • User identity, IP, cookies
  • Referrer
  • Page content, form values, or click paths
  • Any cross-domain data

All data stays in the visitor's own localStorage. No fetch, no beacon, no network request is made by the RUM snippet. You can verify this by watching the Network panel — the snippet never hits an endpoint.

For GDPR / CCPA — this is effectively "necessary for analytics" data stored locally and never transmitted. Document it in your privacy policy as local-only telemetry.

How it pairs with the rest of the stack

  • CrUX Field Data Probe — Google's CrUX data. Ground truth but 28-day lagged + requires site traffic.
  • Mega SEO Analyzer v2 — lab metrics (what Lighthouse would show).
  • RUM (this tool) — real-user current data from your own browsing / opted-in users.

The three data sources are complementary: CrUX tells you what Google sees, RUM tells you what you see, and Mega SEO tells you what's theoretically possible. Fix the worst gap between them.

When to upgrade to paid RUM

The localStorage RUM works great at small scale. Upgrade to paid when:

  • You need aggregated cross-user data without asking users to export
  • You want alerting on RUM regressions
  • You need INP attribution at the element level (which user action caused the slow INP?)
  • You need session replay on slow pageviews
  • You're serving enough traffic that opt-in export becomes a bottleneck

Sentry, Raygun, DebugBear, SpeedCurve, Calibre — all solid paid RUM options. Pick based on feature fit.

For the 90% case — "am I getting better or worse over time" — this tool is enough.

Limitations

  • No aggregation by default. Each user sees only their own data.
  • localStorage cap ~5-10 MB. 500 entries with full payload is well under cap. Drop max if you see issues.
  • INP support varies. Chrome and Edge report INP; Safari and Firefox don't fully support the underlying Event Timing API yet, so the metric will be null there and on older browsers.
  • No attribution. The tool records the metric value, not what caused it. For element-level attribution, use INP Attribution or a paid tool.
  • User must visit the export page on your site to extract data. Or you check their browser yourself.

Fact-check notes and sources

This post is informational, not engineering or privacy advice. Mentions of Sentry, Raygun, SpeedCurve, Calibre, DebugBear, Cloudflare, Google, and similar products are nominative fair use. No affiliation is implied. Consult a qualified privacy officer for jurisdiction-specific compliance decisions when deploying any analytics snippet.
