Every comprehensive SEO audit has this problem. It scans 80 things. It flags 45 as warnings. About 15 of those 45 genuinely don't apply to your specific site, but the audit has no way to know that.
A real estate agent's personal site doesn't need an ORCID. A motel landing page doesn't need FAQPage schema. A law firm in one city doesn't need hreflang alternates. A local plumber doesn't need Wikidata. A commerce site without user reviews doesn't need AggregateRating. You see the pattern. The audit shows every warning. You mentally filter. The next month you run the audit again. You mentally filter the same 15 items again. You copy the AI fix prompt into Claude or ChatGPT and the LLM starts recommending rel=author on a generic e-commerce homepage because the warning was in the prompt.
The design choice
The Mega Analyzer and Site Analyzer now have a small N/A button on every warning row. Click it. The row goes gray and struck-through. The item is removed from your score. More importantly, the item is removed from the AI fix prompt the tool generates — the prompt that you copy into Claude or ChatGPT to get a prioritized fix plan.
A banner at the top of results shows how many items you've dismissed and the real-time adjusted score — recomputed against only the checks you haven't silenced. Click one N/A button and you watch the adjusted figure tick up in the same frame. No re-scan, no page reload. The original score stays in the overview card so the raw surface is never hidden; the adjusted figure in the banner shows what the site would score if the dismissed checks genuinely didn't apply. Both numbers coexist so you can cite either, depending on context (stakeholder report vs. internal prioritization).
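The recomputation described above is simple enough to sketch. This is a minimal illustration, not the tool's actual code: the check shape (`{ title, passed }`) and the function name are assumptions; the article only specifies that the adjusted score is computed against the checks you haven't silenced.

```javascript
// Sketch: recompute the adjusted score against only non-dismissed checks.
// `checks` is assumed to be [{ title, passed }]; `naSet` is a Set of
// dismissed check titles. Names are illustrative.
function adjustedScore(checks, naSet) {
  const applicable = checks.filter(c => !naSet.has(c.title));
  if (applicable.length === 0) return 100;           // nothing left to grade
  const passed = applicable.filter(c => c.passed).length;
  return Math.round((passed / applicable.length) * 100);
}
```

Dismissing a failing check shrinks the denominator and removes the failure, so the adjusted figure ticks up immediately; the original score is just the same computation with an empty dismissal set.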
One click on "Restore all" brings dismissed rows back. Clicking a dismissed row individually restores just that one.
The dismissal persists in localStorage, keyed by (tool, URL). Next time you run the audit on the same URL, your dismissals come back. If you export the scan (see the export article), the dismissal set travels with the JSON so you can share "here's the scan of this site, with these items already marked N/A because they don't apply" with a teammate or an AI agent.
Why the dismissal also filters the prompt
The prompt is the feature: a prioritized "fix this, then this, then this" list produced by an LLM given every signal from your site. If that list is polluted with 15 inapplicable warnings, the LLM gives you bad advice. Priority rankings are wrong. Effort estimates are inflated. You end up reviewing 45 recommendations instead of 30.
Filtering the prompt takes two lines:

```javascript
const naSet = window.jwGetNASet(toolName, R.url);
const filteredFails = R.crawl.fails.filter(c => !naSet.has(hash(c.title)));
```
Both the Mega Analyzer and Site Analyzer apply the same filter. Every check has a stable hash based on its title. Dismissed titles get filtered out of the fails array before the prompt assembles. The LLM sees "FAILED CHECKS — fix these:" followed by only the checks that actually apply to your site.
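A stable title hash can be produced with any deterministic string hash. The sketch below uses 32-bit FNV-1a, which is an assumption — the article only says the rule ID is a hash of the visible title, not which hash:

```javascript
// Sketch: derive a stable rule ID from a check's visible title.
// FNV-1a is illustrative; the tool's actual hash function is unspecified.
function hashTitle(title) {
  let h = 0x811c9dc5;                          // FNV-1a 32-bit offset basis
  for (let i = 0; i < title.length; i++) {
    h ^= title.charCodeAt(i);
    h = Math.imul(h, 0x01000193) >>> 0;        // multiply by FNV prime, stay unsigned 32-bit
  }
  return h.toString(16);                       // compact hex rule ID
}
```

The only properties that matter here are determinism (the same title always yields the same ID, so dismissals survive re-scans) and sensitivity to the title text (a renamed check gets a new ID, which is exactly the reappear-on-rename behavior described later).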
In the Mega Analyzer, the prompt also regenerates in real time: toggling a check's N/A status fires a `jw-na-changed` event that rebuilds the prompt on the spot. Switch to the Mega AI Prompt tab and the textarea already reflects your latest dismissals. No "re-run the scan" step, no stale clipboard content. Copy the prompt whenever — it's always current.
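The event wiring might look like the following. This is a sketch under stated assumptions: in the browser the event bus would be `document` and the rebuilt prompt would go into the textarea; here a bare `EventTarget` and a variable stand in so the mechanism is visible on its own, and every name except `jw-na-changed` is illustrative.

```javascript
// Sketch: rebuild the AI prompt whenever a check's N/A status toggles.
const bus = new EventTarget();               // stands in for `document`
let currentPrompt = "";                      // stands in for the prompt textarea

function buildPrompt(fails, naSet) {
  const kept = fails.filter(c => !naSet.has(c.title));
  return "FAILED CHECKS — fix these:\n" + kept.map(c => "- " + c.title).join("\n");
}

function wirePromptRebuild(fails, naSet) {
  // Rebuild on every toggle, so the prompt tab is never stale.
  bus.addEventListener("jw-na-changed", () => {
    currentPrompt = buildPrompt(fails, naSet);
  });
}

function toggleNA(naSet, title) {
  naSet.has(title) ? naSet.delete(title) : naSet.add(title);
  bus.dispatchEvent(new Event("jw-na-changed"));   // fires synchronously
}
```

Because `dispatchEvent` runs listeners synchronously, the prompt is already current by the time the click handler returns — no debounce or re-scan needed.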
At the bottom of the prompt there's a small note: "NOTE — the following checks were marked Not Applicable by the user and are excluded from this fix plan: [list]." The LLM gets transparency about what was silenced, so if one of the dismissed items is actually important to the broader fix plan (for example, you dismissed rel=author but the plan recommends creating a Person schema anyway), the LLM can acknowledge the gap in its response. Nothing is hidden; dismissals are declared, not secret.
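Appending that transparency note is a one-function job. A minimal sketch, assuming the dismissed titles are available as an array (the function name and signature are invented for illustration):

```javascript
// Sketch: declare dismissed checks at the bottom of the prompt rather than
// hiding them. The note text mirrors the article's wording.
function appendDismissalNote(prompt, dismissedTitles) {
  if (dismissedTitles.length === 0) return prompt;   // nothing silenced, nothing to declare
  return prompt +
    "\n\nNOTE — the following checks were marked Not Applicable by the user " +
    "and are excluded from this fix plan: " + dismissedTitles.join(", ");
}
```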
The guardrail against gaming
Dismissing inapplicable items is legitimate. Dismissing critical failures to inflate your score is not. The tool guards against this in two ways.
First: only warning and info severities can be dismissed. Critical fails (missing HTTPS, broken canonical, soft 404, empty SPA shell) cannot be marked N/A. The N/A button is not rendered on critical rows. You cannot click-to-hide a real production issue.
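The simplest way to enforce that rule is at render time: the button just doesn't exist on critical rows, so there is nothing to click. A sketch, assuming a critical/warn/info severity field on each check (the row markup and class names are hypothetical):

```javascript
// Sketch: only warning- and info-severity checks get an N/A toggle.
function canDismiss(check) {
  return check.severity === "warn" || check.severity === "info";
}

function renderRow(check) {
  // Hypothetical row markup: the button is simply absent on critical rows,
  // so a real production issue cannot be click-hidden.
  const btn = canDismiss(check) ? ' <button class="jw-na">N/A</button>' : "";
  return `<li data-severity="${check.severity}">${check.title}${btn}</li>`;
}
```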
Second: the banner showing dismissed-count is always visible. There is no way to silently dismiss 20 items and produce a green report without the banner saying "20 checks marked Not Applicable." If you're sharing the scan with a stakeholder or a client, they see the dismissal count. Hiding it would require editing the tool source, at which point the integrity of the audit is your responsibility.
What to actually dismiss
Legitimate dismissals for a small-business site:
- `rel=author` and author-profile links, if the site has no bylined content
- Wikidata reference, unless you've published a bio or product to Wikidata
- ORCID, unless you have peer-reviewed publications
- `AggregateRating` schema, if you have no reviews yet
- Speakable schema, if your content isn't voice-ready
- `hreflang` x-default, if your site is single-locale only
- FAQPage schema, if your page has no Q&A content
- JSON Feed discoverable, if your site has no syndicatable content (single-property landing page, one-off portfolio)
Legitimate dismissals for a specific page type:
- Most homepage warnings about "visible Updated-on stamp" — homepages are evergreen, a date stamp reads as weird there
- Author bio links on a policy page (privacy, terms)
- Internal links warning on a single-page site
Legitimate dismissals for a specific industry:
- Legal, medical, and financial sites often don't want rich snippets on sensitive pages — dismiss the `max-snippet:-1` recommendation for those pages
- Multi-location businesses managing LocalBusiness via Google Business Profile may not want geo meta inline on every page — dismiss on sub-pages
The common thread: dismissal expresses a policy decision about what applies to your site. It is not a workaround for real failures. Used that way, it turns a 45-item warning list into the 30 items that actually matter to you.
What happens to dismissed items in the Full Summary tab
The Full Summary tab in Mega Analyzer aggregates every check across all buckets. Dismissed items still appear there — struck-through, faded — so you can audit your own dismissals. If you later change your mind on a dismissal, you click the row and it un-dismisses. No rebuild of the audit required.
Persistence model
Every dismissal is stored in localStorage under `jw-na-<tool>-<url>`. The value is a JSON array of rule-id hashes. Two consequences:

- The dismissal is scoped to the combination of tool + URL. Dismissing "rel=author" on your homepage does not dismiss "rel=author" warnings on your blog. That is intentional — rel=author may be irrelevant on your homepage (a company site) but required on your blog (bylined content).
- Clearing your browser's localStorage clears all dismissals. If you want to preserve them across machines or share them with a teammate, export the scan (JSON file) and they can import it to restore both the scan data and the dismissal state.
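In code, that model is a key builder plus load/save helpers. The key shape (`jw-na-<tool>-<url>`) comes from the article; the storage parameter is injectable here so the logic can be exercised outside a browser — in the tool itself you would pass `window.localStorage`. Function names are illustrative.

```javascript
// Sketch of the persistence model: one localStorage entry per (tool, URL),
// holding a JSON array of rule-id hashes.
function naKey(tool, url) {
  return `jw-na-${tool}-${url}`;
}

function loadNASet(storage, tool, url) {
  const raw = storage.getItem(naKey(tool, url));
  return new Set(raw ? JSON.parse(raw) : []);
}

function saveNASet(storage, tool, url, naSet) {
  storage.setItem(naKey(tool, url), JSON.stringify([...naSet]));
}
```

Because the URL is part of the key, a dismissal on the homepage never leaks to the blog — the two pages read different entries.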
Rule IDs and why they're hashes of the title
Every check's rule ID is derived by hashing its visible title. This means:
- Renaming a check breaks the dismissal (a renamed check reappears, which is correct — if the check's meaning changed, your old dismissal decision should not automatically apply)
- No coupling to a fragile numeric ID scheme that breaks when tools get rewritten
- Humans can read dismissal lists in the export JSON — the rule ID is opaque but every dismissal has an adjacent `title` field for review
This trade-off favors stability of user decisions over stability of internal IDs. It costs us a tiny amount of fragility (if we change a check title, dismissals for that check fall off) in exchange for never breaking a user's dismissal set when we refactor internals.
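For a concrete picture, a dismissal entry in the export JSON might look like this. The article guarantees only an opaque rule-id hash plus an adjacent title field; the field names and values below are illustrative, not the actual export schema:

```json
[
  { "ruleId": "c41f2a9b", "title": "rel=author link present" },
  { "ruleId": "7d03e6c1", "title": "ORCID profile linked" }
]
```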
Related reading
- Export your audit: make every SEO scan reproducible — companion feature; export the scan (with dismissals) as JSON, re-run or regenerate the prompt later
- max-snippet, max-image-preview, max-video-preview — one example of a warning most sites should fix, not dismiss
- dateModified and visible freshness stamps — another warning where "it's a marketing homepage, N/A" is a legitimate dismissal
- Geo meta for local-business pages — applicable to LocalBusiness, legitimately N/A for SaaS
Fact-check notes and sources
- `localStorage` browser support: MDN Web Storage API
- Severity levels (critical/warn/info) follow the taxonomy documented in WCAG's conformance evaluation methodology for binary pass/fail checks vs. advisory warnings
- The idea that LLMs benefit from transparent dismissal notes rather than silent filtering is discussed in Anthropic's prompt engineering guide — giving the model metadata about what was excluded is more robust than hiding it
- `rel=author` usage guidance: Google Search Central — Author Credits recommends the Article schema's `author.url` field rather than the older `rel=author` link, which is why some sites legitimately don't need it
Part of the jwatte.com audit toolkit. Run the Mega Analyzer on your own site to see the N/A toggles live. The dismissals persist across sessions; the AI fix prompt filters to only the checks you haven't dismissed.