Every outbound link on your page is a signal. Link to a .gov source and you're telling Google this content is grounded in official data. Link to a random WordPress blog with no byline and you're telling Google you couldn't find a better source. Most site owners never think about this. They link wherever is convenient and move on.
The problem is that search engines do think about it. Google's quality rater guidelines explicitly mention whether a page links to authoritative, well-sourced content. E-E-A-T isn't just about what's on your page. It's about the company your page keeps.
What the link graph actually reveals
When you map every outbound link on a page and classify each destination by authority tier, patterns emerge fast.
A well-researched article might link to two government sources, a university study, a major news outlet, and one or two industry-specific publications. That's a healthy link graph. The diversity signals that the author consulted multiple types of sources before publishing.
A thin affiliate page links to three product pages on Amazon and nothing else. That's a link graph that screams "this content exists to sell, not to inform."
Most pages fall somewhere in between, and that's where the audit gets interesting. You might discover that 80% of your outbound links go to a single domain. Or that every source you cite shares the same political lean. Or that half your links are broken because the pages you referenced three years ago have since moved or died.
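The mapping step described above is straightforward to sketch. Here's a minimal version using only the Python standard library: it pulls absolute links out of a page's HTML and tallies them by destination domain, excluding your own. The sample HTML and `example.com` domain are illustrative, not from any real audit.

```python
from collections import Counter
from html.parser import HTMLParser
from urllib.parse import urlparse

class OutboundLinkParser(HTMLParser):
    """Collects href targets from <a> tags, keeping only absolute http(s) links."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href", "")
            if href.startswith(("http://", "https://")):
                self.links.append(href)

def outbound_domains(html: str, own_domain: str) -> Counter:
    """Tally outbound links by destination domain, excluding internal links."""
    parser = OutboundLinkParser()
    parser.feed(html)
    domains = (urlparse(link).netloc.lower() for link in parser.links)
    return Counter(d for d in domains if d and d != own_domain)

page = """
<p>See <a href="https://www.cdc.gov/flu">CDC</a>,
<a href="https://example.com/about">our about page</a>,
and <a href="https://www.reuters.com/health">Reuters</a>.</p>
"""
print(outbound_domains(page, "example.com"))
```

A skewed tally from this step is often the first hint of the concentration problems discussed below: one domain soaking up most of the counts is visible at a glance.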
Authority tiers matter more than link count
The Link Graph Depth Audit classifies each outbound domain into tiers:
- Government and institutional (.gov, .mil, WHO, CDC, official statistical agencies) carry the most weight. If you're making a factual claim and can back it with a government source, that's the strongest citation you can place.
- Academic (.edu, research institutions, peer-reviewed journals) sit just below. A link to a published study on PubMed or JSTOR tells both readers and crawlers that you did the work.
- Major publishers (NYT, Reuters, AP, BBC, WSJ) provide credibility through editorial standards. These organizations have fact-checking processes that smaller outlets don't.
- Known industry sources (recognized trade publications, established blogs with editorial oversight) are solid but carry less implicit authority than the tiers above.
- Unknown or low-authority domains are where most sites get into trouble. There's nothing wrong with linking to a small blog if the content is genuinely good. But if your entire link graph points to unknown sources, it raises questions about the quality of your research.
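The tiering above can be approximated with a simple lookup. This is a sketch, not the tool's actual registry: the domain sets are illustrative placeholders (only WHO, CDC, and the named publishers come from the text; `searchengineland.com` is a hypothetical industry-tier entry), and the root-domain heuristic is deliberately naive.

```python
from urllib.parse import urlparse

# Illustrative tier sets -- placeholders, not a real curated registry.
INSTITUTIONAL = {"who.int", "cdc.gov"}
MAJOR_PUBLISHERS = {"nytimes.com", "reuters.com", "apnews.com", "bbc.com", "wsj.com"}
KNOWN_INDUSTRY = {"searchengineland.com"}  # hypothetical example entry

def _root(domain: str) -> str:
    # Naive "last two labels" heuristic; production code should use a
    # public-suffix list to handle domains like example.co.uk correctly.
    return ".".join(domain.split(".")[-2:])

def authority_tier(url: str) -> str:
    """Classify an outbound URL's destination into a rough authority tier."""
    domain = urlparse(url).netloc.lower().removeprefix("www.")
    root = _root(domain)
    if domain.endswith((".gov", ".mil")) or root in INSTITUTIONAL:
        return "government/institutional"
    if domain.endswith(".edu"):
        return "academic"
    if root in MAJOR_PUBLISHERS:
        return "major publisher"
    if root in KNOWN_INDUSTRY:
        return "known industry"
    return "unknown/low-authority"
```

For example, `authority_tier("https://www.cdc.gov/flu")` lands in the government/institutional tier, while an unrecognized domain falls through to unknown/low-authority.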
The diversity problem
Beyond authority tiers, the tool scores outbound-domain diversity. If every citation on your page goes to the same three domains, that's a concentration risk. Search engines want to see that you consulted a breadth of sources, not that you found one source and kept quoting it.
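The post doesn't specify how the tool computes its diversity score, but one standard way to quantify concentration risk is the Herfindahl-Hirschman index over outbound domains: it ranges from 1/n (links spread evenly across n domains) up to 1.0 (every link to one domain). A sketch under that assumption:

```python
from collections import Counter

def domain_concentration(domains: list[str]) -> float:
    """Herfindahl-Hirschman index over outbound-link domains.

    Ranges from 1/n (links spread evenly over n domains) to 1.0
    (all links point at a single domain). Higher = more concentrated.
    """
    counts = Counter(domains)
    total = sum(counts.values())
    return sum((c / total) ** 2 for c in counts.values())

# Healthy spread: five links across five domains.
even = domain_concentration(["cdc.gov", "nih.gov", "reuters.com", "mit.edu", "who.int"])
# Concentration risk: four of five links to one domain.
skewed = domain_concentration(["example.com"] * 4 + ["cdc.gov"])
```

Here `even` comes out to 0.2 and `skewed` to 0.68, which makes the "80% to a single domain" scenario from earlier easy to flag with a simple threshold.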
This is especially relevant for YMYL (Your Money or Your Life) content. Health, finance, legal, and safety content gets extra scrutiny. If you're writing about medication side effects and your only source is a single pharmaceutical company's website, that's a problem regardless of how authoritative that one source might be.
Primary source citation depth
The tool also measures how often you cite primary sources versus secondary coverage. Linking to a news article about a study is fine. Linking to the actual study is better. The deeper your citation chain goes toward the original source, the more credible your content appears.
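A crude heuristic for this measurement: treat links to known primary-source hosts (or DOI-style paths) as primary, and everything else as secondary coverage. The host list and the `/10.` DOI-path pattern below are illustrative assumptions, not the tool's actual rules.

```python
from urllib.parse import urlparse

# Hypothetical marker set: hosts that usually serve primary material.
PRIMARY_HOSTS = {"doi.org", "pubmed.ncbi.nlm.nih.gov", "arxiv.org", "clinicaltrials.gov"}

def looks_primary(url: str) -> bool:
    """Rough guess at whether a citation points at a primary source."""
    parsed = urlparse(url)
    host = parsed.netloc.lower().removeprefix("www.")
    # "/10." catches DOI-style paths like /10.1000/xyz
    return host in PRIMARY_HOSTS or "/10." in parsed.path

def primary_ratio(urls: list[str]) -> float:
    """Share of citations pointing at primary sources rather than secondary coverage."""
    return sum(looks_primary(u) for u in urls) / len(urls) if urls else 0.0
```

A page citing one study on doi.org and one news write-up of that study would score 0.5; linking the study itself alongside the coverage is the cheap way to raise it.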
This matters increasingly for AI search. When an LLM is deciding which content to cite in an AI Overview or a Perplexity answer, it's looking for content that demonstrates genuine research depth. Surface-level content that just summarizes other summaries gets passed over.
If you're building a content operation that needs to earn trust at scale, The $20 Dollar Agency ($9.99 on Kindle) covers how to set up quality standards that hold even when you're producing volume.
Fact-check notes and sources
- Google's Search Quality Rater Guidelines (v16.0, 2024) define E-E-A-T evaluation criteria including assessment of outbound link quality and source reliability.
- Google's "How Search Works" documentation describes link analysis as part of ranking systems.
- YMYL classification and heightened quality standards are detailed in Google's quality rater guidelines, Section 2.3.
Related reading
- Why the WCAG audit exists — another trust signal search engines evaluate
- Claims and attribution patterns — backing up what you say with evidence
- Trust signal surface audit — the full picture of credibility markers
- Passage retrieval and SEO — how search engines pick which section to cite
This post is informational, not SEO-consulting advice. Mentions of Google, Perplexity, and other third parties are nominative fair use. No affiliation is implied.