AI Search Cites What Brands Control


AI Search Research · Law Firm Strategy

AI Search Cites What Brands Control: What Yext’s 6.8M-Citation Study Means for Law Firms

A 6.8 million-citation analysis across ChatGPT, Gemini and Perplexity found that 86% of AI citations come from sources brands already manage. Here is what that means for law firm AEO, GEO, AIO, SEO and E-E-A-T strategy — with the caveats that matter for legal marketers.


🎯 Key Takeaways

  • 86% of AI citations come from sources brands already control — websites, listings, and reviews/social — per Yext’s analysis of 6.8M citations across ChatGPT, Gemini and Perplexity (Yext Research, October 9, 2025; data collected July 1–August 31, 2025).
  • First-party websites generated 44% of citations (~3.0M), listings 42% (2.9M), reviews/social 8% (545K), and forums like Reddit just 2% once location and query intent were applied (Yext, October 9, 2025).
  • Models source differently: Gemini favored websites (52.1%), OpenAI/ChatGPT leaned on listings (48.7%), and Perplexity diversified across directories like MapQuest and TripAdvisor (Yext, October 9, 2025).
  • The study covered four industries — retail, financial services, healthcare and food service — and did not include legal services, so direct extrapolation to law firms requires care (Yext methodology disclosure, October 9, 2025).
  • For law firms, the directional implication is clear: invest in first-party site architecture, accurate local listings, structured data, and review signals before chasing Reddit threads or off-site “hacks.”

Yext analyzed 6.8 million AI citations across ChatGPT, Gemini and Perplexity between July and August 2025 and found that 86% came from brand-controlled sources — websites, listings, and reviews. For law firms, that finding reframes AEO, GEO, AIO and SEO around one practical idea: own the sources AI models cite, and citation visibility follows.

A widely shared social commentary from marketer Ben Wills surfaced the same tension many legal marketers are now wrestling with: a CMO insists AEO and GEO are “a load of crap” and that good SEO is enough, while new research from Yext appears to back the SEO-first view by showing that most AI citations come from sources brands already own. The reality is more nuanced — and worth working through carefully before reallocating any law firm marketing budget.

The October 2025 Yext study is the largest publicly disclosed citation analysis to date. It does not, however, study legal services directly, and its methodology has specific limits that matter for attorneys. At InterCore’s GEO services hub, we work with personal injury, family law and criminal defense firms across 24+ states, and the operational lesson from this study is not “abandon AEO” — it is “make sure the sources AI models do cite are ones you actually control.”

This post unpacks the Yext findings, distinguishes between AEO, GEO, AIO, SEO and E-E-A-T as practical disciplines, and outlines a citation-first playbook for law firms. For background on how these disciplines relate to one another, see our GEO vs SEO comparison guide and our explainer on Answer Engine Optimization for law firms.

🔍 What Yext Actually Found

On October 9, 2025, Yext (NYSE: YEXT) published a study analyzing 6.8 million AI citations collected from 1.6 million queries per model across ChatGPT (OpenAI), Gemini (Google), and Perplexity between July 1 and August 31, 2025. Queries tested four intent quadrants — branded/unbranded crossed with objective/subjective — across four industries: retail, financial services, healthcare and food service. The dataset identified 20,820 unique citation domains overall.

The 86% Headline

Across all industries tested, Yext reported that 86% of AI citations came from sources brands already control or manage — primarily first-party websites, business listings, and brand-managed reviews/social properties. Forums such as Reddit accounted for just 2% of citations once location context and query intent were factored in (Yext Research, October 9, 2025).

The breakdown of citation sources across the full dataset:

  • First-party websites: ~3.0M citations (44%)
  • Listings: 2.9M citations (42%)
  • Reviews / social content: 545K citations (8%)
  • Forums (Reddit and similar): ~2%
  • Other / long-tail: remainder (~4%)

Source: Yext Research, “86% of AI Citations Come from Brand-Managed Sources,” October 9, 2025.

How Each Model Sources Differently

One of the more useful findings — and one that complicates any single “AI search” strategy — is that the three models have distinct sourcing preferences. Per Yext (October 9, 2025):

  • Gemini (Google): 52.1% of citations came from brand-owned websites — behavior consistent with its grounding in Google Search.
  • ChatGPT (OpenAI): 48.7% of citations came from listings, suggesting heavier reliance on directory and third-party data layers.
  • Perplexity: Diversified across niche directories — MapQuest and TripAdvisor were called out specifically — with stable behavior across sectors.

This matters because it means optimizing only for ChatGPT (the most-discussed model in 2024–2025) risks invisibility in Gemini, and vice versa. Our ChatGPT optimization guide, Google Gemini optimization guide, and Perplexity AI optimization guide each address the platform-specific signals in more depth.

The “Just 2% Reddit” Caveat

The headline that drew the most social commentary — including the post that prompted this article — was that Reddit and similar forums accounted for only about 2% of citations once location context and query intent were applied. That figure should be read carefully.

⚠️ Limitations to keep in mind:

  • Yext’s study tested retail, financial services, healthcare and food service — not legal services.
  • Queries were structured around four intent quadrants and gathered via Yext’s Scout platform; B2B and complex high-consideration queries (the bulk of attorney searches) were not specifically isolated.
  • The 2% Reddit figure is averaged across all tested industries and intent types; it does not mean Reddit citations are uniformly 2% for every query, every industry, or every model.
  • Independent studies using different query sets have reported higher forum-citation shares for certain query types.

(Yext Research methodology disclosure, October 9, 2025.)

⚖️ Translating the Findings: AEO, GEO, AIO, SEO & E-E-A-T

Before reading the Yext data as either a vindication of SEO or a dismissal of AEO, it helps to define each discipline precisely. These acronyms are often used interchangeably; they should not be.

SEO — The Foundation

Search Engine Optimization (SEO) is the discipline of earning visibility in traditional ranked search results — Google and Bing’s blue links, local 3-pack, and organic SERPs. Its inputs (crawlable architecture, page experience, authoritative content, link signals, structured data) overlap heavily with what AI models need to retrieve, parse and cite information.

The Yext study reinforces, rather than displaces, SEO. When 44% of AI citations come from first-party websites, the implication is that the same crawlability, content quality and schema work that drives traditional rankings is also feeding AI answers. Our law firm SEO services and local SEO program are built around this overlap.

AEO — Answer Engine Optimization

Answer Engine Optimization (AEO) targets visibility in direct-answer surfaces: Google’s featured snippets, “People Also Ask,” voice assistants, and the synthesized answers AI tools generate before linking out. AEO tactics emphasize Q&A formatting, clear definitions, concise direct-answer paragraphs, FAQ schema, and entity clarity.

The Yext finding that 86% of citations come from brand-managed sources doesn’t invalidate AEO — it specifies where AEO effort should be applied. AEO best practices like structured Q&A pages and clear topical answers should live primarily on the firm’s own site, not be outsourced to forum posts hoping for citation. For a deeper walkthrough, see our guide on Answer Engine Optimization for law firms in 2026.

GEO — Generative Engine Optimization

Generative Engine Optimization (GEO) is the discipline of optimizing content so it is more likely to be selected, quoted, and cited inside generative AI responses (ChatGPT, Gemini, Perplexity, Claude, Copilot). The term was formalized in peer-reviewed research presented at the 30th ACM SIGKDD Conference on Knowledge Discovery and Data Mining (KDD ’24), Barcelona, Spain, August 25–29, 2024, where Aggarwal et al. demonstrated that specific content modifications — including the strategic use of quotations, statistics and citations — could improve generative engine visibility by up to 40% on tested benchmarks (Aggarwal et al., 2024, DOI: 10.1145/3637528.3671900).

The Yext data is fully compatible with GEO theory. If 86% of generative-engine citations are coming from sources brands already control, then the GEO levers — content design, citation density, schema, structured Q&A — are most effective when applied to the firm’s own pages. Our GEO for law firms pillar and GEO audit service operationalize this.

AIO — AI Optimization (and AI Overviews)

AI Optimization (AIO) is the umbrella term increasingly used to describe optimization across all AI-driven discovery surfaces — including Google’s AI Overviews (formerly Search Generative Experience), Microsoft Copilot answers, and embedded AI search inside other platforms. AIO subsumes AEO and GEO for many practitioners, though usage is not yet standardized.

For attorneys, the practical AIO question is: when a prospect asks “best personal injury lawyer near me” inside an AI surface, does the firm appear in the synthesized answer, and is the cited source one the firm controls? The Yext findings suggest that, more often than not, the cited source will be a website, listing, or review the firm could have managed. Our AI search optimization framework and LLM optimization for law firms guide cover this in detail.

E-E-A-T — The Underlying Quality Signal

E-E-A-T (Experience, Expertise, Authoritativeness, Trustworthiness) is Google’s quality framework, originally articulated in its Search Quality Rater Guidelines. It is not a ranking algorithm directly, but a set of signals that human raters use to evaluate content quality — signals that are increasingly relevant to how AI systems evaluate which sources to cite, particularly in YMYL (Your Money or Your Life) categories like legal services.

The Yext findings are consistent with an E-E-A-T-aware retrieval logic: AI models appear to weight first-party authoritative content and structured directory data heavily — both signals that align with what Google’s framework values. For law firms, where YMYL standards apply with full force, E-E-A-T is not optional. See our E-E-A-T safety guardrails for law firm marketing for the operational checklist.

The takeaway: AEO, GEO, AIO and SEO are not competing strategies — they are layers of the same underlying citation economy. The Yext data tells law firms to concentrate those layers on assets they own, rather than chase visibility in surfaces they cannot directly influence.

🏛️ What This Means for Law Firms Specifically

The Yext study is the strongest evidence yet that brand-controlled assets drive AI visibility, but legal marketing has features that should adjust how the findings are applied.

1. Legal services were not in the dataset. Yext tested retail, financial services, healthcare and food service. Healthcare is the closest analog to legal in some respects — both are high-trust, regulated, locally-delivered services with directory ecosystems. In healthcare, Yext found that 52.6% of citations came from third-party directories like WebMD and Vitals, more than from first-party sites. For law firms, the analogous directories include Justia, Avvo, FindLaw, Martindale-Hubbell and Super Lawyers. The directional implication is that both first-party site quality and legal directory presence likely matter for AI citation visibility in this vertical.

2. Legal queries are often complex and high-consideration. Yext acknowledged that its query set tested four intent quadrants — branded/unbranded crossed with objective/subjective — but did not isolate B2B or long-form legal-style queries. A query like “what should I do after a rear-end accident in Los Angeles” generates a different citation pattern than “best Italian restaurant near me.” Practitioner observations suggest that AI models do pull from forum and Q&A content for some legal queries, particularly when the asker is in pre-consult research mode (Yext methodology disclosure, October 9, 2025).

3. Local intent is dominant in legal search. The Yext finding that “AI generates answers based on a person’s real-world location and context, not a generic brand view” maps directly onto legal client acquisition. Most consumer legal queries are local. This makes AI-powered local optimization more important for law firms than it might be for other verticals.

4. Compliance and review constraints affect what firms can publish. Bar association advertising rules limit how testimonials, results, and comparative claims can be presented — even when those signals are exactly what AI models tend to surface. Law firm AEO and GEO strategy must work within those constraints. A general-industry citation playbook will need adjustment before it is deployed for an attorney.

⚠️ Limitations:

No public study to date has analyzed AI citation patterns for legal services at the scale of the Yext analysis. Recommendations below are derived by combining Yext’s general-industry findings, the healthcare analog within that study, the GEO research published at KDD ’24, and InterCore’s practitioner observations across law firm engagements. As legal-specific data becomes available, this playbook should be revisited.

📋 A Citation-First Playbook for Law Firms

The most defensible reading of the Yext data, combined with the GEO research from KDD ’24, is that law firms should invest first in the four asset classes AI models cite most often — and apply AEO/GEO best practices on top of those assets, not separately from them.

1. First-Party Site Architecture

Because first-party websites generated the largest single share of citations across all tested industries (44% per Yext, October 9, 2025), the firm’s own site is the highest-leverage AI visibility asset. Practical steps include practice-area pillar pages with clear topical scope, location pages tied to a hub-and-spoke architecture, FAQ pages with FAQPage schema, and direct-answer paragraphs (30–50 words) at the top of every important page. Our 10 SEO rules rewritten for the AI search era walks through the specifics.
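As an illustration of the FAQ-schema step above, here is a minimal sketch that builds a schema.org FAQPage payload as JSON-LD using Python. The question, the answer text, and the firm details are hypothetical placeholders, not content from the study or a real firm; the answer is kept to the short direct-answer length this section recommends.

```python
import json

# Hypothetical FAQ entry for a practice-area page; the text is a placeholder,
# not legal advice or real firm content.
faq_page = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [{
        "@type": "Question",
        "name": "How long do I have to file a personal injury claim?",
        "acceptedAnswer": {
            "@type": "Answer",
            # Keep this text in sync with the visible on-page answer; Google's
            # structured-data guidelines require markup to match visible content.
            "text": ("Filing deadlines (statutes of limitations) vary by state "
                     "and claim type; many states allow two years from the date "
                     "of injury, but exceptions apply, so confirm your state's rule."),
        },
    }],
}

# Emit as the payload for a <script type="application/ld+json"> tag.
print(json.dumps(faq_page, indent=2))
```

The same dictionary-then-`json.dumps` pattern works for any of the schema types discussed later in this playbook.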

2. Local Listings & NAP Consistency

Listings accounted for 42% of citations in the Yext dataset — nearly tied with websites — and OpenAI’s ChatGPT leaned on listings most heavily (48.7%). For law firms, that translates to disciplined Google Business Profile management, accurate NAP (name, address, phone) data across legal directories (Justia, Avvo, FindLaw, Martindale-Hubbell, Super Lawyers), and structured local landing pages for every physical office. AI-powered local optimization is where most of this work happens.
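The listing-consistency work described above can be spot-checked programmatically. Below is a minimal sketch, assuming each listing source has already been exported into name/address/phone tuples; the firm name, sources, and values are invented for illustration.

```python
import re

def _clean(s: str) -> str:
    """Lowercase, strip punctuation, and collapse whitespace."""
    return re.sub(r"\s+", " ", re.sub(r"[^\w\s]", "", s.lower())).strip()

def normalize_nap(name: str, address: str, phone: str) -> tuple:
    """Normalize a Name/Address/Phone record so cosmetic differences
    (punctuation, phone formatting) do not register as mismatches."""
    digits = re.sub(r"\D", "", phone)[-10:]  # last 10 digits, drops country code
    return (_clean(name), _clean(address), digits)

def nap_mismatches(listings: dict) -> list:
    """Compare each listing's normalized NAP against the firm's website record.
    `listings` maps a source name to a (name, address, phone) tuple."""
    baseline = normalize_nap(*listings["website"])
    return [src for src, rec in listings.items()
            if src != "website" and normalize_nap(*rec) != baseline]

listings = {
    "website":         ("Smith & Lee Injury Law", "100 Main St, Suite 200, Austin, TX", "(512) 555-0142"),
    "google_business": ("Smith & Lee Injury Law", "100 Main St Suite 200, Austin TX",   "512-555-0142"),
    "avvo":            ("Smith and Lee Injury Law", "100 Main St, Suite 200, Austin, TX", "(512) 555-0142"),
}
print(nap_mismatches(listings))  # → ['avvo']  ("and" vs "&" survives normalization)
```

A real audit would pull these records via each platform's export or API rather than by hand, but the comparison logic is the same.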

3. Reviews & Reputation Signals

Reviews and brand-managed social content drove 8% of citations across the Yext dataset — modest in aggregate, but concentrated in subjective and branded queries (and 13.3% in food service specifically). For law firms, this means review velocity, response quality, and reputation across Google, Avvo and Yelp matter — bounded, of course, by state bar advertising rules on testimonials and endorsements. Sustained review programs that comply with ABA Model Rule 7.1 and state-specific variants should be a standing component of any AI visibility plan.

4. Schema & Structured Data

None of the above works at full effect without machine-readable structure. Schema.org markup — LegalService, LocalBusiness, Attorney, FAQPage, Article, and BreadcrumbList at minimum — is how AI retrieval systems disambiguate entities, locations, and topics. Google’s own Search Central documentation outlines the canonical entity types and validation tools. Skipping schema is the most common reason law firm sites underperform on AI citation tests despite strong content.
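A minimal sketch of the kind of LegalService markup this section recommends, generated here as JSON-LD (the format Google's documentation recommends) via Python's json module. Every value below (firm name, URL, address, profile links) is a hypothetical placeholder.

```python
import json

# Illustrative JSON-LD for a law firm office page; all values are placeholders.
legal_service = {
    "@context": "https://schema.org",
    "@type": "LegalService",
    "name": "Smith & Lee Injury Law",
    "url": "https://www.example-firm.com/",
    "telephone": "+1-512-555-0142",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "100 Main St, Suite 200",
        "addressLocality": "Austin",
        "addressRegion": "TX",
        "postalCode": "78701",
        "addressCountry": "US",
    },
    "areaServed": "Austin, TX",
    # sameAs ties the entity to the directory profiles discussed earlier,
    # which helps retrieval systems disambiguate the firm.
    "sameAs": [
        "https://www.avvo.com/attorneys/example",
        "https://www.justia.com/lawyers/example",
    ],
}

# Emit as a <script type="application/ld+json"> payload for the page <head>.
print(json.dumps(legal_service, indent=2))
```

Validate the output with Google's Rich Results Test before deploying; the address and phone values here should match the normalized NAP records used across listings.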

5. Citation-Worthy Content Design (the GEO Layer)

On top of these four asset classes, the Aggarwal et al. (KDD ’24) research demonstrates that content design choices — inline citation density, statistic anchoring, quotation use, and authoritative source linking — measurably increase the probability that a generative engine selects a page as a citation source. This is the GEO layer. It applies on top of strong SEO and AEO foundations, not as a replacement.
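One way to make the GEO layer auditable is to count the content features Aggarwal et al. tested (statistics, quotations, and source attributions) before and after a rewrite. The sketch below is a crude regex heuristic of our own, not the paper's benchmark, and the phrase patterns are assumptions chosen for illustration.

```python
import re

def geo_signal_counts(text: str) -> dict:
    """Rough proxy counts for GEO content features: statistics (numbers
    paired with %, 'percent', or magnitude words), quoted passages, and
    attribution phrases. A heuristic only."""
    return {
        "statistics": len(re.findall(
            r"\b\d[\d,.]*\s*(?:%|percent|million|billion)", text)),
        "quotations": len(re.findall(r'“[^”]+”|"[^"]+"', text)),
        "attributions": len(re.findall(
            r"\b(?:according to|per)\b|source:", text, re.I)),
    }

sample = ("Per Yext, 86% of AI citations come from brand-managed sources, "
          "and first-party sites drove 2.9 million of 6.8 million citations.")
print(geo_signal_counts(sample))  # → {'statistics': 3, 'quotations': 0, 'attributions': 1}
```

Tracking these counts across a page's revisions gives a cheap before/after signal for whether the GEO layer is actually being applied, even though selection by a generative engine is the only metric that ultimately matters.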

📊 Example Measurement Framework

Yext’s own research noted that 64% of marketing leaders said they were unsure how to measure success in AI search (Yext, October 9, 2025). A practical framework for law firms looks like this:

  1. Baseline documentation: Before implementation, run 20–50 representative queries across ChatGPT, Gemini, Perplexity, Google AI Overviews, and Microsoft Copilot. Capture which sources are cited, whether the firm is mentioned, and whether the mention is accurate.
  2. Query set definition: Build the query set from practice areas served, locations covered, and common pre-consult research patterns (e.g., “do I need a lawyer for [X],” “what is the deadline to file [Y] in [State]”).
  3. Measurement cadence: Re-test the same query set monthly. AI model behavior shifts; a one-time audit is not enough.
  4. Reporting metrics: Track four numbers — citation rate (the firm cited at all), mention rate (firm named even if not linked), accuracy rate (cited information is correct), and competitor comparison (which competing firms appear).
  5. Source attribution: When the firm is cited, log which asset was the source — the website, a directory listing, a review platform, a third-party article. Use this to direct the next quarter’s investment.
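The five steps above can be sketched as a simple audit log plus a rates function. The field names and sample rows below are illustrative, not a standard audit format.

```python
# Minimal sketch of the monthly re-test log described above.
audit_log = [
    # (query, model, firm_cited, firm_mentioned, info_accurate, source_type)
    ("car accident lawyer austin", "chatgpt",    True,  True,  True, "listing"),
    ("car accident lawyer austin", "gemini",     True,  True,  True, "website"),
    ("car accident lawyer austin", "perplexity", False, True,  None, None),
    ("do i need a lawyer for a dui in texas", "chatgpt", False, False, None, None),
]

def rates(log):
    """Compute the four reporting metrics plus the source-attribution set."""
    n = len(log)
    cited     = sum(1 for r in log if r[2])
    mentioned = sum(1 for r in log if r[3])
    accurate  = sum(1 for r in log if r[4])
    return {
        "citation_rate": cited / n,
        "mention_rate": mentioned / n,
        # accuracy is measured over citations only: of the answers that
        # cited the firm, how many were factually correct
        "accuracy_rate": accurate / cited if cited else 0.0,
        # which owned assets are doing the work -> next quarter's investment
        "sources": {r[5] for r in log if r[5]},
    }

print(rates(audit_log))
```

Competitor comparison is the one metric not shown: it requires logging which competing firms appear per query, which extends the row format but not the logic.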

Our GEO audit for law firms productizes this baseline process, and the citation revolution analysis covers the broader strategic shift.

❓ Frequently Asked Questions

Does the Yext study mean law firms should stop investing in AEO and GEO?

No. The Yext study suggests that the assets AEO and GEO target — first-party pages, structured Q&A, citation-rich content — are exactly where AI models look for answers. The finding reframes AEO and GEO around brand-controlled surfaces; it does not invalidate the disciplines. Per Yext (October 9, 2025), 86% of citations came from sources brands can manage, which is an argument for investing in those surfaces with AEO/GEO best practices, not for abandoning them.

Was legal services included in the Yext analysis?

No. Yext tested retail, financial services, healthcare and food service. Legal services were not part of the dataset. The closest analog is healthcare, where directories like WebMD and Vitals drove 52.6% of citations — a finding that suggests legal directories (Justia, Avvo, FindLaw, Martindale-Hubbell) likely matter substantially for AI visibility in the legal vertical, though no public dataset confirms the exact share (Yext Research, October 9, 2025).

What is the difference between AEO, GEO, AIO and SEO?

SEO targets ranked search results in Google and Bing. AEO (Answer Engine Optimization) targets direct-answer surfaces like featured snippets and voice assistants. GEO (Generative Engine Optimization) targets citation and selection inside generative AI responses (ChatGPT, Gemini, Perplexity, Claude, Copilot) and was formalized in research at KDD ’24. AIO (AI Optimization) is an umbrella term covering AEO and GEO plus AI Overviews and embedded AI search. The disciplines layer, they do not compete. See our GEO vs SEO comparison guide for a fuller side-by-side.

How does E-E-A-T fit into AI citation strategy?

E-E-A-T (Experience, Expertise, Authoritativeness, Trustworthiness) is Google’s quality framework from its Search Quality Rater Guidelines. While not a direct ranking signal, the same content properties E-E-A-T values — verifiable expertise, clear authorship, authoritative sourcing, accurate information — appear to be exactly what AI retrieval systems weight when selecting citation sources, particularly in YMYL categories like legal services. For attorneys, see our E-E-A-T safety guardrails for the operational checklist.

If 86% of citations come from owned sources, why do different AI models cite differently?

The 86% figure is the aggregate share of citations from brand-controlled sources across all three models. Within that, the mix varies: Gemini drew 52.1% of its citations from first-party websites, ChatGPT drew 48.7% from listings, and Perplexity diversified across niche directories (Yext Research, October 9, 2025). The practical implication is that a firm cannot optimize for one model and expect visibility in the others; the underlying assets must be strong across all three categories (site, listings, reviews) for AI visibility to be model-agnostic.

How should a law firm measure whether AI search investments are working?

Define a representative query set (20–50 queries tied to practice areas and locations), document a baseline citation profile across ChatGPT, Gemini, Perplexity, Google AI Overviews and Copilot, then re-test monthly. Track citation rate, mention rate, accuracy rate, and competitor comparison. Yext’s own research noted that 64% of marketing leaders said they were unsure how to measure AI search success (Yext, October 9, 2025), which makes establishing a baseline early one of the highest-leverage moves a firm can make.

Find Out Where Your Firm Stands in AI Search

InterCore Technologies has built AI-driven marketing systems for law firms since 2002 — 23+ years before “GEO” had a name. Our team will baseline your firm’s citation profile across ChatGPT, Gemini, Perplexity, Google AI Overviews and Copilot, and build a citation-first plan around the assets you control.

Request Your Free AI Visibility Audit →

InterCore Technologies
📞 (213) 282-3001 · ✉️ sales@intercore.net
📍 13428 Maxella Ave, Marina Del Rey, CA 90292

📚 References

  1. Yext, Inc. (October 9, 2025). “Yext Research: 86% of AI Citations Come from Brand-Managed Sources, Clarifying How Marketers Can Compete in the AI Search Era.” Press release and methodology disclosure. Study analyzed 6.8 million AI citations from 1.6 million queries per model across ChatGPT (OpenAI), Gemini (Google), and Perplexity between July 1 and August 31, 2025. URL: https://www.yext.com/about/news-media/ai-citations-release
  2. Yext, Inc. (October 9, 2025). “AI Doesn’t Rank, It Cites. And 86% of Its Sources Are Brand-Managed.” Yext Blog. URL: https://www.yext.com/blog/2025/10/ai-citations-86-percent-of-sources-are-brand-managed
  3. Yext Research. “AI Citations: User Locations & Query Context.” Full research report. URL: https://www.yext.com/research/article/ai-citations-user-locations-query-context
  4. Goodwin, D. (October 9, 2025). “AI search relies on brand-controlled sources, not Reddit: Report.” Search Engine Land. URL: https://searchengineland.com/ai-search-citations-brand-controlled-sources-463166
  5. Aggarwal, P., Murahari, V., Rajpurohit, T., Kalyan, A., Narasimhan, K., & Deshpande, A. (2024). “GEO: Generative Engine Optimization.” In Proceedings of the 30th ACM SIGKDD Conference on Knowledge Discovery and Data Mining (KDD ’24), Barcelona, Spain, August 25–29, 2024, pp. 5–16. DOI: 10.1145/3637528.3671900
  6. Google Search Central. “Introduction to structured data markup in Google Search.” Developer documentation. URL: https://developers.google.com/search/docs/appearance/structured-data/intro-structured-data
  7. Google. “Search Quality Evaluator Guidelines” (E-E-A-T framework). Most recent public version. URL: https://services.google.com/fh/files/misc/hsw-sqrg.pdf
  8. Wills, B. (2025). LinkedIn post on AEO/GEO/SEO and Yext research interpretation. URL: https://www.linkedin.com/in/benwills/
  9. Yext. “How ChatGPT, Perplexity, Gemini, and Claude Actually Decide What to Cite.” Follow-on Yext Research analysis of 17.2M citations. URL: https://www.yext.com/blog/how-chatgpt-perplexity-gemini-claude-decide-what-to-cite

Conclusion

The Yext study does not declare AEO and GEO dead — it specifies where those disciplines pay off. When 86% of AI citations across 6.8 million data points come from brand-controlled surfaces, the strategic conclusion is not “do less AEO” but “make sure the assets you already own are doing the work AEO, GEO, AIO and SEO are designed to do.” For law firms, that means strong first-party site architecture, disciplined listings management, compliant review programs, and rigorous schema — all layered with the citation-density and content-design tactics that the KDD ’24 GEO research demonstrated can move the needle.

The harder question is measurement. Yext’s own research found that 64% of marketing leaders are unsure how to gauge AI search success. A firm that baselines its citation profile now, defines a target query set, and re-tests monthly will be ahead of competitors who are still arguing about whether AEO is “real.” For a practical starting point, explore our GEO services for law firms, the InterCore legal marketing hub, or schedule a conversation with our team via the contact page.

The citation economy is changing fast, but the underlying lesson from Yext is reassuring for firms that have done the foundational work: control the surface, control the citation.

About the Author

Scott Wiseman is CEO & Founder of InterCore Technologies, an AI-powered legal marketing agency founded in 2002. Scott has 23+ years of AI development experience applied to law firm marketing, and leads InterCore’s GEO, AEO and AI search optimization practice across 35 offices and 24+ U.S. states.

Published: May 12, 2026 · Last updated: May 12, 2026 · Reading time: ~14 minutes
