The digital marketing landscape is undergoing a fundamental transformation. As generative AI tools like ChatGPT, Google's AI Overviews (AIO), AI Mode, Perplexity, and Microsoft Copilot become more prominent, traditional keyword-driven search is giving way to concept-centric strategies powered by artificial intelligence. These AI-powered experiences don't just return results; they anticipate, summarise, and even act on users' needs.

Image Source: https://arxiv.org/html/2311.09735v2

For marketers, this evolution expands the challenge: it's no longer simply about ranking well on Google; it's about being visible wherever decisions are made. Success in this new landscape requires rethinking Search Engine Optimisation (SEO) through the lens of machine comprehension, concepts, and authoritative entities. Our quarterly survey, SearchPulse, is designed to meet this challenge, offering businesses the crucial insights needed to stand out by tracking the user behaviour shift of their target demographic.

Understanding the Fundamental Shift: From Certainty to Probability

Traditional search was deterministic: content came out of the system the same way it went in. SEOs optimised for specific keywords, and the results were mostly predictable. This model was built on classic information retrieval principles, where visibility followed clear rules.

Generative search operates on a different foundation. Powered by Large Language Models (LLMs), these systems are predictive technologies trained on massive datasets to generate language by anticipating the next word, or "token", in a sequence. Unlike traditional search, visibility in this environment is probabilistic. Content is transformed and synthesised in ways marketers cannot fully predict or control, making it impossible to know exactly how, or whether, their content will appear in responses.

Google's AI Mode is a prime example of this complexity. Built on reasoning models, it uses a multi-phase system that maintains persistent user context, tracking prior queries, locations, and behavioural signals. Each interaction is converted into a vector embedding, allowing the system to reason about intent over time rather than treating each query in isolation.
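To make the embedding idea concrete, here is a minimal sketch. It uses a toy bag-of-words vector and cosine similarity; real systems like AI Mode use learned dense embeddings from neural models, and the queries below are invented for illustration.

```python
from collections import Counter
from math import sqrt

def embed(text: str) -> Counter:
    """Toy 'embedding': a bag-of-words term-frequency vector.
    Production systems use learned dense vectors instead."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

# Embedding each interaction lets the system relate a follow-up
# to earlier queries in the session instead of treating it in isolation.
session = [
    "best running shoes for flat feet",
    "are stability running shoes good for overpronation",
]
follow_up = "which stability running shoes should I buy"
scores = [cosine(embed(q), embed(follow_up)) for q in session]
```

Even with this crude similarity measure, the second session query scores higher against the follow-up because they share more intent-bearing terms, which is the behaviour persistent context is designed to exploit.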

The New Approach: Relevance Engineering and Zero-Click Success

This technological shift fundamentally changes both user behaviour and success metrics. The user experience evolves from Google offering options to AI providing recommendations. The AI becomes a decision support tool, guiding users toward answers rather than simply presenting choices.

This transformation necessitates abandoning the click as the sole metric of success. AI Overviews and AI Mode actively promote zero-click behaviour: users get their answers without leaving the AI interface. In this environment, as Mike King wrote, "being cited matters more than being clicked". The goal shifts from driving traffic to being chosen as a source by the LLM itself.

Mike King suggests this represents "what comes after SEO", proposing a new discipline called Relevance Engineering (r19g). This approach focuses on "engineering relevance to penetrate systems of reasoning across an array of queries" by optimising content to maintain relevance across a matrix of synthetic queries.

Understanding how AI Mode selects content is crucial. A key mechanism is query fan-out, where the LLM reformulates the original query into dozens or hundreds of related, implicit, and personalised synthetic subqueries. This makes ranking in AI Mode a "complex matrixed event". Content selection depends on how well a document, or even an individual passage, aligns semantically with one or more of these hidden fan-out queries.

Tools like AlsoAsked provide a visualisation of how an entity expands into semantically related long-tail queries.

This is where Entity SEO comes in: moving beyond keyword strings to focus on actual entities. An entity is a clearly defined 'thing' recognised in a knowledge graph: a person, place, thing, idea, brand, or product line. These entities don't exist in isolation; they relate to one another, forming networks of connections that search engines can understand and interpret.

When content thoroughly covers a topic by addressing related entities and their relationships, it signals authority to search engines. Entity SEO involves several core techniques for generating machine-readable, semantically rich content:

1. Semantic Content Depth

Develop content that provides genuine depth and context. When referencing an entity, provide enough background to help search engines confidently identify what's being discussed. This includes explaining "the reasons behind it, how it works, and what makes it important". Surface-level coverage no longer suffices; machines need comprehensive context to understand relevance.

2. Structuring Content for Agents

LLMs favour content that is well-structured and semantically rich. Content needs to be modular and extractable, utilising scannable formats such as lists, bullet points, and headings. Clear question-answer formatting can significantly increase the chances of AIO inclusion. The goal is to make content "Composition-Friendly": easy for AI systems to parse, extract, and recombine.
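As an illustration of this structure, a page fragment might look like the following. This is a hypothetical example, invented for this article, showing the question-answer pattern with a direct answer and scannable supporting points.

```html
<!-- Hypothetical example of a modular, extractable Q&A block.
     A clear question heading, a direct answer, and a scannable
     list make the passage easy for an AI system to parse,
     extract, and recombine. -->
<section>
  <h2>Are stability running shoes good for overpronation?</h2>
  <p>Yes. Stability shoes add medial support that limits the
     inward roll of the foot that defines overpronation.</p>
  <ul>
    <li>Firmer foam on the inner edge of the midsole</li>
    <li>A wider base for a more stable landing</li>
  </ul>
</section>
```

Note that the answer is complete within the section: an AI system lifting just this fragment still has the question, the answer, and the supporting detail.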

3. Passage-Level Relevance

Here's a critical insight: retrieval happens at the passage level, not the page level. AI systems extract specific segments that answer particular subqueries generated during query fan-out. Each passage must be "semantically complete in isolation": capable of answering or contextualising a specific question without requiring surrounding context. Content must be clear, concise, and free of redundancy to survive LLM performance evaluation.
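Passage-level selection against fan-out subqueries can be sketched as follows. This is a minimal illustration that uses simple word overlap as a stand-in for semantic embedding similarity; the passages and synthetic subqueries are invented for the example.

```python
def jaccard(a: str, b: str) -> float:
    """Word-overlap similarity -- a crude stand-in for the
    embedding similarity a real retrieval system would use."""
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / len(wa | wb) if wa | wb else 0.0

def best_passage(passages: list[str], subquery: str) -> str:
    """Pick the passage that best matches one synthetic subquery."""
    return max(passages, key=lambda p: jaccard(p, subquery))

# A page is retrieved passage by passage: each fan-out subquery
# may select a different self-contained chunk of the same document.
passages = [
    "Trail running shoes use aggressive lugs for grip on loose terrain.",
    "Road running shoes prioritise cushioning for repetitive impact on asphalt.",
]
fan_out = [
    "what gives trail shoes grip",
    "why do road shoes need cushioning",
]
selected = [best_passage(passages, q) for q in fan_out]
```

The point of the sketch is that no single "page score" exists: each subquery independently pulls out whichever passage stands alone best, which is why every passage needs to be semantically complete in isolation.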

4. Content Credibility and Trust

Despite technological changes, relevance and trust remain foundational. AI Search favours brands that demonstrate E-E-A-T (Expertise, Experience, Authoritativeness, Trustworthiness), and the available citation data supports this priority.

The lesson is clear: factual, attributable, and verifiable content is necessary to be worthy of citation.

5. Schema Markup: Strategic, Not Essential

Structured data via schema markup helps signal entities to search engines and remains important for organic search. However, its role in generative AI citation may be overstated. While 81% of cited web pages included schema markup in a study done by AccuraCast, the research suggests schema "seems to matter to a very limited extent as a ranking, visibility, or citation factor".

Person schema is notably popular (58.9% of cited sources), which aligns with the emphasis on authoritative authors and quality sources. That said, simple semantic HTML markup (e.g., `<ul>` and `<li>` elements for lists) can achieve similar results to specific list schema types. Use schema strategically, but don't expect it alone to drive AI citations.
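For reference, minimal Article and Person markup might look like the fragment below. This is an illustrative example with invented values, not markup drawn from the AccuraCast study.

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example article title",
  "author": {
    "@type": "Person",
    "name": "Jane Example",
    "jobTitle": "SEO Consultant"
  }
}
</script>
```

Pairing Article with a named Person author is the pattern that aligns with the emphasis on authoritative authorship noted above.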

 

Expanding Beyond Your Website: The Distributed Content Footprint

LLMs don't limit themselves to brand websites. They ingest content from across the public web, which means your content strategy must extend far beyond your owned properties.

User behaviour is already reflecting this reality. Younger generations increasingly bypass traditional search engines, turning to platforms like TikTok and Reddit for inspiration and discovery. As we identified in our SearchPulse survey: "People aged 18-44 use an average of five search platforms, but those aged 45+ only use two or three".

Reddit commands strong loyalty among Millennials and Gen Z as a trusted resource for recommendations based on social proof. Social proof is a human behaviour where we're more inclined to follow others' actions, particularly when we're uncertain: we assume their actions are correct and conform to them. TikTok's Nearby feed and experimental user-generated reviews support local discovery in novel ways. Meanwhile, Perplexity AI has emerged as a major alternative search platform, seeing remarkable growth in adoption.

To reach these diverse audiences and appear in these varied contexts, marketers must tailor content strategically to each platform.

Conclusion: Coordination, Not Just Optimisation

The shift from keywords to concepts revolutionises how users discover brands and make decisions. Organic Search is being reframed as a visibility and trust channel shaped by Large Language Models, shifting the traditional balance between performance marketing and brand building.

Success in this new era requires a shift in mindset, moving from individual optimisation tactics to a more coordinated strategy across queries, content formats, and embeddings. Marketers need to focus on spreading their content beyond their own sites, optimising for user intent, and ensuring their platforms are ready for AI systems, while continuing to build trust, relevance, and authority.

The future of search isn't about beating algorithms; it's about earning the trust of intelligent systems that increasingly act as gatekeepers to consumer attention. Those who adapt will find new opportunities for visibility; those who cling to keyword-era tactics risk becoming invisible in an AI-mediated world.

Are you ready to rethink your search strategy for the future of search? Get in touch, and we’ll be happy to help.

Contact Us

MEET THE AUTHOR.

YORDAN DIMITROV

Yordan drives data-led SEO strategies at Reflect Digital, leveraging his previous experience across sectors such as leisure, e-commerce, and automotive to enhance visibility and ROI for clients. He is passionate about delivering measurable growth and combines technical expertise with creative insight to identify new ranking and traffic opportunities. Yordan’s aim is to ensure seamless project delivery by managing timelines, coordinating with teams across departments, and staying at the forefront of SEO trends to achieve exceptional results for clients.

More about Yordan