
Google AI Overviews: SEO Impact and Strategies

Last updated on 1/4/2026


SEO in the Era of Google AI Overviews: Understand the Shift and Protect Your Visibility


Introduction: putting AI Overviews into the wider context of the AI search engine

 

If you have already framed the broader topic of the AI search engine, the next step is to get specific: what Google AI Overviews are, and what they mean operationally for your SEO.

This article focuses on how they behave in the SERP, the measurable impacts (CTR, clicks, conversions), the selection criteria behind cited sources, and practical optimisation tactics, without rehashing the fundamentals already covered elsewhere.

 

Why this is becoming a B2B priority: fewer "easy" clicks, tougher competition for being cited

 

AI Overviews can take up significant above-the-fold space and may satisfy intent without a site visit, increasing pressure on "easy" organic clicks (definitions, basic explanations, checklists). Some sources explicitly describe the risk of "not a single click" when the answer is considered complete within the SERP (Digitaleo: https://blog.digitaleo.fr/google-ai-overview/).

At the same time, visibility is no longer only about ranking: your ability to be used and cited becomes a new competitive battleground. This is precisely the territory of GEO (visibility in generative AI engines). It does not replace SEO; it extends it into citability and attribution.

 

How Google AI Overviews Work in the SERP


Triggering and composition: generated summary, cited sources, links and intent-driven variations

 

AI Overviews (previously referred to as SGE in some markets and communications) display an AI-generated summary at the top of search results, with links to pages that support the response (Abondance: https://www.abondance.com/20250425-1109940-google-ai-overviews-une-menace-pour-le-trafic-seo.html). Google presents them as a way to grasp the essentials quickly, then explore websites to go deeper (Google Search Central: https://developers.google.com/search/docs/appearance/ai-features).

At a systems level, Google indicates that these features may launch multiple related searches across subtopics (a "query fan-out" approach) to build a more robust answer and surface a potentially broader set of links (Google Search Central: https://developers.google.com/search/docs/appearance/ai-features).

In practice, the AI Overview combines information from different sources (e.g., Knowledge Graph, databases, Google products, websites, UGC), then produces a concise snapshot (Digitaleo: https://blog.digitaleo.fr/google-ai-overview/). The experience can evolve towards conversational answers and follow-up questions without leaving the SERP, which can increase time spent "inside Google" (Digitaleo: https://blog.digitaleo.fr/google-ai-overview/).

 

What Google says officially: SEO guidance for AI Overviews, limitations, errors and key watch-outs

 

Google is explicit on one point: standard SEO best practice remains relevant, and there is no additional technical requirement or special "AI markup" needed to appear in these features (Google Search Central: https://developers.google.com/search/docs/appearance/ai-features). Your pages mainly need to be indexed and eligible to show snippets, whilst meeting core technical requirements for Search.

Google also reiterates there is no guarantee: even if you are compliant, it does not promise crawling, indexing or visibility. For control, the usual mechanisms (nosnippet, data-nosnippet, max-snippet, noindex) remain the reference for limiting what can be used in Search, including in AI features (Google Search Central: https://developers.google.com/search/docs/appearance/ai-features).

Finally, Google notes that Search Console counts these clicks within overall "Web" search traffic, without necessarily offering a universal breakdown specific to AI Overviews. Measurement therefore often requires proxies (segmentation, query sets, before/after comparisons, SERP checks and log analysis) rather than relying on a single "AI Overviews" filter (Google Search Central: https://developers.google.com/search/docs/appearance/ai-features).
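
As one illustration of a proxy-based approach, a before/after comparison can be scripted over two Search Console exports. The sketch below assumes hypothetical rows with query, clicks, impressions and position fields; it is a minimal example under those assumptions, not a turnkey tool:

```python
def ctr(clicks, impressions):
    """Click-through rate, guarded against zero impressions."""
    return clicks / impressions if impressions else 0.0

def ctr_delta_at_stable_position(before, after, max_pos_shift=1.0):
    """Compare CTR per query between two export periods, keeping only
    queries whose average position stayed roughly stable, so that a CTR
    drop points at the SERP layout rather than at a ranking loss."""
    after_by_query = {row["query"]: row for row in after}
    report = []
    for row in before:
        other = after_by_query.get(row["query"])
        if other is None:
            continue  # query disappeared from the export: handle separately
        if abs(row["position"] - other["position"]) > max_pos_shift:
            continue  # ranking moved: not a clean layout signal
        report.append({
            "query": row["query"],
            "ctr_before": ctr(row["clicks"], row["impressions"]),
            "ctr_after": ctr(other["clicks"], other["impressions"]),
        })
    return report

# Hypothetical data for one informational query across two periods.
before = [{"query": "what is geo", "clicks": 73, "impressions": 1000, "position": 1.2}]
after = [{"query": "what is geo", "clicks": 26, "impressions": 1000, "position": 1.3}]
print(ctr_delta_at_stable_position(before, after))
```

Queries surfaced this way still need a manual SERP check to confirm that an AI Overview, rather than some other layout change, explains the drop.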

 

From classic Search to generative answers: where visibility is won (GEO) without undermining your SEO

 

SEO still provides the foundation (crawl, indexation, relevance, performance, architecture), but the goal broadens: increase the likelihood of being a source that a generative engine can reliably use. In other words, you optimise for both access (SEO) and extractability/verifiability (GEO).

The right stance is to avoid two extremes: (1) rewriting everything "for AI" and losing user intent, or (2) sticking to a pure ranking mindset and ignoring how attention is being redistributed. The aim is to make content both more useful to people and easier to cite (structure, evidence, neutrality, clarity).

To understand the SGE → AI Overviews transition in more depth without repeating it here, you can read Google SGE SEO.

 

Measurable SEO Impact: how AI Overviews affect organic clicks, CTR and where value moves


How AI Overviews affect SEO in practice: attention, journeys and query behaviour

 

The core mechanism is straightforward: if the SERP answers the question, fewer people click. AI Overviews take up space, pull attention, and sometimes offer a conversational path that delays or replaces the click (Digitaleo: https://blog.digitaleo.fr/google-ai-overview/).

A second mechanism: an AI Overview does not extract "one page"; it extracts reusable elements (facts, definitions, steps). This favours content that is structured into independent, precise and verifiable blocks, rather than a single long, uniform narrative (Vu du Web: https://www.vu-du-web.com/seo/referencement-ai-overview-google/).

 

Understanding "zero-click" effects without overinterpreting: what drops, what shifts, what remains clickable

 

Market-level figures suggest a large share of searches end without a click, and that the presence of an AI Overview can sharply reduce CTR for the number-one result (see GEO statistics). That does not mean "SEO is dead". It means value shifts towards visibility, trust and more qualified traffic.

On the research side, Ahrefs observed (April 2025) an average 34.5% drop in CTR for the number-one position across 300,000 informational queries when AI Overviews were present (reported by Abondance: https://www.abondance.com/20250425-1109940-google-ai-overviews-une-menace-pour-le-trafic-seo.html). The same source also notes a year-on-year CTR change: 7.3% (March 2024) → 2.6% (March 2025) for queries with AI Overviews, versus 5.6% → 3.1% without AI Overviews.

Some analyses also suggest redistribution: pages ranking lower (positions 3 to 10) can sometimes gain traffic if they are included as cited sources, whilst the top three can lose out when the AI Overview captures attention (Terakeet Engineering, February 2025, cited by Abondance: https://www.abondance.com/20250425-1109940-google-ai-overviews-une-menace-pour-le-trafic-seo.html).

 

Read it by page type: information, consideration, comparison and decision (specific B2B implications)

 

In B2B, impact varies strongly by intent. Pure informational content (define, explain, list) is most exposed, because it is easy to compress into a synthetic answer (Seomix: https://www.seomix.fr/ai-overviews-impact-seo/).

By contrast, consideration and decision pages (detailed comparisons, methodologies, studies, edge cases, implementation constraints) tend to be more resilient because users need proof, context and a decision framework. This is also where citations can add disproportionate value by acting as "validation" in the buying journey.

| B2B page type | "Zero-click" risk | Top optimisation angle |
| --- | --- | --- |
| Definition / beginner guide | High | Chunking + evidence + routes to advanced resources |
| Comparison / shortlist | Medium | Tables, measurable criteria, assumptions, limitations |
| Study / benchmark | Low to medium | Original data, methodology, sources, downloadable asset |
| Solution / product page | Variable | Proof, use cases, FAQ, terms, reassurance |

 

Impact on rankings: what changes (and what does not) between classic rankings and being cited in an AI Overview

 

Organic rankings remain important, because AI Overviews rely on eligible, indexed web sources (Google Search Central: https://developers.google.com/search/docs/appearance/ai-features). A weak SEO foundation therefore mechanically reduces your chances of being selected.

But "winning" is no longer just position one. According to data shared in an Incremys GEO guide, "99% of AI Overviews cite pages from the organic top 10". That implies that reaching the top 10 on strategic queries, then working on citability, can be more profitable than focusing solely on gaining a single position.

 

Selection Criteria: Why Some Content Gets Cited and Other Pages Are Ignored


Quality signals: usefulness, accuracy, freshness, structure and entity consistency

 

Content that gets cited is typically easier to extract reliably: clear answers, precise wording, factual elements and well-structured sections. The goal is not to be "short", but to make each block understandable and reusable on its own (Vu du Web: https://www.vu-du-web.com/seo/referencement-ai-overview-google/).

Google also recommends practical fundamentals: keep important content available in text, deliver a strong page experience, and add high-quality images/videos where relevant (Google Search Central: https://developers.google.com/search/docs/appearance/ai-features). Freshness matters too, especially where information changes (pricing, terms, versions, dates).

 

Trust signals: E-E-A-T, evidence, editorial transparency and traceable claims

 

The more easily a claim can be cross-checked, the more likely it is to be reused. Citability-oriented recommendations emphasise the trio of information + evidence + source, as well as neutrality (avoiding overly marketing-driven or subjective phrasing) (Vu du Web: https://www.vu-du-web.com/seo/referencement-ai-overview-google/).

In the same spirit, strong E-E-A-T (experience, expertise, authoritativeness, trustworthiness) increases the likelihood of being used as a source (Digitaleo: https://blog.digitaleo.fr/google-ai-overview/). In B2B, editorial transparency (named author, update date, methodology) becomes a safeguard against approximation.

 

Technical prerequisites: accessibility, indexability, rendering, performance and structured data

 

No crawling and indexation means no citations. Google encourages checking robots.txt, hosting/CDN restrictions, internal linking accessibility and correct rendering for Googlebot (Google Search Central: https://developers.google.com/search/docs/appearance/ai-features).

For structured data, Google states there is no special "AI Overviews" markup requirement, but advises ensuring structured data aligns with visible page content (Google Search Central: https://developers.google.com/search/docs/appearance/ai-features). In practice, useful schemas (Article, FAQPage, HowTo, Organization, etc.) help clarify entities, attribution and extractable elements for machines, without offering automatic guarantees.
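
As an illustration of keeping markup aligned with visible content, FAQPage JSON-LD can be generated from the same Q&A pairs that are actually rendered on the page, which makes drift between markup and content harder. The helper below is a hypothetical sketch using standard schema.org types; it carries no ranking guarantee:

```python
import json

def faq_jsonld(qa_pairs):
    """Build FAQPage structured data from the Q&A pairs visible on
    the page, so markup and content stay aligned by construction."""
    return {
        "@context": "https://schema.org",
        "@type": "FAQPage",
        "mainEntity": [
            {
                "@type": "Question",
                "name": question,
                "acceptedAnswer": {"@type": "Answer", "text": answer},
            }
            for question, answer in qa_pairs
        ],
    }

# Hypothetical Q&A pair, reused both in the HTML template and here.
pairs = [("What are AI Overviews?",
          "AI-generated summaries shown at the top of the SERP, with links to sources.")]
print(json.dumps(faq_jsonld(pairs), indent=2))
```

Rendering the same `pairs` into the visible HTML is what keeps the "structured data matches what is shown" requirement satisfied as content evolves.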

 

Editorial criteria with high impact: verifiable answers, consensus, nuance and handling YMYL topics

 

AI Overviews tend to appear most often on queries that are easy to summarise: simple, informational searches where a synthesis can satisfy the user (Abondance: https://www.abondance.com/20250425-1109940-google-ai-overviews-une-menace-pour-le-trafic-seo.html). On more complex topics, there is often more room for clicks because the overview cannot cover the full need on its own.

For sensitive topics (YMYL), nuance and framing are critical: state assumptions, provide context, avoid shortcuts and point to official sources when needed. Generative systems tend to avoid ambiguity they cannot resolve, and are more likely to select passages that are consensus-based, sourced and non-promotional.

 

Optimising for Google AI Overviews: Content Optimisation for Generative Answers


Structure for reuse: direct answers, short sections, definitions, lists and tables

 

Your aim is to make each section independently citable. Start sections with a complete answer sentence, then expand with verifiable elements, and finish with an action or a forward link (method, checklist, example).

  • Break content into short blocks (one idea per paragraph) with explicit headings.
  • Turn enumerations into lists and comparisons into tables.
  • Add concrete anchors (dates, scope, definitions, prerequisites, limitations).

 

Content formats that work for generative answers: Q&A, step-by-step, comparisons and decision-driven glossaries

 

Formats that are easy to extract tend to win: Q&A, procedures, decision grids and glossaries. They naturally match conversational queries and reduce interpretation effort.

  1. Q&A: question phrased like a real search, answer in 2–3 sentences, then detail.
  2. How-to: numbered steps, validation criteria and common pitfalls.
  3. Comparison: measurable criteria (no superlatives), assumptions and use cases.
  4. Glossary: concise definition + context + example + internal links to related concepts.

 

Boost credibility: quotes, sources, methodology, author and "proof" pages

 

Every important statement should be defensible. If you use a statistic, include its source, date and context; otherwise you weaken verifiability and, by extension, citability.

A useful benchmark for decision-making: an Ahrefs study (April 2025) observed an average 34.5% drop in CTR for the number-one position on informational queries when an AI Overview is present (Abondance: https://www.abondance.com/20250425-1109940-google-ai-overviews-une-menace-pour-le-trafic-seo.html). At a macro level, data published by Incremys indicates that 60% of searches end without a click and that the number-one position can fall to a 2.6% CTR when an AI Overview is present (Squid Impact, 2025, via GEO statistics).

To widen the lens on generative engines beyond Google (usage distribution, behaviour patterns, trends), you can also consult the LLM statistics.

 

Useful semantic coverage: complementary angles, conversational long-tail and content consolidation

 

To reduce the "summary is enough" effect, invest in angles an AI Overview cannot compress without losing value: decision frameworks, real-world constraints, matrices, experience-based learnings and frequent mistakes. This both increases click propensity and improves traffic quality.

Also work on conversational long-tail: follow-up questions, objections, edge cases and persona-based variants. The more your content anticipates sub-questions, the more entry points you create without publishing redundant pages.

 

On-page without over-optimising: headings, snippets, internal linking and templates

 

When an AI Overview is present, the classic snippet can matter less, but it still influences clicks when users are undecided. Optimise titles for clarity (benefit + scope), and use introductions to highlight unique value (what a generic summary cannot provide).

Internal linking becomes a consolidation lever: connect pillar pages, proof pages, glossaries and comparisons. The goal is twofold: help Googlebot navigate and guide the user who clicks after reading the overview.

 

Off-site and brand signals: popularity, mentions, consistency and multi-channel alignment

 

Generative systems also lean on external signals (mentions, reputation, reference sources). Without resorting to third-party tools, the principle is simple: the more your brand and experts are cited in credible environments, the more your content becomes attributable and reassuring.

If your visibility strategy includes answer-first engines beyond Google, you can also explore Perplexity SEO.

 

Measurement and Steering: Track Impact and Make Faster Decisions


Build a before/after baseline: segments, queries, pages and intent

 

Without a baseline, you will never separate "AI Overview impact" from seasonality or competitive pressure. Keep segmentation simple: informational vs transactional queries, top impacted pages, and query groups where you observe an AI Overview (SERP checks).

| Segment | Examples | Why it matters |
| --- | --- | --- |
| Intent | information / consideration / decision | Compare different CTR dynamics |
| Risk level | easy-to-summarise vs complex queries | Prioritise which pages to harden first |
| Top pages | top 20 B2B SEO pages | See where value really shifts |
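
A first pass at this segmentation can start from simple keyword heuristics. The buckets and trigger words below are illustrative assumptions meant to seed a manual review, not a production classifier:

```python
# Hypothetical trigger words; adapt to your market and language.
INFORMATIONAL = ("what", "how", "why", "definition", "guide", "examples")
TRANSACTIONAL = ("pricing", "demo", "buy", "quote", "vendor", "software")

def classify_intent(query):
    """Rough keyword heuristic to bucket queries by intent.
    Anything not clearly informational or transactional falls
    into a middle "consideration" bucket for human review."""
    q = query.lower()
    if any(token in q for token in TRANSACTIONAL):
        return "decision"
    if any(q.startswith(token) or f" {token} " in f" {q} " for token in INFORMATIONAL):
        return "information"
    return "consideration"

queries = ["what is an ai overview", "seo platform pricing", "geo vs seo"]
print({q: classify_intent(q) for q in queries})
```

The point of the heuristic is speed: it lets you compare CTR dynamics per intent bucket immediately, then refine the taxonomy where the buckets look wrong.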

 

How to detect and monitor AI Overviews using Google Search Console, Google Analytics and log files

 

Google states that traffic from AI features is included in Search Console (search type: "Web") (Google Search Central: https://developers.google.com/search/docs/appearance/ai-features). In practice, you need to triangulate using trends, segments and observation.

  • Google Search Console: track impressions, clicks, CTR and average position by query and page, then isolate clusters where CTR drops whilst positions remain stable.
  • Google Analytics: measure post-click quality (engagement, depth, conversions), because Google claims clicks from SERPs with AI Overviews are "higher quality" (without figures) (Google Search Central: https://developers.google.com/search/docs/appearance/ai-features).
  • Logs: monitor crawling (Googlebot), revisit frequency for strategic pages, and crawl changes after updates (useful when you modify structure, markup or snippet controls).
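
For the log-file bullet, a minimal monitoring sketch might look like the following. The regex assumes an Apache/Nginx combined log format (field layouts vary by server), and real monitoring should verify Googlebot by reverse DNS, since user agents can be spoofed:

```python
import re
from collections import Counter

# Combined-log-format line; adjust the pattern to your server's format.
LOG_RE = re.compile(
    r'\S+ \S+ \S+ \[[^\]]+\] "GET (?P<path>\S+) [^"]+" \d+ \d+ "[^"]*" "(?P<ua>[^"]*)"'
)

def googlebot_hits(lines):
    """Count Googlebot requests per URL path from access-log lines,
    a proxy for how often strategic pages are being recrawled."""
    counts = Counter()
    for line in lines:
        match = LOG_RE.match(line)
        if match and "Googlebot" in match.group("ua"):
            counts[match.group("path")] += 1
    return counts

# Two hypothetical log lines: one Googlebot hit, one regular visitor.
sample = [
    '66.249.66.1 - - [01/04/2026:10:00:00 +0000] "GET /guide-ai-overviews HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '203.0.113.9 - - [01/04/2026:10:01:00 +0000] "GET /guide-ai-overviews HTTP/1.1" 200 5120 "-" "Mozilla/5.0"',
]
print(googlebot_hits(sample))
```

Tracking these counts per strategic page over time shows whether an update (content, structure, snippet controls) actually triggered a recrawl.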

 

Build a business-first dashboard: visibility → clicks → conversion → pipeline

 

Good reporting can no longer stop at rankings. Manage a short chain from attention to value: visibility (impressions) → access (CTR/clicks) → engagement → conversion → pipeline.

  • Visibility: GSC impressions, share of strategic queries in the top 10.
  • Access: clicks and CTR by intent, pages with falling CTR despite stable rankings.
  • Value: conversion rate, qualified leads, pipeline contribution (GA).

 

Set the cadence: weekly (alerts), monthly (trade-offs), quarterly (recalibration)

 

Move fast without drowning in noise. Weekly for breaks in trend (CTR/clicks), monthly for trade-offs (refresh vs new content), quarterly for strategic recalibration (intent mix, formats, consolidation).

  1. Weekly: alerts for CTR drops at stable positions, high-impression pages losing clicks.
  2. Monthly: optimisation plan (structure, evidence, Q&A, tables) and prioritisation.
  3. Quarterly: cluster review, cannibalisation clean-up, proof-page strategy.

 

Governance and Trade-offs: Turn Uncertainty into a Plan


Prioritise workstreams: scenarios, risks, dependencies and quick wins

 

Treat this like a risk portfolio. Start with pages that combine (1) high impressions, (2) easily summarised intent, and (3) falling CTR, because they often finance a large share of organic acquisition.

  • Quick wins: reformat into chunks, add tables, strengthen evidence and methodology.
  • Structural work: information architecture, internal linking, proof pages, update governance.
  • Dependencies: indexation, performance, CMS templates, consistent structured data.

 

Editorial positioning: where to invest (refresh, new content, pillar pages) to maximise citability

 

Invest where an AI Overview cannot do the work for you: comparisons with measurable criteria, choice frameworks, methodologies, segmented use cases and proof pages (process, security, compliance, SLA, etc.). For high-risk guides, prioritise updates and deepening over duplicating near-identical pages.

A practical principle: content backed by data and concrete examples is more likely to be cited because it is easier to verify and attribute. That requires editorial discipline: source every claim, date it, state its scope, and keep it maintained.

 

Are AI Overviews "killing SEO"? Myth, reality and what to do when your CTR drops

 

Saying AI Overviews "kill" SEO is an oversimplification. Available research mainly indicates a redistribution of attention, particularly on informational queries (Ahrefs via Abondance), alongside rising zero-click behaviour (see GEO statistics).

Your action plan should target two outcomes: (1) protect clicks where they still matter, and (2) win citability where the SERP summarises. This becomes an ongoing steering exercise, not a one-off project.

 

What should you do if a page ranks well but no longer gets clicks because of AI Overviews?

 

Start by diagnosing "stable rankings, falling CTR" in Search Console, then audit the SERP for the affected queries (AI Overview present, type of answer, depth). Next, update the page to create a reason to click beyond the summary: quantified comparison, decision matrix, downloadable checklist, examples and limitations.

Then reposition the page within your internal linking: strengthen routes to a proof page or a decision-oriented asset that converts. The goal is not to recover 100% of lost CTR, but to protect conversion volume.

 

How do you avoid cannibalisation between similar pages when citability becomes the objective?

 

Give each page a single role: one core intent plus one distinct angle. Consolidate repeated content, and turn secondary pages into satellites (use cases, personas, objections) that link back to the pillar page.

Add differentiated Q&A blocks (not duplicated), page-specific examples and different proof points (a statistic, a source, a method). Generative systems find it easier to select clearly differentiated passages than near-identical pages.

 

Scaling SEO & GEO Monitoring with Incremys (single paragraph)


Connect Search Console and Google Analytics via API, then prioritise SEO/GEO actions with a 360° audit without stacking tools

 

For teams that want to structure this work without piling on extra solutions, Incremys centralises Google Search Console and Google Analytics via API and helps you prioritise via a 360° SEO & GEO audit (technical, content, opportunities, tracking). The value is primarily organisational: one workflow to spot CTR drops, identify pages exposed to AI Overviews, and plan citability-led optimisations at multi-site scale.

 

FAQ: Google AI Overviews and SEO


What are Google AI Overviews and how do they work in search results?

 

AI Overviews are AI-generated summaries shown at the top of the SERP to answer a query quickly, with links to source pages (Abondance: https://www.abondance.com/20250425-1109940-google-ai-overviews-une-menace-pour-le-trafic-seo.html). Google says they appear when its systems believe they add value compared with classic Search, and that they may run multiple "fan-out" searches across subtopics to build the response (Google Search Central: https://developers.google.com/search/docs/appearance/ai-features).

 

How do AI Overviews affect SEO day to day (CTR, clicks and conversions)?

 

The most visible day-to-day effect is CTR decline on some queries, especially informational ones, because users can get an answer without clicking. Ahrefs observed an average 34.5% CTR drop for the number-one position when an AI Overview is present across 300,000 informational queries (April 2025, reported by Abondance: https://www.abondance.com/20250425-1109940-google-ai-overviews-une-menace-pour-le-trafic-seo.html).

In B2B, conversions can remain stable if you compensate through (1) higher-quality clicks, (2) more differentiated decision-stage content, and (3) stronger capture of high-intent visitors. Google claims clicks from SERPs with AI Overviews are "higher quality" (without figures) (Google Search Central: https://developers.google.com/search/docs/appearance/ai-features).

 

How are AI Overviews changing SEO and the distribution of clicks?

 

They shift value from a "list of links → click" model to an "answer → selection of sources" model. That reduces some organic clicks and redistributes attention towards cited websites, sometimes benefiting lower-ranked pages that are included as sources (Terakeet cited by Abondance: https://www.abondance.com/20250425-1109940-google-ai-overviews-une-menace-pour-le-trafic-seo.html).

 

Which query types most often trigger AI Overviews in France?

 

Available sources mainly describe frequent triggering on simple, easily summarised queries (Abondance: https://www.abondance.com/20250425-1109940-google-ai-overviews-une-menace-pour-le-trafic-seo.html). Regarding rollout, several articles describe a gradual arrival in Europe and note that availability in France has not always been confirmed depending on publication dates, with preparation recommended (Abondance and Seomix: https://www.seomix.fr/ai-overviews-impact-seo/).

 

Which SEO guidelines for AI Overviews should you follow to stay compliant?

 

Google recommends sticking to core SEO fundamentals: allow crawling (robots.txt, CDN), make content accessible via internal links, provide a strong page experience, ensure key content is available as text, use high-quality media where relevant, and keep structured data consistent with what is visible (Google Search Central: https://developers.google.com/search/docs/appearance/ai-features).

There is no mandatory markup specific to AI Overviews. To limit what is shown, use nosnippet, data-nosnippet, max-snippet or noindex, and validate via URL Inspection that Googlebot can see the directive (Google Search Central: https://developers.google.com/search/docs/appearance/ai-features).

 

How do you measure the impact of AI Overviews on traffic, CTR and conversions?

 

First measure changes in CTR/clicks at comparable positions in Google Search Console, because clicks from AI features are included in overall Web search traffic (Google Search Central: https://developers.google.com/search/docs/appearance/ai-features). Then connect those changes to session quality and conversions in Google Analytics (conversion rate, pipeline contribution, engagement).

The strongest signal is often "CTR falling whilst rankings stay stable", especially on informational queries. Complement this with manual SERP checks on your most critical queries to confirm AI Overview presence.

 

How can you detect and monitor AI Overviews using Google Search Console and log files?

 

Search Console does not always provide a universal dedicated marker, so you need to work via correlation: query clusters, shifts in impressions/clicks/CTR and impacted pages (Google Search Central: https://developers.google.com/search/docs/appearance/ai-features). In logs, monitor Googlebot crawl activity on strategic pages and ensure updates (content, structure, structured data) are being recrawled.

 

How do you optimise content to be cited in AI Overviews (content optimisation for generative answers)?

 

Write for extraction: short sections, one idea per block, definitions and direct answers in the first sentence, followed by evidence and sources. Turn comparisons into tables and procedures into numbered steps, because these formats are easier to cite (Vu du Web: https://www.vu-du-web.com/seo/referencement-ai-overview-google/).

Then reinforce reliability: author details, last-updated date, methodology, and links to recognised sources. Finally, consolidate internal linking to guide users towards decision-stage pages that convert.

 

Which trust and authority signals increase the likelihood of being included in AI Overviews?

 

The most actionable signals are editorial and structural: E-E-A-T, neutrality, precision, evidence and traceable claims (Digitaleo: https://blog.digitaleo.fr/google-ai-overview/). Citability-focused recommendations also highlight the ability to provide information that is cross-checkable and unambiguous (Vu du Web: https://www.vu-du-web.com/seo/referencement-ai-overview-google/).

Technically, ensure baseline eligibility (indexation, snippet capability) and that structured data aligns with visible content (Google Search Central: https://developers.google.com/search/docs/appearance/ai-features).

 

How do you audit a website's technical structure to increase the chances of being used in AI Overviews?

 

  • Crawlability: robots.txt, CDN/hosting blocks, HTTP status codes, internal linking.
  • Indexability: accidental noindex, consistent canonicals, duplication, pagination.
  • Rendering: main content available in HTML (not only via scripts), Googlebot inspection.
  • Performance and UX: strong page experience, especially on mobile.
  • Structured data: validity and consistency with visible content (Article/BlogPosting, FAQPage, Organization, etc.).

This aligns with Google's official recommendations for maximising eligibility for AI features, without any "special" optimisation (Google Search Central: https://developers.google.com/search/docs/appearance/ai-features).
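
For the crawlability item, Python's standard library can check whether a given crawler is allowed to fetch a URL under your robots.txt rules. The rules and domain below are placeholders for illustration:

```python
from urllib.robotparser import RobotFileParser

# Parse a robots.txt body directly; in production you would fetch
# it from your own domain (example.com is a placeholder here).
robots_txt = """
User-agent: *
Disallow: /internal/

User-agent: Googlebot
Allow: /
""".splitlines()

rp = RobotFileParser()
rp.parse(robots_txt)

# The Googlebot group overrides the generic group for Googlebot.
for url in ("https://example.com/guide-ai-overviews",
            "https://example.com/internal/draft"):
    print(url, rp.can_fetch("Googlebot", url))
```

Running such a check across your strategic URLs (plus a Googlebot render test in Search Console's URL Inspection) catches accidental blocks before they cost eligibility.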

 

Which KPIs should you track to manage SEO ROI when AI Overviews capture attention?

 

  • Visibility: GSC impressions, share of strategic queries in the top 10.
  • Access: clicks, CTR and, crucially, "CTR at stable rankings" by intent.
  • Quality: post-click engagement (GA), conversion rate, pipeline contribution.
  • Resilience: reliance on informational pages vs decision-stage pages.

 

How do you build an SEO business case for AI Overviews for an executive committee?

 

Base the business case on risk and reallocation, not on volume promises. Use (1) your observed CTR declines (GSC), (2) published benchmarks (e.g., Ahrefs' CTR drop for position one when AI Overviews are present, reported by Abondance), and (3) business impact (leads, pipeline) measured in Analytics.

Suggested structure:

  1. Observation: impacted segments (queries, pages, intent) and before/after baseline.
  2. Risk: loss of "easy" informational clicks and increased competitive pressure.
  3. Plan: citability optimisation (structure, evidence, Q&A, tables) + decision-stage consolidation.
  4. Measurement: KPIs from visibility → clicks → conversions → pipeline, with monthly checkpoints.

 

Do AI Overviews "kill" SEO, or do they shift where value sits?

 

They mostly shift value. Available research shows CTR falling on informational queries when an AI Overview is present (Ahrefs via Abondance), and macro data points to higher zero-click behaviour (see GEO statistics). In return, being cited as a source and click quality become more decisive levers than ranking alone.

 

Which B2B pages benefit most from being cited in an AI Overview (guides, comparisons, studies, proof pages)?

 

The biggest winners are decision-support assets: structured comparisons, methodology pages, benchmarks, role-specific glossaries, proof pages (security, compliance, methodology, SLAs) and sourced studies. They remain valuable even when a user has already read a summary.

 

What should you do if a page ranks well but no longer gets clicks because of AI Overviews?

 

First diagnose it in Search Console (CTR down, ranking stable), confirm the SERP, then strengthen the page with non-summariseable elements (quantified tables, decision matrices, edge cases, tools). Finally, route that traffic to decision-stage content through internal linking to protect conversions.

 

Final question: where can you follow future resources on this topic on the Incremys blog?

 

To keep up with upcoming analyses and practical guides on AI Overviews, SEO and GEO, visit the Incremys Blog.
