Signals, Diagnosis and Fixes for a Google Penalty

Last updated on 15/3/2026


A Google penalty is a loss of visibility in search results, caused either by a manual action (with a notification) or by an algorithmic demotion (without a notification), according to Google Search Central. In 2026, the stakes are immediate for B2B sites: with Google commanding around 89.9% market share (Webnyxt, 2026) and rolling out 500–600 updates per year (SEO.com, 2026), a sudden drop in rankings can translate very quickly into fewer leads, or even partial deindexing.

This guide helps you understand the causes, build a robust diagnosis (without false positives) and follow a practical action plan to get back to sustainable organic growth, using concrete checks in Search Console and GA4.

 

Google penalties in 2026: understanding sanctions, their causes and the plan to recover

 

In SEO, a sanction occurs when Google believes a site does not follow its guidelines, or when its systems reassess the quality, trustworthiness or compliance of a set of pages. The impact can range from a simple drop in rankings on a handful of queries to the deindexing of specific pages or, in extreme cases, the removal of the entire site from results.

 

Why this becomes critical with core updates and anti-spam enforcement

 

Ranking systems evolve continuously: Google says that 15% of the queries it sees each day have never been searched before (Google, 2025), and the ecosystem experiences high volatility with several hundred adjustments per year (SEO.com, 2026). In that context, "penalties" in the broad sense are often confused with post-core update re-evaluations, but the difference is decisive: a manual action comes with an explicit reason and a defined process, whereas algorithmic demotions require diagnosis through triangulation.

 

Penalty trends in 2026: what changes and what stays the same

 

Three trends define 2026:

  • More continuous evaluation: systems historically known as "Panda" (content quality) and "Penguin" (links) have been incorporated into the core algorithm, with often more gradual impacts and ongoing reclassifications.
  • Higher risk of misdiagnosis: search is increasingly "zero-click" (60% according to Semrush, 2025). A fall in sessions can come from declining click-through rate rather than a sanction.
  • Greater pressure on anti-spam compliance: for context, Google sent more than 186 million spam-related messages to site owners in 2018 (source cited by Noiise). The volume shows how industrialised enforcement has become, even though not every drop is a penalty.

 

What you risk in practical terms: demotion, deindexing, lost leads

 

The risk shows up in both visibility and revenue:

  • Demotion: ranking losses across one or more query groups. The top 3 results capture 75% of organic clicks (SEO.com, 2026), whilst page 2 drops to a 0.78% click-through rate (Ahrefs, 2025).
  • Deindexing: strategic pages can vanish from results, cutting acquisition on key intents.
  • Revenue or lead loss: each extra second of load time can cost roughly 7% in conversions (Google, 2025), and an SEO drop feeds directly into the pipeline when organic is a major channel.

 

Manual action vs algorithmic filter: identify the issue before you act

 

Before making changes, establish a binary diagnosis: manual action (with notification) or algorithmic demotion (without notification). This determines the method, the evidence to provide and the potential recovery timeline.

 

Understanding a manual action: signals, messages and limits in Search Console

 

A manual action is applied after a review by a Google team, typically following suspicion, a report or detected violations. The most reliable signal is a notification in Google Search Console under Security & Manual Actions > Manual actions. Google specifies the issue type and often provides example URLs (Google Search Central).

Common reasons referenced across the ecosystem include: unnatural links, user-generated spam, low-value content, cloaking or deceptive redirects, hidden text and keyword stuffing, spammy structured data, and hacking (as summarised by SEO guides and Google Search Central).

 

Algorithmic demotion: how to distinguish it from normal ranking volatility

 

An algorithmic demotion does not trigger a message. You infer it from symptoms: simultaneous drops in impressions and clicks, a broad decline across a set of queries, or a fall concentrated in a specific directory or page type. To avoid jumping to conclusions, also check a common 2026 scenario: stable rankings but lower click-through rate due to SERP changes (zero-click, rich features, AI overviews).

 

Build a reliable timeline: link the drop to updates (without bias)

 

Your goal is not to "prove" an update caused the drop, but to build an actionable timeline:

  • start date of the decline (impressions, clicks, rankings, indexation);
  • internal changes (migration, redesign, deployments, removals, link campaigns, page generation);
  • external events (volatility periods, core updates, spam updates).

Tools such as SISTRIX recommend reviewing a visibility chart and checking whether the timing matches known updates. Correlation is not causation, but it helps you find the root cause faster.
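
To make this timeline concrete, here is a minimal Python sketch that flags which events fall close to the start of the decline. All dates and events below are placeholders to replace with your own deployment log and the volatility windows you track; the ±10-day window is an arbitrary working assumption, not a rule.

```python
from datetime import date, timedelta

# Hypothetical events: replace with your own deployment log and the
# update windows you track (all dates below are placeholders).
DROP_START = date(2026, 1, 20)   # first day the decline is visible in GSC

events = [
    (date(2026, 1, 5),  "internal", "template redesign deployed on /blog/"),
    (date(2026, 1, 18), "external", "reported volatility window begins"),
    (date(2026, 1, 22), "internal", "bulk generation of 400 category pages"),
    (date(2026, 2, 2),  "external", "core update rollout announced"),
]

WINDOW = timedelta(days=10)      # how close an event must be to be flagged

for when, kind, label in sorted(events):
    gap = (when - DROP_START).days
    flag = "<== within window" if abs(when - DROP_START) <= WINDOW else ""
    print(f"{when}  [{kind:8}] {label:50} ({gap:+d}d) {flag}")
```
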

 

Why a penalty affects rankings: common causes of a sudden drop

 

Most causes fall into three buckets: content, links, and technical or abusive practices. The visible outcomes include ranking losses, less organic traffic, indexing issues, and sometimes deindexing.

 

High-risk content: duplication, overproduction, low-value pages

 

Historically, quality-related losses were associated with Panda. In practice, the most common triggers remain:

  • duplication (identical blocks across multiple URLs, near-duplicates, parameterised variants);
  • overproduction of similar pages generated at scale without distinctive value;
  • thin pages (too little content, not useful, not specific);
  • over-optimisation (artificial repetition, filler text).

A useful rule of thumb often mentioned in SEO guides is to keep your writing natural and avoid stuffing a page with repetitions. Some recommendations mention around 1% keyword density as a ballpark, but intent and perceived quality matter far more than a mechanical rule.
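
If you want a quick, rough check against that ballpark, the following Python sketch estimates the density of one phrase in a page's text. It is a crude heuristic (no stemming, no lemmatisation) meant only to catch obvious stuffing, not a quality metric.

```python
import re

def keyword_density(text: str, phrase: str) -> float:
    """Share of words belonging to occurrences of `phrase` (rough heuristic)."""
    words = re.findall(r"[a-zà-ÿ0-9']+", text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    if not words or n == 0:
        return 0.0
    hits = sum(
        1 for i in range(len(words) - n + 1) if words[i:i + n] == phrase_words
    )
    return hits * n / len(words)

sample = "Google penalty guide: diagnose a Google penalty before fixing it. " * 3
print(f"{keyword_density(sample, 'google penalty'):.1%}")  # well above 1% here
```
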

 

Problematic link building: link schemes, over-optimised anchors, dubious domains

 

Link-related drops reflect Penguin's legacy and manual actions for "unnatural links". Risk signals include:

  • rapid, unnatural backlink acquisition;
  • over-optimised, repetitive anchor text;
  • site networks, large-scale exchanges, non-compliant paid links;
  • links from clearly spammy domains.

For context, the average #1 result has around 220 backlinks (Backlinko, 2026). The point is not raw volume, but naturalness and quality.
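
As a first pass on anchors, a short script like the sketch below can surface over-represented anchor text in a backlink export. The CSV layout and the "anchor" column name are assumptions to adapt to whatever your backlink tool produces, and the 5% share threshold is an arbitrary review trigger, not an official limit.

```python
import csv
from collections import Counter

def anchor_report(path: str, top: int = 10) -> None:
    """Print the most common anchors in a backlink CSV export.
    Assumes one row per backlink with an 'anchor' column (adjust as needed)."""
    with open(path, newline="", encoding="utf-8") as f:
        anchors = [row["anchor"].strip().lower() for row in csv.DictReader(f)]
    total = len(anchors)
    if total == 0:
        print("no rows found in", path)
        return
    counts = Counter(anchors)
    print(f"{total} backlinks, {len(counts)} distinct anchors")
    for anchor, n in counts.most_common(top):
        share = n / total
        flag = "  <== review if exact-match money keyword" if share > 0.05 else ""
        print(f"{share:6.1%}  {anchor!r}{flag}")

# anchor_report("backlinks.csv")  # uncomment once you have an export
```
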

 

Technical aggravators: hacking, injected spam, disrupted indexing

 

Some technical incidents can look like a penalty (or make one worse):

  • hacking and injected spam pages (often detectable via security alerts);
  • deceptive redirects or uncontrolled redirect chains;
  • disrupted indexing: inconsistent canonicals, explosion of parameter URLs, accidental noindex usage.

In parallel, performance and mobile experience strongly influence user behaviour: 53% of mobile users abandon if loading exceeds 3 seconds (Google, 2025). A conversion drop caused by speed can be misread as a penalty unless you cross-check SEO and user experience.

 

Panda and Penguin: what these filters really changed (and what still applies)

 

Panda (2011) and Penguin (2012) remain useful shorthand, even though their signals have been progressively integrated into core ranking systems. In 2026, the key takeaway is straightforward: low perceived content value and artificial popularity are still two major drivers of demotion.

 

Panda's legacy: content quality logic, symptoms and remedies

 

Panda targeted content quality and relevance. Typical symptoms include: traffic drops, declines across many informational queries, and less efficient indexing. The remedies remain consistent: reduce duplication, consolidate similar pages, enrich existing content, improve readability, and remove pages with no unique value.

 

Penguin's legacy: unnatural links logic, patterns and clean-up

 

Penguin targeted unnatural link profiles, especially paid links and over-optimised anchors. Today, systems often discount problematic signals rather than applying a blanket "punishment" across an entire site, but the commercial impact can be similar: losses on competitive queries and a decline in previously winning pages.

 

What to remember now: integrated signals, gradual effects and revaluation

 

Two operational implications:

  • recovery is often gradual: you typically see step-by-step improvements as systems reassess content and links;
  • there is no "off switch": you fix the causes and then wait for re-evaluation (especially for algorithmic impacts).

 

How to spot a Google penalty: a step-by-step diagnosis

 

A reliable diagnosis depends on segmentation (where? what?) and on combining Search Console with analytics (pre-click vs post-click). Never rely on a single metric.

 

Define the scope: site-wide, directory-level, page types and affected queries

 

Start by answering four questions:

  • is the drop site-wide or concentrated in a directory (e.g. /blog/, /products/)?
  • does it affect a template (similar pages)?
  • does it mainly affect non-brand queries (generic) or the brand as well?
  • is it worse on mobile (60% of global web traffic, Webnyxt, 2026) or desktop?

This segmentation drives your plan: a drop on one page type points to a structural cause (template, duplication, thin content). A drop on a query family may point to a mismatch with search intent.
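
A minimal pandas sketch of this segmentation, assuming a Search Console performance export with page, query, clicks and impressions columns (rename to match your export; "incremys" stands in for your own brand terms):

```python
import pandas as pd
from urllib.parse import urlparse

# Assumed columns: page, query, clicks, impressions
df = pd.read_csv("gsc_performance.csv")

def top_dir(url: str) -> str:
    """First path segment of a URL, e.g. /blog/ or / for the homepage."""
    first = urlparse(str(url)).path.strip("/").split("/")[0]
    return f"/{first}/" if first else "/"

df["directory"] = df["page"].map(top_dir)
df["brand"] = df["query"].str.contains("incremys", case=False, na=False)

print(df.groupby("directory")[["clicks", "impressions"]].sum()
        .sort_values("clicks", ascending=False).head(10))
print(df.groupby("brand")[["clicks", "impressions"]].sum())
```
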

 

Check Search Console: security, indexing and relevant alerts

 

In Google Search Console, prioritise:

  • Security & Manual Actions: any manual action or security issue.
  • Indexing: spikes in excluded URLs, "Crawled - currently not indexed", unexpected canonicals, sitemap anomalies (large gaps between submitted and indexed URLs).
  • Performance: trends (not day-by-day), and combined reading of impressions / average position / click-through rate / clicks.

Method reminder: Search Console data is not real-time. Analyse trends across days or weeks and cross-check with publishing and deployment events.

 

Read the curves: organic traffic, impressions, click-through rate and rankings (before and after)

 

Build a "before and after" view by isolating the break date:

  • impressions down + rankings down: likely demotion (or increased competition);
  • impressions stable + click-through rate down: likely SERP change or zero-click effect (Semrush, 2025);
  • impressions down + indexing down: indexing or crawl issue to address first.

In 2026, AI overviews make interpretation harder: impressions can rise while sessions fall. Treat that as something to investigate, not proof of a penalty.
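
The three patterns above can be encoded as a rough triage function. The thresholds are placeholders, not official rules, and the output is a starting hypothesis to verify, not a verdict:

```python
def classify_drop(d_impressions: float, d_position: float, d_ctr: float,
                  d_indexed: float, threshold: float = -0.10) -> str:
    """Rough triage of a traffic drop from before/after relative deltas
    (e.g. -0.25 = down 25%; d_position is the change in average position,
    positive = worse). Thresholds are placeholders, not official rules."""
    if d_indexed <= threshold:
        return "indexing/crawl issue: fix coverage first"
    if d_impressions <= threshold and d_position > 0:
        return "likely demotion (or stronger competition)"
    if d_impressions > threshold and d_ctr <= threshold:
        return "likely SERP change / zero-click effect"
    return "no clear pattern: keep segmenting"

print(classify_drop(d_impressions=-0.30, d_position=4.2,
                    d_ctr=-0.05, d_indexed=-0.02))
```
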

 

Rule out alternatives: migration, redesign, seasonality and competition

 

Before removing content or submitting a disavow file, eliminate false positives:

  • migration or redesign (redirects, canonicals, internal linking, lost pages);
  • tracking issues (duplicate tags, consent, GA4 channel grouping);
  • seasonality and intent shifts;
  • competition (new entrants, fresher content, richer SERPs).

 

Action plan to recover from a Google penalty: prioritise for a lasting comeback

 

An effective plan follows a "stabilise → clean up → requalify → measure" logic. Limit parallel workstreams and document everything.

 

Phase 1 — Secure and stabilise: spam, hacking, redirects, parasitic pages

 

If Search Console flags a security issue (or you spot unknown pages indexed), address first:

  • site clean-up (injected pages, compromised accounts, vulnerable plugins);
  • removal of parasitic pages and closure of spam vectors (forms, unmoderated user-generated content);
  • redirect review (avoid deceptive redirects and long chains).

Goal: stop the bleeding before any optimisation.

 

Phase 2 — Improve content quality: audit, removal, merging, rewriting, consolidation

 

Work in batches (clusters, directories, templates):

  • identify weak pages (low traffic, low engagement, duplicates, cannibalisation);
  • consolidate similar pages (merge + redirect) rather than multiplying near-identical URLs;
  • rewrite strategic pages to match intent more precisely and reduce generic content.

 

Content pruning: decide what to keep, improve or remove (with measurable criteria)

 

Decide based on observable criteria, not gut feel. Example of a simple framework:

  • Keep and strengthen: pages with impressions and rankings 4–15 (potential), or pages contributing to conversions.
  • Merge: pages cannibalising each other (same queries, same intent) or variants that are too similar.
  • Remove (or noindex): pages with no demand, no unique value, or structural duplication.

Then track the impact on indexing (coverage), impressions, click-through rate and conversions (GA4).
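
As an illustration, the framework can be expressed as a small decision function. Every threshold below is a placeholder to calibrate against your own data, and edge cases should still go through manual review:

```python
def pruning_decision(impressions: int, avg_position: float,
                     conversions: float, has_duplicate: bool) -> str:
    """Illustrative translation of the keep / merge / remove framework.
    Every threshold here is a placeholder to adapt to your own data."""
    if conversions > 0 or (impressions > 0 and 4 <= avg_position <= 15):
        return "keep and strengthen"
    if has_duplicate:
        return "merge (and 301-redirect to the surviving page)"
    if impressions == 0:
        return "remove or noindex"
    return "review manually"

print(pruning_decision(impressions=1200, avg_position=8.4,
                       conversions=0, has_duplicate=False))
```
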

 

Phase 3 — Clean up your link profile: audit, removals, disavow if needed

 

For link-related issues, the standard process is:

  • inventory backlinks (Search Console provides a partial but useful view);
  • identify risky domains and anchors (spikes, networks, repetitive anchors);
  • request removals when possible;
  • use disavow cautiously, especially if you have a manual action for unnatural links or strong evidence of toxicity.
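
For the disavow step itself, Google's documented file format is plain text: one URL or one domain:example.com entry per line, with # for comments. Here is a minimal sketch that generates such a file from an already-vetted list (the domains and URLs below are fictitious examples):

```python
from datetime import date

# Only domains you have vetted and (ideally) already asked to remove links from.
toxic_domains = ["spam-network-example.com", "paid-links-example.net"]
toxic_urls = ["https://blog-example.org/sponsored-post-123"]

lines = [f"# Disavow file generated on {date.today()} after manual review"]
lines += [f"domain:{d}" for d in sorted(toxic_domains)]
lines += sorted(toxic_urls)

with open("disavow.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(lines) + "\n")
print("\n".join(lines))
```
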

 

Phase 4 — Fix indexing and crawling: canonicals, noindex, URL duplication

 

This phase aims to make the site unambiguous for Google:

  • align internal links, redirects and canonical tags (one canonical version per page);
  • reduce indexable parameter URLs and technical duplication;
  • ensure noindex is not affecting business-critical pages;
  • monitor "submitted vs indexed" in sitemaps.
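
As a rough spot-check for the noindex and canonical points above, a short script can fetch a sample of critical URLs and report what it sees. This is a simplified HTML check (it can miss unusual attribute orders, JavaScript-injected tags, or headers served only to Googlebot), so treat it as a first pass, not an audit:

```python
import re
import requests  # third-party; pip install requests

def check_page(url: str) -> dict:
    """Fetch a URL and report noindex / canonical signals (rough HTML check)."""
    r = requests.get(url, timeout=10)
    html = r.text
    noindex = bool(re.search(
        r'<meta[^>]+name=["\']robots["\'][^>]*noindex', html, re.I))
    canonical = re.search(
        r'<link[^>]+rel=["\']canonical["\'][^>]*href=["\']([^"\']+)', html, re.I)
    return {
        "url": url,
        "status": r.status_code,
        "x_robots": r.headers.get("X-Robots-Tag", ""),
        "meta_noindex": noindex,
        "canonical": canonical.group(1) if canonical else None,
    }

for page in ["https://www.example.com/", "https://www.example.com/blog/"]:
    print(check_page(page))
```
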

 

Phase 5 — Re-validate and monitor: realistic timelines, expected signals, checkpoints

 

Timelines vary widely depending on the issue type:

  • Manual action: SISTRIX mentions around 30 days for a minor violation, sometimes longer. A reconsideration request can help, without guarantees.
  • Algorithmic demotion: recovery depends on re-evaluation. SISTRIX cites long clean-up cases (up to two years in some Panda histories), showing why deep fixes matter more than superficial tweaks.

Look for concrete signals: stabilised exclusions, gradual return of impressions, recovery of non-brand queries, improvement on money pages and landing pages.

 

Lifting a manual action: get it removed without wasting time

 

A manual action should be handled like a case file: comprehensive fixes, evidence, and a well-structured reconsideration request (Google Search Central).

 

What Google expects: evidence, fixes and preventive measures

 

Google typically expects:

  • the issue to be fixed across the full scope (not just a few examples);
  • actions that match the stated reason (links, spam, cloaking, thin content, etc.);
  • preventive measures (process, moderation, publishing controls, governance).

 

How to structure the request: checklist, facts and exact scope of fixes

 

Recommended structure for a reconsideration request:

  • summary of the reason and scope (site, directory, page types);
  • dated list of actions taken, by category;
  • evidence (e.g. examples of fixed URLs, removed pages, anti-spam measures implemented);
  • commitment to prevention (workflow, quality control, link review).

Stay factual: no intentions, only what was fixed and how you will prevent recurrence.

 

After the response: what to do if it is rejected or partially lifted

 

If rejected, treat the response as a clue to what remains non-compliant. Re-check scope and look for blind spots (old directories, subdomains, orphan pages, user-generated content). If it is partially lifted, isolate the still-affected sections and reapply the "evidence + completeness" approach.

 

Measure results and steer the return to performance

 

Measuring recovery helps you avoid two costly mistakes: deciding too early that "it isn't working", or restarting content production before the root cause is stable.

 

Recovery signals: reindexing, coverage, returning queries, winning pages

 

In Search Console, monitor:

  • index coverage (declining problematic exclusions);
  • the reappearance of non-brand queries and the recovery of previously winning pages;
  • rankings and click-through rate on queries where you were in the top 3 (because that is where most clicks happen).

 

Business impact: leads, conversions, landing pages and attribution

 

Search Console measures "before the click"; GA4 measures "after the click". For B2B reporting:

  • in GA4, isolate the Organic Search channel and track engaged sessions, engagement time, micro-conversions (CTA clicks, form starts) and primary conversions (demo, quote or contact requests);
  • analyse organic landing pages that lost (or regained) pipeline contribution;
  • avoid the trap of "GSC clicks ≠ GA4 sessions": differences are normal (consent, ad blockers, redirects, definitions).

To prioritise high-impact work, tie recovery to value, for example with reporting focused on SEO ROI.
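
A minimal pandas sketch of that landing-page reading, assuming a GA4 export with landing_page, sessions, engaged_sessions, conversions and channel columns (all column names are assumptions; rename them to match your actual export):

```python
import pandas as pd

# Assumed columns: landing_page, sessions, engaged_sessions, conversions, channel
df = pd.read_csv("ga4_landing_pages.csv")
org = df[df["channel"] == "Organic Search"].copy()

org["engagement_rate"] = org["engaged_sessions"] / org["sessions"]
org["conv_rate"] = org["conversions"] / org["sessions"]

# Pages that lost the most pipeline contribution get fixed first.
print(org.sort_values("conversions", ascending=False)
         [["landing_page", "sessions", "engagement_rate", "conv_rate"]]
         .head(15).to_string(index=False))
```
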

 

Set up a continuous improvement loop: recurring audits and prevention

 

A simple routine works well:

  • weekly: alerts (security, manual actions), indexing anomalies, large swings on key pages;
  • monthly: identify pages with potential (positions 4–15), refresh strategic pages, review new links and new content.

Given how frequent updates are, a one-off annual audit is not enough in 2026.
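
For the monthly "positions 4–15" pass, a two-line filter on a Search Console export is often enough. Column names and the impression threshold below are assumptions to adapt:

```python
import pandas as pd

# Assumed columns: query, page, clicks, impressions, position
df = pd.read_csv("gsc_performance.csv")
potential = df[(df["position"] >= 4) & (df["position"] <= 15)
               & (df["impressions"] >= 100)]   # placeholder threshold
print(potential.sort_values("impressions", ascending=False).head(20))
```
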

 

Mistakes to avoid when dealing with a Google penalty

 

 

Diagnosis mistakes: confusing penalties, updates, seasonality and technical incidents

 

  • relying only on a drop in sessions without checking impressions, click-through rate and rankings;
  • ignoring SERP changes (zero-click, rich features);
  • failing to segment (mobile vs desktop, directories, brand vs non-brand).

 

Fix mistakes: mass deletions, poorly scoped disavows, untracked changes

 

  • deleting hundreds of pages without mapping intent and cannibalisation;
  • disavowing "by default" without evidence or logic (risking the loss of useful signals);
  • making multiple changes at once without a release log.

 

Monitoring mistakes: no tracking, too many simultaneous changes, lack of documentation

 

  • not monitoring index coverage after fixes;
  • changing content, links and technical setup simultaneously, making attribution impossible;
  • not documenting fixes (which hurts reconsideration requests).

 

Best practices to reduce the risk of another penalty (without over-optimising)

 

 

Editorial safeguards: quality standards, review, updates and governance

 

  • define standards (originality, usefulness, sources, structure);
  • add review and validation (especially if you produce at scale with AI);
  • update pages that drive the business rather than constantly creating new URLs.

Content benchmark: page-one results tend to be content-rich (around 1,890 words on average according to SEO.com, 2026). Without aiming for artificial length, it is a reminder to provide a complete, well-structured answer.

 

Link safeguards: acceptance criteria, diversity, monitoring anchors and domains

 

  • prioritise links earned for genuine editorial reasons;
  • monitor anchor diversity (avoid exact-match repetition);
  • watch for sudden backlink spikes and investigate quickly.

 

Technical safeguards: security, monitoring, alerts and release controls

 

  • enable security monitoring and address critical alerts first;
  • check canonicals, redirects and noindex after every release;
  • maintain reliable tracking (avoid duplicate tags and tracking breaks).

 

What to do if you are stuck: targeted fixes, a redesign, or a domain change

 

 

Compare approaches: costs, timelines, risks and benefits

 

  • Targeted fixes: controlled cost, better traceability, lower risk, but requires rigorous execution.
  • Redesign: useful if technical debt or structural duplication prevents fixes, but high risk (migration, temporary losses).
  • Domain change: only as a last resort, as it transfers trust poorly, complicates tracking and does not necessarily remove the causes (links, content, practices).

 

When to prioritise fixes over producing new content

 

If indexing is unstable, a manual action exists, or an entire template is low-value, publishing more content amplifies the problem. Prioritise fixes when: (a) existing pages are no longer properly indexed, (b) click-through rate collapses on key landing pages, (c) security or manual action alerts are present.

 

Balancing short term and long term: quick wins, clean-up, relaunch

 

Reasonable quick wins often mean restoring existing pages that were historically indexed and performed well (Google typically reacts faster to an existing page than to a brand-new one). Then restart production on healthy foundations, with a governed editorial plan.

 

Useful tools in 2026 to diagnose, fix and monitor

 

 

Search Console: reports to prioritise

 

  • Manual actions and Security: your number-one entry point.
  • Performance: impressions, clicks, click-through rate, average position, segmentation by pages, queries, devices and countries.
  • Indexing: exclusions, sitemaps, URL inspection, selected canonicals.
  • Links: a helpful (not exhaustive) view to spot linked pages and obvious anomalies.

 

Logs and crawling: understand crawling and detect anomalies

 

Where possible, use server logs to understand Googlebot behaviour (frequency, directories crawled, response codes), and complement this with a crawl (structure, duplication, canonicals, redirects). This is particularly valuable after a redesign or when indexing deteriorates.
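
If you have raw access logs, a minimal sketch like the one below counts Googlebot hits by directory and status code. It assumes a standard Apache/Nginx combined log format (adapt the regex to yours) and matches on the user-agent string only; in production, verify Googlebot via reverse DNS, since user agents can be spoofed:

```python
import re
from collections import Counter

# Matches the request line and status of a "combined" format log entry,
# keeping only hits whose user-agent mentions Googlebot.
LINE = re.compile(
    r'"(?:GET|HEAD) (?P<path>\S+) HTTP/[^"]*" (?P<status>\d{3}) .*Googlebot')

by_dir, by_status = Counter(), Counter()
with open("access.log", encoding="utf-8", errors="replace") as f:
    for line in f:
        m = LINE.search(line)
        if not m:
            continue
        path = m.group("path").split("?")[0]
        top = "/" + path.strip("/").split("/")[0] if path != "/" else "/"
        by_dir[top] += 1
        by_status[m.group("status")] += 1

print("Googlebot hits by directory:", by_dir.most_common(10))
print("Googlebot hits by status:", by_status.most_common())
```
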

 

Link analysis: assess risk and document actions

 

For links, what matters is documentation: which sources are problematic, which anchors are overrepresented, which removals were requested, and what was disavowed (if applicable). This documentation becomes critical if you submit a reconsideration request.

 

Dashboards: rankings, segments, pages and business KPIs

 

Build a dashboard that connects: visibility (Search Console) → landing pages → engagement and conversions (GA4). To set targets, use benchmarks from our SEO statistics and, if you track visibility in generative engines, our GEO statistics.

 

A word on Incremys: audit and prioritise actions without overloading the team

 

When a drop looks like a penalty, the hardest part is often prioritisation: what to tackle first, across which scope, and how to measure the impact. Incremys is a GEO and SEO optimisation SaaS platform that helps you structure this diagnosis and turn it into an actionable roadmap by combining technical, semantic and competitive signals.

 

Centralise diagnosis and remediation planning with the Incremys SEO & GEO 360° audit

 

To speed up qualification (manual action vs demotion) and isolate the sections to fix, the Incremys SEO & GEO 360° audit consolidates findings and helps prioritise actions without multiplying manual analyses. For a broader view of the modules, you can also consult the SaaS 360 platform.

 

FAQ: penalties, diagnosis and recovery

 

 

What is a penalty, and why does it matter in 2026?

 

It is a loss of visibility on Google due either to a manual action (guideline violations) or an algorithmic reassessment. In 2026, it matters because updates are frequent and clicks are concentrated in the top 3 (75% of organic clicks according to SEO.com, 2026).

 

What is the impact on rankings and revenue?

 

For rankings, the impact ranges from demotion to deindexing. For revenue, the effect is indirect but fast: fewer pages in high-click-through-rate areas means fewer sessions and fewer leads. In B2B, this often shows up as fewer contact or demo requests and a weaker pipeline.

 

How can you tell whether the drop is from a manual action or an algorithmic filter?

 

First check Search Console: if there is a manual action, you will see a message in Security & Manual Actions. With no message, suspect an algorithmic demotion and run a segmented before-and-after analysis (impressions, rankings, click-through rate) by directories, pages, queries, countries and devices.

 

How long does recovery take?

 

For a manual action, timing depends on severity and processing time, with a benchmark of a few weeks in some cases (SISTRIX), without guarantees. For algorithmic demotions, recovery depends on re-evaluation and can take longer, sometimes across multiple update iterations (SISTRIX).

 

What mistakes should you avoid during the fix phase?

 

Avoid unmeasured mass deletions, "preventive" disavows without evidence, and multiple simultaneous changes. Document every action and keep a clear timeline to understand what is (or is not) working.

 

Which practices reduce the risk of recurrence?

 

Editorial standards (originality, usefulness, consolidation), link building based on natural and diversified links, and technical safeguards (security, canonicals, redirects, monitoring). Prevention is mostly about governance and regular checks.

 

How do you set up a prevention and control process?

 

Implement a weekly routine (alerts, indexing, critical pages) and a monthly routine (refresh strategic pages, review new content, check new backlinks). Keep a deployment and campaign log to avoid guesswork.

 

Which tools should you use in 2026 to diagnose and measure progress?

 

Prioritise Search Console (manual actions, security, performance, indexing), complemented by GA4 to measure business impact (engagement, micro-conversions, conversions). Add logs or crawling if indexing or crawling appears to be the issue.

 

How do you incorporate penalty management into an overall SEO strategy?

 

Treat it as a risk stream alongside content and technical SEO: monitoring, regular audits, publishing governance, and business-led KPIs. The aim is to spot early warning signals (click-through rate, indexing, landing pages dropping) before a decline becomes structural.

 

Should you consider a major redesign or changing domains?

 

Only if targeted fixes are impossible (major technical debt, unmanageable architecture, recurring spam you cannot control). A redesign or domain change increases risk (migration, temporary losses, measurement complexity) and never replaces fixing root causes.
