15/3/2026
To place this method within a broader context, start with the parent article on SEO audit. Here, we take a deeper, highly practical angle: a site SEO audit focused on the on-site checks that often make the biggest difference to visibility and click-through rate, namely page depth, internal linking, duplicate content, cannibalisation and meta tags.
Site SEO Audit: A Complete Method to Diagnose and Prioritise (2026 Edition)
A website search ranking audit involves establishing a factual baseline (pages, SEO signals, performance) and turning that diagnosis into executable decisions. According to Google Search Central, anything preventing crawling, indexing and understanding of a page mechanically limits its ability to rank. In 2026, the challenge extends beyond "rankings": it also involves improving SERP readability (titles, snippets) and increasing the likelihood of appearing in generative answers (GEO).
In practice, this site-level audit primarily answers three questions:
- Accessibility: are your high-value pages actually reachable (depth) and supported by useful internal links?
- Clarity: does each URL map to a clearly identifiable intent, without duplication or internal competition?
- SERP appeal: do titles, meta descriptions and robots directives maximise CTR without misleading promises?
What This Guide Adds vs the Incremys SEO Audit (and What It Does Not Cover)
This guide complements the broader perspective presented in the parent "SEO audit" article by focusing on on-site checks that are often underestimated: internal linking, depth analysis, duplicate content, cannibalisation and meta tags.
To avoid duplication with other resources, this content does not cover:
- technical SEO audits (crawling, server errors, Core Web Vitals, directives, etc.);
- general "SEO analysis" approaches and tool overviews beyond Google Search Console and Google Analytics;
- broader "website audit" and "website analysis" topics (user experience, compliance, security, etc.).
Why Run a Site Audit: Objectives, KPIs and B2B / E-Commerce Use Cases
In B2B, a study cited by ITI Conseil indicates that 50% of visits come from organic search. When SEO contributes so significantly to acquisition, on-site checks become a governance matter: duplicate content or cannibalisation can dilute performance on commercially oriented queries, even when "everything appears fine" at first glance.
Examples of objectives (and their KPIs) in 2026:
- Increase clicks with the same number of impressions: CTR (Search Console), title and meta optimisation (our SEO statistics highlight that an optimised meta description can influence CTR, and that snippet structure matters more as zero-click searches increase).
- Stabilise volatile rankings: share of URLs alternating for the same query (cannibalisation), changes in average position (Search Console).
- Make key business pages visible: average depth of high-priority pages, number of internal inlinks to those pages, proportion of orphan pages.
- E-commerce: reduce duplication caused by facets and parameters, reduce listing depth, stabilise category pages that match demand.
Note: with Google still dominant in France (market share remains close to ~90% according to Webnyxt 2026, cited in our SEO statistics), Search Console KPIs remain central. However, in 2026, it is worth adding a GEO lens (citability, structuring, direct-answer readiness) when prioritising.
Key Checks at a Glance: Internal Linking, Meta Tags, Semantics and Content Quality
The most "profitable" on-site checks are those that improve access and interpretation at scale:
- Depth and orphan pages: a strategic page that is 6–8 clicks away, or has no internal links, is unlikely to be crawled at the right cadence as the site grows.
- Internal linking: topical hubs, contextual links, intent-aligned anchors, reducing links to non-indexable URLs.
- Duplicate content: near-identical templates, boilerplate blocks, e-commerce variants, canonical and noindex conflicts.
- Cannibalisation: several pages competing for the same query intent → URL swapping, diluted CTR, difficulty pushing one page up.
- Meta tags: unique, non-truncated titles and meta descriptions aligned with page and intent; error-free meta robots directives.
Step-by-Step Method to Audit a Site Without Blind Spots
The logic is straightforward: start from observable data, avoid context-free "scores", then turn findings into a prioritised backlog (impact × effort × risk), as recommended in the parent article's methodology.
Step 1: Define the Scope, Objectives and Data Access
Before any analysis, define:
- scope: the whole domain, a directory (e.g. /blog/), or a business segment (e.g. solution pages);
- the main objective (visibility, CTR, leads, sales) and 2–3 KPIs maximum;
- evidence sources: Google Search Console (queries, pages, indexing) and Google Analytics (landing pages, engagement, conversions).
Without this framing, audits often become a catalogue of issues with no arbitration, and teams end up fixing "noise" rather than true blockers.
Step 2: Analyse Page Depth to Reveal Under-Crawled Areas
Goal: identify where architecture and navigation bury useful content. Look for patterns:
- business pages beyond a depth threshold;
- directories concentrating deep pages (often filters, tags, or template-generated pages);
- a mismatch between what matters to the business and what the internal linking actually supports.
Always interpret depth carefully: a deep page can still perform if it receives strong internal links (hubs, contextual links) and external signals. The audit helps you separate exceptions from structural problems.
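As a minimal sketch of how "real" depth can be computed from a crawl export, the breadth-first search below finds the minimum number of clicks from the homepage to every reachable page. The link graph and URLs are hypothetical; in practice you would build `graph` from your crawler's internal-link export.

```python
from collections import deque

def click_depth(links, home="/"):
    """Minimum click depth of every page reachable from the homepage.

    `links` maps each URL to the URLs it links to (an internal link graph).
    Depth 0 is the homepage itself; BFS guarantees shortest paths.
    """
    depth = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depth:  # first visit = shortest click path
                depth[target] = depth[page] + 1
                queue.append(target)
    return depth

# Hypothetical mini-site: the offer page is buried three clicks deep.
graph = {
    "/": ["/blog/", "/products/"],
    "/blog/": ["/blog/post-1/"],
    "/blog/post-1/": ["/offer/"],
    "/products/": [],
}
depths = click_depth(graph)
deep_pages = [url for url, d in depths.items() if d >= 3]  # ["/offer/"]
```

Segmenting `deep_pages` by template (category, product, article) then reveals whether depth is a one-off exception or a structural pattern.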
Step 3: Audit Internal Linking (Anchors, Hubs, Orphan Pages and Journeys)
A strong internal linking audit does not just count links: it evaluates their role in understanding and distributing internal authority.
- Hubs: do pillar pages exist to structure the topic and link to supporting pages?
- Journeys: do organic entry pages naturally lead to an action (contact, demo, basket, download)?
- Anchors: do they reflect intent (without over-optimisation) and help the user?
- Orphan pages: useful pages with no internal links (often after redesigns and migrations or one-off publishing).
A practical rule: if a page matters to revenue (or pipeline), it should have a clear place in the architecture and contextual links from related content.
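Orphan detection is essentially a set difference between the pages you consider useful (e.g. the sitemap) and the pages that actually receive internal links. The sketch below assumes hypothetical URLs and a crawler-style link-graph export.

```python
def find_orphans(sitemap_urls, link_graph, home="/"):
    """Pages listed in the sitemap that receive no internal link.

    `link_graph` maps each source URL to its outgoing internal links;
    `sitemap_urls` is the set of URLs you consider useful.
    """
    linked = {t for targets in link_graph.values() for t in targets}
    linked.add(home)  # the homepage is the entry point, never an orphan
    return sorted(set(sitemap_urls) - linked)

# Hypothetical data: /legacy-guide/ is in the sitemap but nothing links to it.
sitemap = {"/", "/offer/", "/blog/post-1/", "/legacy-guide/"}
graph = {"/": ["/offer/", "/blog/post-1/"], "/blog/post-1/": ["/offer/"]}
orphans = find_orphans(sitemap, graph)  # ["/legacy-guide/"]
```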
Step 4: Assess Duplicate Content (Templates, Variants and Canonicalisation)
Duplication is not only editorial; it is often driven by templates (listings, product pages, local pages), variants, or repeated blocks. The risk: overwhelming Google with overly similar pages and creating contradictory signals (canonical, indexability, internal links).
What you should classify:
- Structural duplication: near-identical pages generated by parameters, facets, sorting, tags.
- Boilerplate duplication: repeated sections (the same FAQ everywhere, overly long legal blocks, unchanged marketing blocks) that reduce truly distinctive content.
- Intentional duplication: similar-but-useful pages (e.g. two genuinely different offers); here, the audit must ensure differentiation is explicit (angles, proof points, lexical fields).
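To separate structural and boilerplate duplication from genuinely distinct pages, a common lightweight technique (one option among several, shown here as a sketch) is word-shingle Jaccard similarity over each page's main content. The example texts are invented; real input would be extracted body text with navigation stripped.

```python
def shingles(text, k=5):
    """Set of k-word shingles of a page's main content (lowercased)."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(max(len(words) - k + 1, 1))}

def similarity(a, b, k=5):
    """Jaccard similarity between two texts' shingle sets, from 0 to 1."""
    sa, sb = shingles(a, k), shingles(b, k)
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

# Hypothetical product variants that differ by one word score high;
# an unrelated page scores zero.
t1 = "red widget with free shipping and two year warranty for home use"
t2 = "blue widget with free shipping and two year warranty for home use"
t3 = "our company history and founding story"
```

Pairs above a chosen threshold (often somewhere around 0.7–0.9, to be calibrated per template) become candidates for consolidation, canonicalisation or rewriting.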
Step 5: Analyse Keyword Cannibalisation (Merge, Reposition, Redirect)
Cannibalisation happens when multiple URLs target (or drift towards) the same intent, often due to:
- content production without intent-to-URL mapping;
- category pages and blog posts overlapping;
- "guide" pages absorbing transactional queries without converting.
Common Search Console signals: URL swapping for a query, impressions split across several pages, and volatile positions. Typical action plans involve merging (best page + consolidation), repositioning (distinct angles and intents), or redirecting when a URL no longer has a reason to exist.
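The "impressions split across several pages" signal can be pulled mechanically from a Search Console export. The sketch below groups (query, page, impressions) rows, which mirror the query and page dimensions of a GSC performance export, and flags queries served by two or more URLs; the rows are hypothetical.

```python
from collections import defaultdict

def cannibalised_queries(rows, min_pages=2):
    """Flag queries whose impressions are split across several URLs.

    `rows` is an iterable of (query, page, impressions) tuples, e.g. a
    Search Console performance export grouped by query and page.
    """
    by_query = defaultdict(lambda: defaultdict(int))
    for query, page, impressions in rows:
        by_query[query][page] += impressions
    return {q: dict(pages) for q, pages in by_query.items()
            if len(pages) >= min_pages}

# Hypothetical export: two URLs compete on "seo audit".
export = [
    ("seo audit", "/blog/seo-audit/", 1200),
    ("seo audit", "/services/audit/", 900),
    ("seo pricing", "/pricing/", 400),
]
flags = cannibalised_queries(export)  # only "seo audit" is flagged
```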
Step 6: Check Meta Tags (Title, Meta Description, Robots) and SERP Consistency
Meta tags are the interface between your page and the SERP. According to our SEO statistics, organic performance is strongly tied to CTR: position 1 can reach 34% CTR on desktop (SEO.com, 2026), whilst page 2 drops to 0.78% (Ahrefs, 2025). In that context, generic or duplicated titles are expensive.
- Title: unique, descriptive, intent-aligned, without overpromising.
- Meta description: benefit- and proof-oriented, consistent with the page (avoid "bait").
- Meta robots: check for accidental noindex on strategic pages, and consistency with your indexing strategy.
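Title uniqueness and length checks scale well as a simple pass over a crawl export. The sketch below uses a character limit as a rough proxy for truncation (Google actually truncates by pixel width, so treat the threshold as indicative); the URLs and titles are invented.

```python
def meta_issues(pages, max_title=60):
    """Flag missing, over-long and duplicated titles across a set of pages.

    `pages` maps URL -> title (e.g. from a crawl export). The character
    limit is a rough proxy: real SERP truncation is measured in pixels.
    """
    issues, seen = [], {}
    for url, title in pages.items():
        if not title:
            issues.append((url, "missing title"))
        elif len(title) > max_title:
            issues.append((url, "title likely truncated"))
        elif title in seen:
            issues.append((url, f"duplicate of {seen[title]}"))
        else:
            seen[title] = url
    return issues

# Hypothetical crawl: /b/ duplicates /a/, /c/ has no title at all.
pages = {"/a/": "Buy Widgets Online", "/b/": "Buy Widgets Online", "/c/": ""}
issues = meta_issues(pages)
```

Grouping the output by template usually shows that a handful of templates, not hundreds of individual pages, generate most duplicates.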
Step 7: Run a Semantic Audit: Intent-to-URL Mapping and Topic Coverage
An on-site semantic audit reduces two risks: (1) "catch-all" pages mixing several intents, (2) creating new pages when consolidation would improve performance faster.
In practice:
- assign one primary intent to one URL (then document acceptable secondary intents);
- identify missing topics and consolidation candidates (several weak pages targeting the same intent);
- structure for extraction (Hn, lists, short definitions, self-contained paragraphs), helpful for SERPs and generative visibility.
For a strictly on-page focus, you can also complement this with the guide "On-page SEO audit: how to analyse each page and fix issues" (useful when you want to work at page and template level without expanding into other dimensions).
Step 8: Turn Analysis Into a Backlog, Roadmap and Acceptance Criteria
A good audit does not end with recommendations; it produces actionable tickets:
- Backlog: issue → evidence → fix → impacted pages and templates → validation KPI.
- Roadmap: sequencing quick wins vs structural work, release dependencies, risks.
- Acceptance criteria: how will you unambiguously confirm the fix is live (without regressions)?
Page Depth Analysis: Measuring Real Content Accessibility
Depth is not a "score": it is an indicator of real accessibility. On growing sites, it is often one of the best signals of architecture and internal linking debt.
Click Depth: Useful Thresholds, Interpretation Biases and Crawl Impacts
A commonly used rule of thumb is to aim for key pages to be reachable within three clicks when realistic (adapt to volume and site type). Beyond that, the risk is not only discovery: internal authority distribution and crawl frequency can also diminish.
Biases to avoid:
- confusing "theoretical" depth (menus) with "real" depth (contextual links);
- auditing depth without segmenting by template (categories, products, articles, offer pages);
- fixing page by page rather than fixing a template or hub.
Orphan Pages: Causes, Detection and a Recovery Plan
An orphan page has no internal links pointing to it. It may still receive traffic (campaigns, external links, direct access). Recovery plan:
- if the page is useful: attach it to a hub and add contextual links from 2–5 related pieces of content;
- if it is redundant: consolidate (merge) or remove (whilst staying consistent with your URL strategy);
- if it exists "by mistake" (automatic generation): fix the root cause.
Site Architecture, Templates and Navigation: Spotting Bottlenecks
The most common bottlenecks come from:
- deep paginated listings with no cross-links;
- navigation that prioritises exhaustiveness (too many levels) over key pages;
- an accumulation of low-value "intermediate" pages (tags, archives, filters) that dilute crawling.
Internal Linking Audit: Distributing Authority Without Damaging User Experience
Internal linking is an SEO lever, but it also supports understanding and conversion. The goal is not to add links everywhere; it is to make the site readable (for crawlers and humans) and elevate the pages that matter.
Mapping Links: Pillar Pages, Supporting Pages and Conversion Journeys
Start with a simple map:
- Pillar pages: pages that structure a topic and concentrate internal authority.
- Supporting pages: more specific content answering precise questions and linking back to the pillar (and/or an offer page).
- Business pages: offers, categories, demo pages, contact.
Then check whether the journey "SEO entry → understanding → reassurance → action" actually exists in your links (not just in your menu).
Optimising Anchor Text: Variety, Precision and Intent Alignment
Best practices:
- use descriptive anchors (not "click here");
- vary phrasing naturally (avoid exact repetition at scale);
- align anchors with the target page intent (otherwise you create semantic confusion).
Editorial Links vs Navigation: What to Improve First
General priority order:
- Contextual editorial links to high-value pages (often more impactful because they are close to the topic and need).
- Hubs (pillar pages) that organise a cluster and reduce cannibalisation.
- Navigation (menus, footer): useful, but be wary of unnecessary sitewide links.
Common Mistakes: Over-Optimisation, Unhelpful Sitewide Links and Excessive Depth
- Over-optimisation: the same exact-match anchors everywhere, mechanical repetition.
- Sitewide links to secondary pages: they inflate link volume without improving understanding.
- Depth: relying only on navigation, without contextual links, makes content increasingly "invisible" as the site grows.
Duplicate Content Assessment: Reducing Noise and Stabilising SEO Signals
Duplicate content increases noise: more pages to crawl, more contradictory signals, and more difficulty identifying the one page to push.
Internal Duplication: Facets, Parameters, Versions and Near-Identical Pages
Typical sources: sort parameters, facets, variants, tag-generated pages, and very similar pages from the same template. E-commerce risk: filter combinations can generate thousands of similar URLs that compete with your main categories.
Editorial Duplication: Repeated Sections, Reused Blocks and Boilerplate Content
Boilerplate is not inherently bad (navigation, reassurance, notices). The issue starts when the unique part becomes too small: Google perceives pages as near-identical, and users do not find differentiated value.
Choosing the Right Response: Rewrite, Consolidate, Noindex, Canonical or Redirect
A quick decision guide:
- Rewrite if intent is different but poorly expressed.
- Consolidate if multiple pages serve the same intent (often the best anti-cannibalisation move).
- Noindex if the page is useful to users but should not rank (use carefully and keep internal linking consistent).
- Canonical if the page is a technical variant of a reference page.
- Redirect if the page is no longer useful and a clear equivalent exists.
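The decision guide above can be made explicit as ordered branches, which helps apply it consistently across a large catalogue. The flag names are hypothetical audit annotations you would fill in per URL; the ordering encodes the same logic as the list above.

```python
def duplication_action(page):
    """Map per-URL audit flags to one of the five responses above.

    `page` is a dict of hypothetical boolean flags set during the audit.
    """
    if page.get("technical_variant"):
        return "canonical"            # variant of a reference page
    if page.get("same_intent_as_reference"):
        return "consolidate"          # merge into the strongest page
    if not page.get("still_useful"):
        # no longer useful: redirect if a clear equivalent exists
        return "redirect" if page.get("clear_equivalent") else "remove"
    if not page.get("should_rank"):
        return "noindex"              # useful to users, not meant to rank
    return "rewrite"                  # distinct intent, poorly expressed

# Example calls on hypothetical pages:
a = duplication_action({"technical_variant": True})            # "canonical"
b = duplication_action({"still_useful": True})                 # "noindex"
c = duplication_action({"still_useful": True, "should_rank": True})  # "rewrite"
```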
Cannibalisation Analysis: When Multiple Pages Compete for the Same Intent
Cannibalisation is one of the most costly blockers on mature sites: it spreads impressions and prevents a single URL from building lasting authority for a query.
Signals to Watch: URL Swapping, Volatility and Diluted Performance
- a query triggers different URLs depending on the week;
- impressions rise without clicks (diluted CTR, inconsistent snippets);
- pages stagnate at the bottom of page 1 and top of page 2 despite repeated optimisation.
Selecting the Target Page: Decision Criteria and Evidence to Gather
Useful criteria:
- intent alignment (informational vs transactional);
- ability to convert (GA4: key events, conversions, assists);
- content quality and freshness;
- existing internal linking (internal inlinks to each candidate).
Action Plans: Merge, Differentiate, Correct Internal Linking and Redirects
- Merge: create a reference page and redirect variants, keeping the best content.
- Differentiate: clarify angles (e.g. "comparison" vs "guide" vs "offer page"), and adjust titles, Hn and intro.
- Internal linking fixes: point supporting pages to the target page and reduce internal links to secondary pages.
Meta Tag Audit: Improving Relevance Without Over-Optimising
Meta tags influence understanding and, above all, SERP CTR. According to MyLittleBigWeb (2026), an optimised meta description can increase CTR (effect depends on context). In 2026, it is also crucial to avoid a mismatch between snippet promise and the actual content, which damages engagement.
Title Tag: Structure, Uniqueness, Promise and Page Consistency
- one clear promise (benefit or expected outcome);
- one proof point or qualifier (2026 edition, method, checklist, etc.) where relevant;
- strict uniqueness across the site (avoid repeating "Home", "Products", "Blog").
Meta Description: Improving CTR Without Shifting Intent
A good reflex: write as if you are answering the user before they click, stating what they will get (method, deliverables, use cases) and avoiding generic wording.
Meta Robots: Noindex, Nofollow and Directive Errors on Key Pages
Classic issues come from a noindex copied into a template, or from an approach that noindexes pages that are still heavily supported by internal linking. Consistency rule: if you do not want an area indexed, do not push it via priority internal links.
Common Problems: Duplicates, Truncation, Generic Titles and Brand and Non-Brand Inconsistencies
- Duplicates: identical titles and descriptions on similar pages (often templates).
- Truncation: titles that are too long, losing useful information in the SERP.
- Generic snippets: no benefit and no precision.
- Brand vs non-brand: inconsistency that blurs intent (and sometimes harms conversion).
On-Site Semantic Audit: Structuring Coverage and Avoiding "Catch-All" Pages
An on-site semantic audit organises what already exists so each page has a clear role. This reduces cannibalisation, improves internal linking, and also supports generative engines when content is more extractable.
Mapping Intent to a URL: Simple Rules
- one URL = one documented primary intent;
- if two intents are genuinely different, split them;
- if they are close, consolidate and structure (sections, H2 and H3, FAQ).
Topic Coverage: Missing Topics, Outdated Content and Consolidation Opportunities
Work in clusters: pillar page + supporting pages. In 2026, where a large share of queries is long-tail (our SEO statistics cite 70% of searches with more than 3 words according to SEO.com, 2026), structured topic coverage often beats multiplying isolated pages.
Editorial Structure: Headings, Paragraphs, Lists and Easily Extractable Elements
Quick checklist:
- informative H2 and H3 headings, not generic;
- short paragraphs and clear definitions;
- lists where relevant;
- citable elements (data, method, criteria) to strengthen credibility.
Specific Case: E-Commerce Site SEO Audit
On an e-commerce site, on-site underperformance is often systemic: catalogue depth, facets and parameters, template duplication, and competition between categories and products.
Category Pages vs Product Pages: Targeting the Right Intents Without Dilution
In general:
- category pages cover generic intents ("buy X", "cheap X", etc.);
- product pages cover specific intents (brand, model, reference).
When a product page tries to rank for a generic query, it can cannibalise a more suitable category (and vice versa). The audit must decide which page is the target for each intent.
Filters, Facets and Parameters: Managing Depth and Limiting Duplication
Facets can generate massive duplication. Best practice is to decide which combinations deserve indexing (search value and business value), and neutralise the rest so crawl resources are not spent at the expense of strategic pages.
Out-of-Stock and End-of-Life Products: URL Rules to Preserve Value
Without going into a technical checklist, the principle is to preserve the SEO value of URLs that have already earned visibility: when a product is discontinued, decide case by case between keeping an informational page, consolidating to a category, or redirecting to a relevant equivalent. The audit should formalise a clear rule that scales across the catalogue.
Interpreting Results: Moving From Diagnosis to Decisions
A useful on-site audit turns findings into decisions. This is especially true when the analysis surfaces thousands of items (duplicate metas, links, similar pages), but only a few root causes account for most of the impact.
Avoiding False Positives: Separating Anomalies From True Blockers
Typical false positives include:
- a missing meta description on a non-strategic page;
- a theoretical duplicate in a deliberately non-prioritised area;
- an isolated alert not supported by Search Console (impressions, indexing, clicks).
Principle: cross-check the crawl snapshot with Search Console (what is actually happening in Google) and Analytics (business value after the click).
Expected Deliverables: Summary, Prioritisation, Recommendations and Evidence
Client-facing deliverables (B2B) typically include:
- Summary: 5–10 key findings with evidence and estimated impact.
- Backlog: actionable tickets grouped by template and directory.
- Roadmap: sequencing, dependencies, validation criteria.
- Evidence appendices: Search Console and Analytics extracts, URL examples, affected queries.
Analysis Pitfalls: Overweighting Scores, Ignoring Intent, Forgetting Business Impact
- Overweighting scores instead of tying each point to a KPI (indexing, CTR, conversion).
- Ignoring intent: turning an informational page into a transactional-looking page (or the reverse).
- Forgetting the business: prioritising low-value pages over offer and category pages.
Prioritising After the Audit: From Issue List to an Executable Roadmap
The best audit is the one that results in 10 well-sequenced decisions rather than 300 untriaged tasks.
Impact × Effort × Risk Matrix: Sort and Sequence Quickly
Use a simple matrix:
- Impact: expected effect on crawling, indexing, CTR, rankings, conversions.
- Effort: time, dependencies, release cycle.
- Risk: potential regression, template breakage, traffic loss.
Prioritisation example: improving internal linking on a pillar page that supports 20 pages can be more valuable than adding 20 isolated meta descriptions.
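One way to operationalise the matrix (the 1–5 scales and the formula are illustrative conventions, not a standard) is a single score that rewards impact and penalises effort and risk:

```python
def priority_score(impact, effort, risk):
    """Illustrative scoring: high impact, low effort, low risk first.

    All inputs on a 1-5 scale; the formula is a convention, not a standard.
    """
    return impact / (effort + risk)

# Hypothetical backlog items as (name, impact, effort, risk):
backlog = [
    ("Fix internal linking to pillar page", 5, 2, 1),
    ("Add 20 isolated meta descriptions", 2, 3, 1),
    ("Redesign faceted navigation", 5, 5, 4),
]
ranked = sorted(backlog, key=lambda t: priority_score(*t[1:]), reverse=True)
# The pillar-page internal linking fix ranks first.
```

The point is not the exact formula: any consistent scoring forces the team to argue about impact, effort and risk explicitly instead of fixing whatever is easiest.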
Quick Wins vs Structural Work: What to Tackle First on an Existing Site
- Quick wins: duplicated titles on high-impression pages, internal linking to business pages, consolidating 2–3 cannibalised pieces of content.
- Structural work: redesigning navigation architecture, facet strategy, rebuilding hubs, governing intent-to-URL mapping.
Acceptance Criteria: Validating Fixes Without Regressions
Define before release:
- which pages and templates are impacted;
- what must change (e.g. unique title, anchor, link, consolidation);
- the expected KPI and measurement window (SEO often takes weeks, sometimes months).
Cost and Frequency: Framing an Audit Approach in 2026
In 2026, cost depends less on the "audit promise" than on volume, complexity (e-commerce, international), and the level of deliverables expected (backlog, roadmap, co-construction).
How Much Does an SEO Review Cost? Key Pricing Factors (Size, Complexity, Objectives)
Rather than quoting a single price (too variable), focus on what really drives budget:
- Number of URLs and template diversity (more templates = more cases).
- E-commerce complexity (facets, variants, stock, pagination).
- Objectives (basic diagnosis vs detailed backlog + acceptance criteria + support).
- Data quality (well-configured Search Console and Analytics or not).
A helpful way to decide: if SEO is a major acquisition channel, treat an audit as a management investment (diagnosis + prioritisation), not a one-off deliverable.
How Often Should You Re-Run an Audit? A Recommended Routine Based on Change and Scale
Common triggers: launching new sections, redesigning templates, adding facets, a drop in clicks and CTR, or stagnation on business queries. As a routine, regular reviews (quarterly for higher-risk areas, lighter for the rest) help prevent debt accumulation (duplication, inconsistent internal linking, cannibalisation).
For a local-specific need, see also local SEO audit methodology to improve local visibility.
Tools and Data: What to Use Without Stacking Solutions
For an on-site audit, the goal is to rely on a small set of tools but strong evidence: Google Search Console, Google Analytics, and an audit module that produces an actionable site map.
Google Search Console: Queries, Pages, Indexing and Actionable Signals
- queries and pages generating impressions and clicks;
- high-visibility pages with low CTR (meta opportunity);
- URL swapping for a query (cannibalisation opportunity);
- indexing signals to connect anomalies with real impact.
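The "high visibility, low CTR" opportunity in particular is easy to extract from a page-level export. The sketch below assumes hypothetical rows shaped like a Search Console page report (page, impressions, clicks); the thresholds are starting points to adjust per site.

```python
def ctr_opportunities(rows, min_impressions=1000, max_ctr=0.01):
    """Pages with plenty of impressions but weak CTR: prime title/meta work.

    `rows` is an iterable of (page, impressions, clicks) tuples, mimicking
    a Search Console page export. Thresholds are illustrative defaults.
    """
    out = []
    for page, impressions, clicks in rows:
        ctr = clicks / impressions if impressions else 0.0
        if impressions >= min_impressions and ctr <= max_ctr:
            out.append((page, impressions, round(ctr, 4)))
    return sorted(out, key=lambda r: r[1], reverse=True)

# Hypothetical export: only /offer/ combines high visibility and weak CTR.
export = [
    ("/offer/", 5000, 30),     # 0.6% CTR on 5k impressions
    ("/blog/post/", 800, 40),  # too few impressions to prioritise
    ("/pricing/", 2000, 120),  # healthy 6% CTR
]
opps = ctr_opportunities(export)
```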
Google Analytics: Landing Pages, Engagement and Contribution to Conversions
- organic landing pages and traffic quality;
- key events (leads, add to basket, demo clicks);
- mobile vs desktop differences (useful if snippets drive clicks but the experience discourages users).
Going Further With the Incremys Module: Automated Scanning, Diagnosis and Ongoing Monitoring
To industrialise the approach, the SEO audit module by Incremys scans the entire domain (structure, content, technical, backlinks) and produces a comprehensive diagnosis. From an on-site perspective, the value is moving quickly from "where are the issues?" to "which templates and URLs are affected?".
A key point in 2026: do not treat an audit as a snapshot. Continuous monitoring and alerts help you spot regressions (duplicate content spikes, titles reverting to generic, internal linking breaking after a release) before click losses become entrenched.
Embedding the Audit in a 360° SaaS Platform to Manage SEO and GEO
When the aim is to connect auditing, content production, rank tracking and ROI measurement, it helps to keep everything in a single operating chain. Incremys offers a 360° SaaS platform to manage SEO and GEO end to end, enabling diagnosis → prioritisation → execution → tracking without multiplying environments.
The Incremys Approach: Industrialising Audits and Long-Term Monitoring
Incremys positions itself around a factual approach: deliver a readable diagnosis, turn it into an action plan, then monitor to protect performance over time.
Scanning a Site: Structure, Content, Technical and Competition, With Clear Reporting
The scan helps map the site and reveal patterns: deep areas, groups of duplicated pages, cannibalised clusters, repeated meta tags. The goal is not "zero alerts", but to identify what blocks and what amplifies performance.
For overall consistency, refer back to the pillar approach on the Incremys site (framework, signals, prioritisation).
Proactive Alerts: Detecting Regressions Before They Impact Traffic
On-site regressions are common: template changes, adding identical blocks, new categories, facets, partial migrations. A tool-driven, continuous approach helps you quickly spot:
- a sudden rise in duplication;
- titles falling back into generic patterns;
- business pages getting buried (depth) after navigation changes.
Co-Construction: Review, Decisions and an Action Plan With a Dedicated Consultant
In B2B, execution is still the decisive factor. The Incremys approach emphasises co-construction: reviewing results, making trade-offs (impact, effort and risk), then formalising an executable roadmap with validation criteria.
Frequently Asked Questions: Site SEO Audits
What is an SEO audit, and what is it for on a website?
An SEO audit identifies, with evidence, what is holding back a site's organic visibility (crawling, indexing, understanding, CTR) and defines a prioritised action plan. On a website, it mainly helps make key pages accessible, unique (no duplication and cannibalisation) and attractive in the SERP.
What should you check first in an on-site audit?
Prioritise checks that impact groups of pages: depth (accessibility), internal linking (distribution), duplicate content (noise), cannibalisation (internal competition) and meta tags (CTR + SERP consistency).
How do you audit a site step by step without spreading yourself too thin?
Define objectives and KPIs, analyse depth, audit internal linking, address duplicate content, diagnose cannibalisation, review titles, meta descriptions and meta robots, then turn it into a backlog + roadmap + acceptance criteria.
Which tools should you use to analyse a site without multiplying software?
Google Search Console for queries, pages and indexing, Google Analytics for engagement and conversions, and a site-scanning and audit module to map issues at site scale (such as the Incremys audit module).
What deliverables should you expect at the end of an audit?
A summary of key findings, a backlog of tickets (by template and directory), a prioritised roadmap (impact × effort × risk), evidence (Search Console and Analytics extracts) and acceptance criteria to validate fixes.
How do you interpret an audit without overreacting to scores?
By systematically cross-checking alerts with Search Console (impressions, clicks, indexing) and Analytics (value after the click). An isolated anomaly with no measurable impact is not necessarily a priority.
How do you prioritise internal linking, meta tags, duplicate content and cannibalisation?
In general: (1) duplicate content and cannibalisation when they dilute a commercial intent, (2) internal linking to push target pages and reduce depth, (3) meta (title and description) first on high-impression pages with weak CTR.
What are the most common mistakes in an SEO audit?
Stockpiling recommendations without prioritisation, fixing page by page rather than fixing templates, ignoring intent (catch-all pages), letting e-commerce duplication spread, and forgetting acceptance criteria before release.
What budget should you plan for an audit in 2026?
Budget mainly depends on the number of URLs, the number of templates, e-commerce complexity (facets and variants), and deliverable depth (diagnosis only vs detailed backlog + support). Avoid comparing solely on claims of a "complete audit" without evaluating deliverables and the ability to prioritise.
When should you re-run an audit on a website?
After a drop in clicks and CTR, prolonged stagnation, a template and navigation redesign, adding facets or a new section, or at regular intervals for higher-risk areas (catalogue, offer pages, editorial hubs).
How do you audit depth and internal links effectively?
Segment by page type, identify business pages that are too deep, find orphan pages, then build hubs and contextual links. Measure impact via changes in impressions and clicks, and URL stability, in Search Console.
How do you fix duplicate content without losing rankings?
Select a reference page, consolidate content (or genuinely differentiate), then align internal linking to the target page. If you redirect, do so to the closest equivalent. Validate in Search Console (indexing, impressions, associated queries) over several weeks.
How do you detect cannibalisation and choose the right target page?
Look for URL swapping and split impressions in Search Console. Choose the target based on intent, ability to convert (Analytics), quality and freshness, and current internal linking. Then merge, differentiate or redirect depending on the case.
What are the key specifics for an e-commerce site?
Strict management of facets and parameters (duplication), control catalogue depth, clear arbitration between category vs product pages, and rules for out-of-stock and end-of-life products to preserve URL value.
How do you connect recommendations to measurable ROI (leads, pipeline, revenue)?
Link each recommendation to a KPI (CTR, clicks, conversions, revenue) and to a high-value page segment. Measure before and after over a window that matches SEO reality (often several weeks), and track impact on key events in Analytics (leads, add to basket, demo requests), as well as downstream quality if a CRM is available.