SEO Analysis of a URL: An Actionable On-Page Method

Last updated on 2/4/2026


SEO analysis of a page: on-page and GEO method to diagnose, prioritise and improve a URL (updated April 2026)

 

For the overall framework covering technical SEO, content, authority and prioritisation, begin with the pillar article on website analysis. Here, we focus on the page level: a page-level SEO analysis helps you make fast, confident decisions on a specific URL without diluting the diagnosis into a site-wide average. It is also the level at which you can make your answers more easily "extractable" for generative AI engines (GEO).

 

Why this article complements "website analysis": staying at page level to move faster and be more accurate

 

Google's first page captures the majority of attention: 75% of clicks go to the top three organic results (SEO.com, 2026), whilst page two averages a 0.78% click-through rate (Ahrefs, 2025). In practice, moving a page from position 12 to 7 can shift acquisition more significantly than dozens of micro-fixes scattered across a site. Working at page level shortens the "hypothesis → change → observation" loop and reduces the risk of spending time on low-impact technical tickets.

 

Definition and scope: what a per-page analysis measures (Google) and what it must make "quotable" for AI (GEO)

 

A per-page analysis combines three layers: what Google can crawl and index, what it understands (topic, intent, structure), and what the SERP rewards (CTR, position, features). From a GEO perspective, the goal is for a model to reuse your content without ambiguity: clear definitions, stand-alone sentences, sourced numbers, lists, tables and Q&A sections that can be lifted as-is. With 60% of searches ending "without a click" (Semrush, 2025), visibility is no longer only about visits—it is also about appearing in the answer shown.

 

When to trigger a page-by-page review: falling CTR, positions 8–20, redesigns, cannibalisation, templates

 

Run a page-level analysis when a performance signal shifts (impressions, CTR, position) or when an internal change may cause regression (redesign, new template, migration, block additions). Positions 8 to 20 are a priority case: you are close to the top 10, and therefore close to where clicks concentrate.

  • CTR drops whilst impressions rise.
  • A URL is stuck in positions 8–20 for a revenue-driving query.
  • Two pages compete for the same intent (cannibalisation).
  • A template (product page, category, landing page) is rolled out across dozens of URLs.
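The triggers above can be sketched as a small check over two comparable periods of Search Console-style data. This is a minimal illustration: the record fields, period labels and thresholds are assumptions, not an official export schema.

```python
# Sketch: flag page-review triggers (CTR drop whilst impressions rise,
# URL stuck in positions 8-20). Fields and thresholds are illustrative.

def review_triggers(rows):
    """rows: dicts with "url", "period" ("prev"/"curr"),
    "impressions", "clicks", "position". Returns {url: [triggers]}."""
    by_url = {}
    for r in rows:
        by_url.setdefault(r["url"], {})[r["period"]] = r

    flags = {}
    for url, periods in by_url.items():
        prev, curr = periods.get("prev"), periods.get("curr")
        if not (prev and curr):
            continue
        triggers = []
        prev_ctr = prev["clicks"] / max(prev["impressions"], 1)
        curr_ctr = curr["clicks"] / max(curr["impressions"], 1)
        # Trigger 1: CTR drops whilst impressions rise
        if curr["impressions"] > prev["impressions"] and curr_ctr < prev_ctr:
            triggers.append("ctr_drop_impressions_up")
        # Trigger 2: stuck in the striking-distance band
        if 8 <= curr["position"] <= 20:
            triggers.append("positions_8_20")
        if triggers:
            flags[url] = triggers
    return flags
```

Run it over your top URLs after each export; any URL that sets off both triggers is a natural candidate for the page-level review described below.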

 

Framing the analysis: choosing the right page, intent and baseline

 

 

Selecting a high-potential page: business value, upside and regression risk

 

Before opening a crawler, choose the URL like an investment: value, upside, risk. The traffic difference between the 1st and 5th positions can be a factor of four (Backlinko, 2026): a small uplift on a money page is often worth more than a broad clean-up.

| Criterion | Signal to check | Why it matters |
| --- | --- | --- |
| Business value | Conversions, MQL/SQL, demo requests, basket value | You prioritise by impact, not by ticket volume. |
| Room to grow | Positions 8–20, CTR below the position average | Gains tend to come faster when you are already visible. |
| Risk | Already top 3, pillar page, heavily internally linked URL | You avoid breaking an organic asset. |

 

Set a primary intent and a page promise: SERP alignment, angle and implicit expectations

 

A page performs when it matches a dominant intent: learn, compare, choose, buy, solve a problem. Validate alignment through SERP analysis: content types, highlighted formats, and the shared angle across top results. For GEO, write the "promise" as a testable sentence: if a model summarises the page, what should it say in 15 words without distorting the meaning?

 

Build a usable baseline: impressions, clicks, CTR, position, conversions (before any change)

 

Without a baseline, you confuse improvement with statistical noise. Measure before editing: impressions, clicks, CTR and average position (Search Console), then conversions (GA4/your analytics tool). Google updates its algorithm 500 to 600 times per year (SEO.com, 2026): documenting "before/after" protects decision-making.

  1. Export 28 to 90 days of data (depending on seasonality).
  2. Segment by device (mobile accounts for 60% of global web traffic, Webnyxt, 2026).
  3. Note your main queries and those growing fastest.
  4. Capture the current state of the page (rendered HTML, title, headings, main blocks).
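The baseline steps above can be captured in a small aggregation over daily rows for the URL. This is a sketch under assumptions: the field names mirror a Search Console-style export, and the impression-weighted average position is one reasonable convention, not the only one.

```python
# Sketch: build a pre-change baseline (impressions, clicks, CTR,
# average position), overall and per device. Field names illustrative.

def baseline(rows):
    """Aggregate daily rows for one URL into a before-edit snapshot."""
    def agg(subset):
        imp = sum(r["impressions"] for r in subset)
        clk = sum(r["clicks"] for r in subset)
        # Impression-weighted average position
        pos = (sum(r["position"] * r["impressions"] for r in subset) / imp) if imp else 0.0
        return {"impressions": imp, "clicks": clk,
                "ctr": round(clk / imp, 4) if imp else 0.0,
                "avg_position": round(pos, 2)}

    out = {"overall": agg(rows)}
    for device in {r["device"] for r in rows}:
        out[device] = agg([r for r in rows if r["device"] == device])
    return out
```

Store the snapshot alongside the capture of the rendered page; the before/after comparison later in the article depends on it.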

 

Priority on-page checks: what Google understands (and what AI reuses)

 

 

Title and meta tags: structure, natural keywords, differentiation and a click-worthy promise

 

The title tag remains your number one lever on CTR—and therefore on turning impressions into clicks. Question-style titles average +14.1% CTR (Onesty, 2026): useful when the SERP rewards direct answers. Aim for an explicit promise (benefit, scope, update) and clear differentiation versus other site pages to reduce cannibalisation.

  • One intent = one title (avoid "catch-all" titles).
  • Add a proof element when possible (method, checklist, comparison, "updated").
  • Keep wording natural: no keyword stuffing.

 

Meta description: improve CTR without overpromising

 

The meta description is not a direct ranking factor, but it strongly influences click quality. If it promises more than the page delivers, you pay for it: pogo-sticking, lower engagement, and weaker behavioural signals. Treat it like a mini-brief: who it is for, what it solves, and what the reader will get.

| Page type | Meta description angle | GEO: "reusable" bonus |
| --- | --- | --- |
| Article | Promise + deliverable (checklist, method) | Announce lists, steps, definitions. |
| Landing page | Problem → solution → proof | Precise terms, measurable benefits when sourced. |
| Category | Choice + criteria + range | Structured attributes (ranges, use cases). |

 

Heading structure (H2, H3): hierarchy, readability and answer extractability

 

Good heading structure helps Google understand your outline—and helps AI extract coherent blocks. Think "implicit questions": each H2 should answer a stable question, each H3 should close out a sub-question with an actionable takeaway. Lists (ul/ol) and tables make your answers easier to quote, especially when users want a process.

 

Common issues: redundant sections, unreadable "SEO headings", missing definitions

 

  • Different H2s saying the same thing (redundancy → dilution).
  • Vague headings ("About", "Our solutions") that express no intent.
  • No definition early on, even though the SERP favours educational content.
  • Level jumps (H2 → H4) without logic.
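Two of these issues (a missing or duplicated H1, and level jumps such as H2 → H4) can be detected automatically from the rendered HTML. A minimal standard-library sketch, assuming the page's HTML is already in hand:

```python
# Sketch: audit heading structure for multiple/missing H1s and level
# jumps (e.g. h2 followed directly by h4), using only the stdlib.
from html.parser import HTMLParser

class HeadingAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.levels = []

    def handle_starttag(self, tag, attrs):
        # Collect h1..h6 in document order
        if len(tag) == 2 and tag[0] == "h" and tag[1].isdigit():
            self.levels.append(int(tag[1]))

def heading_issues(html):
    parser = HeadingAudit()
    parser.feed(html)
    issues = []
    h1_count = parser.levels.count(1)
    if h1_count != 1:
        issues.append("expected exactly one h1, found %d" % h1_count)
    for prev, curr in zip(parser.levels, parser.levels[1:]):
        if curr > prev + 1:
            issues.append("level jump h%d -> h%d" % (prev, curr))
    return issues
```

Vague or redundant headings still need a human read; this only catches the structural part of the checklist.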

 

Content optimisation: completeness, evidence, freshness and precision (without padding)

 

A page rarely improves because it is longer; it improves because it is more useful, more precise, and better structured. The average length of a top-10 Google result is 1,447 words (Webnyxt, 2026): use it as a benchmark, not a target. Add evidence (numbers, sources, examples) and update passages that are now outdated or vague.

  1. Fill missing sub-topics seen across the top 10, without copying their angle.
  2. Replace generalities with decision-ready criteria (checklists, thresholds, steps).
  3. Add 1–3 reliable sources whenever you state a number or rule.
  4. Clarify definitions and acronyms at first mention.

 

Keyword density and placement: helpful guardrails, why ratios mislead, and better alternatives (entities, variants, co-occurrences)

 

Keyword-density ratios can feel reassuring, but they often lead to poor decisions: visible repetition, artificial headings, degraded style. Prefer intent-led semantic coverage: entities, natural variants, co-occurrences and concrete examples. To frame research properly, start with keyword analysis and ensure every section serves a real sub-need.

  • Useful rule of thumb: if repetition is obvious to a human reader, you have already gone too far.
  • Alternative: list 10–20 related terms (tools, standards, metrics, steps) and assign them by section.
  • GEO: favour definitional phrasing ("X is…", "You measure Y with…") that is easy to quote.
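The "list 10–20 related terms and assign them by section" alternative is easy to operationalise: check which sections mention which terms, and surface the terms covered nowhere. The section names and term list below are purely illustrative.

```python
# Sketch: coverage of related terms (entities, variants, co-occurrences)
# per section, instead of chasing a density ratio. Inputs illustrative.

def term_coverage(sections, terms):
    """sections: {name: text}; terms: list of related terms.
    Returns (term -> sections mentioning it, terms missing everywhere)."""
    found = {t: [name for name, text in sections.items()
                 if t.lower() in text.lower()]
             for t in terms}
    missing = [t for t, hits in found.items() if not hits]
    return found, missing
```

A missing term is a prompt to ask whether a real sub-need is uncovered, not an instruction to stuff the word in.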

 

Performance and experience: page speed, stability, and SEO & GEO impact

 

 

Measure accurately: field data vs lab tests, and practical thresholds

 

Use two lenses for performance: lab tests (diagnosis) and field data (real user experience). Slow pages quickly become expensive: 40% to 53% of users leave a site if loading is too slow (Google, 2025). And an extra two seconds can increase bounce rate by +103% (HubSpot, 2026).

  • Lab tests: pinpoint what is blocking (scripts, images, CSS).
  • Field data: prioritise pages that are genuinely problematic (mobile vs desktop, country).
  • Red flag: if mobile UX degrades, treat it first.

 

What most often slows a page down: images, scripts, fonts, above-the-fold bloat

 

The same causes come up again and again: oversized media, non-essential JavaScript, too many font files, and heavy above-the-fold layouts. In an SEO checker, these are amongst the typical crawl-detected issues: slow pages, oversized images, broken links, missing tags (Ahrefs, SEO Checker).

  1. Compress and correctly size images (modern formats, sensible lazy-loading).
  2. Delay or remove non-critical scripts (marketing tags, widgets).
  3. Rationalise fonts (weights, variants, loading strategy).
  4. Lighten the top of the page (hero, carousels, autoplay video).
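Some of these weight issues can be surfaced with a crude HTML scan before running a full performance audit. The heuristics below are illustrative and deliberately simplistic (for instance, above-the-fold images should usually *not* be lazy-loaded, so treat each finding as a question, not a verdict):

```python
# Sketch: flag classic weight suspects in a page's HTML -- images
# without lazy-loading or explicit dimensions, and external scripts
# that are neither deferred nor async. Stdlib only; heuristics rough.
from html.parser import HTMLParser

class WeightAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.findings = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)  # boolean attrs (defer/async) appear with value None
        if tag == "img":
            if a.get("loading") != "lazy":
                self.findings.append(("img_not_lazy", a.get("src", "?")))
            if "width" not in a or "height" not in a:
                self.findings.append(("img_no_dimensions", a.get("src", "?")))
        if tag == "script" and "src" in a:
            if "defer" not in a and "async" not in a:
                self.findings.append(("blocking_script", a["src"]))

def weight_findings(html):
    parser = WeightAudit()
    parser.feed(html)
    return parser.findings
```

Pair the output with field data: a "blocking_script" finding only matters if real users on real networks are paying for it.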

 

Balancing UX and performance: protect conversion whilst improving the page

 

Performance work should not undermine conversion. The goal is to speed things up without removing what reassures and proves value. Work through substitutions: replace a heavy component with a lighter equivalent, or move a block below an anchor point.

| UX element | Performance risk | Often-winning compromise |
| --- | --- | --- |
| Hero video | Initial load time | Poster image + play on click. |
| Third-party widgets | Render-blocking scripts | Defer loading until after interaction. |
| Carousel | JS + multiple assets | Static block + clear CTA. |

 

Diagnosis and decisions: interpreting a page analysis to take action (not just to tick boxes)

 

 

High impressions, low CTR: improve the snippet, intent match and editorial angle

 

If impressions rise but CTR stagnates or falls, you likely have a promise problem (title/meta), a SERP competition issue, or poorly framed intent. The top organic result can capture 34% of desktop clicks (SEO.com, 2026): your snippet is a business lever, not a formality.

  • Compare your title/meta to the top five results (differentiation, proof, freshness).
  • Check whether the SERP is favouring a format (guide, comparison, definition, video).
  • Align your introduction with the promise (avoid a "let-down" effect).

 

Positions 8–20: strengthen useful depth, evidence and structure

 

In positions 8–20, Google is already considering you. Now you must demonstrate why your page deserves more. Strengthen what the best pages do better: criteria, examples, definitions and direct answers.

  1. Add a "how to" or "checklist" section when intent is procedural.
  2. Introduce sourced evidence and numeric benchmarks where they exist.
  3. Structure for GEO extraction (lists, tables, micro-answers).

 

Traffic decline: isolate the likely cause (content, technical, competition, seasonality)

 

A traffic drop is not always an "SEO problem": it may come from seasonality, a shift in intent, or a SERP that has moved towards different formats. Start by documenting the timeline: page edits, template changes, tracking rollouts, performance incidents.

  • Content: outdated information, loss of freshness, an angle that is no longer dominant.
  • Technical: indexing, canonicals, redirects, slowdowns, errors.
  • Competition: new entrants, more complete pages, richer SERP features.
  • Seasonality: compare with the same period year-on-year where possible.

 

An underperforming page: improve, consolidate, redirect or deindex (depending on the URL's role)

 

A page can be technically clean and still deliver nothing if it targets no clear intent—or if the topic is already covered elsewhere. Decide based on the URL's role: acquisition, conversion, support, reassurance, internal linking.

| Situation | Decision | Goal |
| --- | --- | --- |
| Strong intent, weak content | Improve | Win positions and CTR. |
| Two similar pages | Consolidate | Avoid cannibalisation and concentrate signals. |
| Outdated URL with an equivalent | Redirect | Transfer value and simplify the index. |
| No value, no role | Deindex | Reduce noise and allocate crawl budget better. |

 

Prioritise and execute: turning a URL review into an action plan

 

 

Impact × effort × risk matrix: decide fast, without bias

 

Tools can surface dozens (or hundreds) of alerts for a single URL; not all deserve a ticket. Adopt an "importance-sorted" logic, similar to how some checkers categorise issues and propose direct fixes (Ahrefs, SEO Checker).

| Dimension | Question | Decision example |
| --- | --- | --- |
| Impact | Can it move indexing, CTR, ranking or conversion? | Rewrite title/meta on a high-impression page. |
| Effort | How much time, how many dependencies, what release cycle? | Swap an image vs redesign a template. |
| Risk | Likelihood of SEO/UX regression? | Test on a pilot URL before rolling out a template. |
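One way to make the impact × effort × risk logic explicit is a simple score over 1–5 ratings. The formula and scales below are an illustrative assumption, not a standard; the point is that a high-impact, low-effort, low-risk snippet rewrite should outrank a heavy template redesign.

```python
# Sketch: rank candidate fixes by impact / (effort x risk).
# The 1-5 scales and the formula are illustrative conventions.

def prioritise(tickets):
    """tickets: dicts with "name", "impact", "effort", "risk"
    (impact: higher is better; effort/risk: lower is better).
    Returns tickets sorted by descending priority score."""
    def score(t):
        return t["impact"] / (t["effort"] * t["risk"])
    return sorted(tickets, key=score, reverse=True)
```

Whatever formula you adopt, keep it stable across reviews so priorities stay comparable from one URL to the next.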

 

Typical quick wins: title, meta description, headings, targeted enrichments, internal linking

 

  • Rephrase the title to better match intent and stand out.
  • Rewrite the meta description to lift CTR, without overselling.
  • Clean up heading structure (one H1, stable H2s, useful H3s).
  • Add a definition, checklist, comparison table, or step-by-step section.
  • Strengthen internal links pointing to the URL (internal linking opportunities).

 

A 2 to 6-week iteration plan: production, validation, QA and tracking

 

A useful page analysis ends with a short, sequenced plan that is easy to validate. You can iterate without changing everything: one batch for "snippet + structure", then "content + evidence", then "internal linking + performance".

  1. Week 1: baseline + hypotheses + quick-win changes.
  2. Weeks 2–3: targeted enrichments (sections, evidence, definitions, tables).
  3. Weeks 3–4: internal linking + light technical adjustments.
  4. Weeks 4–6: QA, SERP tracking, corrections if side effects appear.

 

Measuring before/after: observation windows, normal variation and success criteria

 

Measure what you actually changed: CTR after snippet edits, rankings after enrichment, engagement after UX improvements. For crawls, some tools let you schedule daily, weekly or monthly analyses to spot issues as soon as they appear (Ahrefs, SEO Checker).

  • CTR: observe over 7–14 days when impressions are sufficient.
  • Rankings: typically 2–6 weeks depending on crawl frequency and competition.
  • Success: set one binary target (e.g. reach top 10) plus one business metric (lead, conversion).
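To avoid reading normal variation as success, compare the metric against a minimum uplift rather than eyeballing the curve. A minimal sketch; the 10% threshold is an illustrative assumption to tune per metric and traffic volume:

```python
# Sketch: classify a before/after change on a metric (e.g. CTR)
# against a minimum relative lift. Threshold illustrative.

def before_after(before, after, min_relative_lift=0.10):
    """Return (relative_change, verdict) for a single metric."""
    if before == 0:
        return (0.0, "no_baseline")
    change = (after - before) / before
    if change >= min_relative_lift:
        verdict = "improved"
    elif change <= -min_relative_lift:
        verdict = "regressed"
    else:
        verdict = "within_noise"
    return (round(change, 3), verdict)
```

A "within_noise" verdict after a snippet rewrite usually means: wait for the full observation window, or revisit the hypothesis, before shipping more changes.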

 

Tools for page SEO analysis: what they do well, and where they fall short at scale

 

 

Crawling and technical diagnostics: where expert tools can quickly become time-intensive

 

Screaming Frog remains a standard for detailed crawls, but it requires genuine expertise and does not provide an end-to-end chain by itself (prioritisation, execution, tracking). Some checkers can detect classic on-page issues (missing titles or meta descriptions, broken links, slow pages, duplication) and sort them by importance (Ahrefs, SEO Checker). A frequent limitation at scale: you get lists, not multi-team orchestration.

 

Semantic analysis and content optimisation: avoiding generic output and "average" recommendations

 

Surfer SEO can help optimise copy through semantic recommendations, but without brand-personalised AI, the risk is producing "average" content that is interchangeable. Moz offers a "score + checklist" approach (On-Page Grader) in three steps (URL + keyword → score → fixes), useful for structure, but limited for scaling planning and production. The key—especially for GEO—is turning recommendations into structured, reliable and genuinely differentiated answers.

 

Market and competitor data: useful insight, but often read-only and not very collaborative

 

Semrush offers an On Page SEO Checker that provides ideas based on competitor analysis, with two levels: a multi-page view and a detailed per-page report (Semrush, On Page SEO Checker). It is powerful for benchmarking the SERP (top 10) and getting categories of ideas (technical, semantic, UX, SERP features), but the experience can become complex and geared towards reading rather than workflows. Ahrefs excels in links and health reporting (Health Score) and states it checks 140+ SEO issues via AWT (Ahrefs, SEO Checker), with visible usage limits (5,000 crawl credits per project per month, and 1,000 backlinks and 1,000 keywords shown "at once").

For a quantified view of dynamics (CTR, page two, mobile, zero-click), use the consolidated benchmarks in our SEO statistics. The goal is not to stack platforms, but to connect diagnosis, prioritisation and execution.

 

Scaling page-by-page analysis without losing precision

 

 

Standardise an analysis grid by page type: article, offer page, landing page, category

 

An offer page should not be assessed like an article: it must reassure and convert whilst remaining indexable and readable. Build a grid by type, with both SEO and GEO criteria, then apply it consistently.

  • Article: definition, method, evidence, FAQ, internal links to business pages.
  • Landing page: promise, objections, proof, performance, CTA.
  • Category: selection criteria, attributes, unique content, controlled pagination/filtering.
  • Offer page: differentiation, use cases, results, trust elements.

 

Group by templates: fix once, improve dozens of pages

 

If an issue comes from the template, case-by-case optimisation will slow you down. Group by template (product, category, article) and list what repeats: generated titles, headings, image weight, scripts, structured data. This is often the best impact-to-effort ratio on large sites.

 

Set up monitoring: alerts for CTR, indexing, performance and regressions

 

Monitoring avoids panic audits and turns analysis into routine. Schedule regular checks: some tools support planned daily, weekly or monthly crawls (Ahrefs, SEO Checker), useful for catching regressions quickly.

| Signal | Alert | Immediate action |
| --- | --- | --- |
| CTR | Sustained drop on a high-impression page | Rework title/meta + intent alignment. |
| Indexing | Drop in valid URLs / anomalies | Check canonicals, robots, redirects. |
| Performance | Mobile degradation | Identify the last addition (script, media, component). |
| Rankings | Drop on key queries | Compare SERPs, formats and competitor updates. |

 

A word on Incremys: connect SEO and GEO, then move from analysis to execution

 

 

Centralise URL-level signals, prioritise with a business lens, and produce actionable recommendations

 

Many teams juggle crawlers, semantic tools and spreadsheets: the diagnosis exists, but execution is fragmented. Incremys' approach is to connect SEO and GEO signals at URL level, then turn analysis into tracked actions (prioritisation, production, validation, reporting) with a more collaborative workflow than a purely read-only database. If you compare tools, keep common limitations in mind: Semrush can become complex and mainly geared towards consultation, Ahrefs is outstanding for links but does not cover content creation, Screaming Frog is expert and technical, Moz is more of a "checker + checklist", and Surfer SEO optimises without brand-personalised AI.

 

FAQ on SEO page analysis

 

 

How do you analyse a page for SEO, step by step?

 

  1. Choose a high-potential URL (business value + positions 8–20 + CTR upside).
  2. Set the primary intent by reviewing the SERP (formats and expectations).
  3. Establish a baseline (impressions, clicks, CTR, position, conversions).
  4. Check the snippet (title/meta), heading structure, content completeness and internal linking.
  5. Measure performance (mobile first) and fix major blockers (images, scripts).
  6. Prioritise via impact × effort × risk, execute, then measure before/after.

 

What elements should you check on a page to improve its rankings?

 

  • Title and meta description (promise, differentiation, alignment with content).
  • Heading structure (clear hierarchy, informative titles, visible definitions).
  • Intent alignment (does the page match what the SERP mainly rewards?).
  • Content (evidence, precision, freshness, missing sections vs top 10).
  • Performance (slow mobile, third-party scripts, heavy images).
  • Internal linking (relevant inbound internal links, reasonable depth).

 

How do you improve a page's SEO score without over-optimising?

 

Start with what has measurable impact: CTR (snippet), intent match, structure and evidence. Avoid artificial repetition and word-ratio targets; they hurt readability and do not increase relevance. Aim for a page that is clearer, more useful and easier to verify (sources, examples, lists).

 

How do you optimise a title tag and meta description to increase CTR?

 

  • Write a benefit- and intent-led title (not a string of terms).
  • Differentiate from competing results (method, angle, update, proof).
  • Test a question-style title when intent is informational (Onesty, 2026: +14.1% average CTR).
  • Write a meta description that accurately reflects the page (avoid broken promises).

 

What keyword density should you aim for in 2026?

 

There is no universal "right" density; chasing a percentage often causes padding. In 2026, a more robust approach is to cover intent using natural variants, entities and co-occurrences, then structure answers to be quotable for GEO. If you need to choose, choose readability: copy that "sounds optimised" usually performs worse.

 

How do you analyse heading structure and spot an inconsistent hierarchy?

 

  • Check there is only one H1 and it expresses the main intent.
  • Ensure each H2 answers a distinct question (no duplicates).
  • Avoid level jumps (H2 → H4) without a reason.
  • Rewrite vague headings into informative ones (definitions, steps, criteria).

 

How do you assess page speed and decide what to optimise first?

 

  1. Compare field data and lab tests to avoid false diagnoses.
  2. Prioritise mobile (60% of global web traffic, Webnyxt, 2026).
  3. Start with images and third-party scripts, common causes of slowness.
  4. Track the business impact: too slow = abandonment (Google, 2025) and higher bounce (HubSpot, 2026).

 

How do you diagnose a page with lots of impressions but few clicks?

 

  • Review your snippet against the top five (promise, angle, proof, freshness).
  • Check whether the SERP has changed (features, AI overviews, dominant format).
  • Align the page introduction with the title's promise.
  • Test a rewrite, then measure CTR over a stable period.

 

How do you know whether content matches search intent (SEO) and is reusable for AI (GEO)?

 

For SEO, compare your outline and answers with dominant SERP formats: does Google expect a definition, a guide, a comparison, or a transactional page? For GEO, check "quotability": single-sentence definitions, step lists, criteria tables, sourced numbers, and sections that stand on their own. If a paragraph cannot be understood out of context, it is unlikely to be reused well.

 

How often should you analyse a site's key pages?

 

For business-critical pages, aim for a regular cadence (monthly or quarterly) plus ongoing monitoring of signals (CTR, indexing, performance). Some tools offer scheduled daily, weekly or monthly crawls (Ahrefs, SEO Checker): useful for spotting regressions early, especially after releases. Adjust frequency to site size and publishing pace.

 

Which tools should you use for reliable on-page analysis without stacking platforms?

 

Combine a "Google data" foundation (Search Console + analytics) with a crawler and a semantic analysis tool, then reduce overlap. To map your options, start with our guide to SEO tools and choose based on your ability to execute (not just how detailed the reports are). A simple "checker + CSV export" stack rarely holds up once you need collaboration, production and large-scale rollouts.

 

How do you avoid cannibalisation when multiple pages target a similar topic?

 

  • Define a single intent per URL (and reflect it in the title and H1).
  • Consolidate when two pages compete for the same query (merge + redirect if needed).
  • Create an internal linking plan that clearly indicates the "reference" page.
  • Monitor ranking swings between URLs on the same queries.
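The monitoring step can be automated with a simple pass over query-level data: any query where several URLs earn meaningful impressions is a cannibalisation candidate. Field names and the impression floor below are illustrative assumptions in the style of a Search Console export.

```python
# Sketch: flag queries served by two or more URLs, a classic
# cannibalisation signal. Row fields and threshold illustrative.

def cannibalised_queries(rows, min_impressions=10):
    """rows: dicts with "query", "url", "impressions".
    Returns {query: sorted urls} for multi-URL queries."""
    by_query = {}
    for r in rows:
        if r["impressions"] >= min_impressions:
            by_query.setdefault(r["query"], set()).add(r["url"])
    return {q: sorted(urls) for q, urls in by_query.items() if len(urls) > 1}
```

A flagged query is not automatically a problem (brand queries legitimately hit several pages); it is a prompt to check intent and decide between improving, consolidating or re-linking.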

 

How do you prove the business impact of page-level optimisation (leads, conversions, pipeline)?

 

Tie each optimisation to a measurable KPI: CTR (snippet), ranking (visibility), conversions (business). The strongest proof is a before/after comparison with a baseline, segmented (mobile/desktop, country, intent) and correlated with conversion events. To tighten measurement, you can also consolidate findings with web analytics to separate visibility gains from on-page performance gains.

For more operational guides at the same level of precision, explore the Incremys blog.
