Published 2025-02-22 · Seotific Team
Most SEO audits miss the point. They produce a list of 200 issues ranked by technical severity and leave you staring at a spreadsheet wondering where to start. A checklist that doesn't tell you which issues actually affect rankings is not an audit — it's a data dump.
This guide walks through all 60+ checks a proper SEO audit should cover in 2025, organised by ranking impact. Each check explains why it matters, not just what it is.
SEO tools are built to find problems, not to judge which ones matter. A missing alt tag on a decorative image and a misconfigured canonical tag both show up as "errors" — but one is cosmetic and one is actively splitting your page's authority across two URLs. Treating them equally is how agencies spend three weeks fixing low-impact issues while actual ranking problems sit untouched.
The right framing: which issues are costing ranking positions right now? The answer almost always falls into one of three root causes — Google can't crawl the page properly, Google can't understand what the page is about, or Google doesn't trust the page enough to rank it. Every audit check maps to one of these three.
Before anything else, Google has to be able to find, crawl, and index your page. All on-page optimisation is irrelevant if this layer fails.
A surprisingly common issue: a developer sets Disallow: / during a site rebuild and forgets to revert it after launch. Check your robots.txt manually. Verify Googlebot, Bingbot, and major AI crawlers (GPTBot, ClaudeBot, PerplexityBot) are allowed. If you're trying to build AI search visibility, blocking these crawlers is self-defeating.
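This check can be scripted with Python's standard `urllib.robotparser`. The crawler names are real user-agent tokens; the robots.txt content below is illustrative, a sketch of the verification rather than a full audit:

```python
from urllib.robotparser import RobotFileParser

# Illustrative robots.txt content -- substitute your live file's contents.
robots_txt = """
User-agent: *
Allow: /

User-agent: GPTBot
Allow: /
""".strip().splitlines()

parser = RobotFileParser()
parser.parse(robots_txt)

# Verify that each crawler you care about can fetch the homepage.
for bot in ["Googlebot", "Bingbot", "GPTBot", "ClaudeBot", "PerplexityBot"]:
    allowed = parser.can_fetch(bot, "/")
    print(f"{bot}: {'allowed' if allowed else 'BLOCKED'}")
```

The same parser pointed at a leftover `Disallow: /` will report every crawler as blocked, which is exactly the post-rebuild failure described above.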
Every page needs a self-referencing canonical tag. When it's missing — or points to a different URL (trailing slash vs none, www vs non-www, HTTP vs HTTPS) — you're telling Google there are multiple versions of this page. Google picks one. It might not be the one you want ranked.
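A minimal sketch of the self-referencing check using Python's built-in `html.parser`. The comparison is deliberately exact, because a trailing-slash, www, or protocol mismatch is precisely what you want to catch:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collects the href of the first <link rel="canonical"> tag."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "canonical":
            self.canonical = attrs.get("href")

def canonical_is_self_referencing(page_url: str, html: str) -> bool:
    finder = CanonicalFinder()
    finder.feed(html)
    return finder.canonical == page_url

print(canonical_is_self_referencing(
    "https://example.com/page",
    '<link rel="canonical" href="https://example.com/page">'))   # True
print(canonical_is_self_referencing(
    "https://example.com/page",
    '<link rel="canonical" href="https://example.com/page/">'))  # False: trailing-slash mismatch
```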
Check every page for a <meta name="robots" content="noindex"> tag. Staging sites deployed to production with noindex still in place, a noindex meant for a thank-you page applied sitewide, CMS category pages left noindexed after a test: these are more common than they should be.
A single 301 redirect is fine. A chain of 3+ redirects loses PageRank at every hop and slows crawling. After any site migration, audit all redirects. Every chain should be collapsed to a single redirect from the original URL to the final destination.
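The collapsing step can be sketched as a pure function over old-to-new redirect pairs (the URLs here are illustrative; in practice the mapping comes from your server config or crawl data):

```python
def collapse_redirects(redirects: dict[str, str]) -> dict[str, str]:
    """Given old -> new redirect pairs, map every source straight to its
    final destination so each URL needs only a single 301."""
    collapsed = {}
    for src in redirects:
        seen, target = {src}, redirects[src]
        while target in redirects:
            if target in seen:          # redirect loop -- flag it, don't hang
                raise ValueError(f"redirect loop at {target}")
            seen.add(target)
            target = redirects[target]
        collapsed[src] = target
    return collapsed

chain = {"/old": "/interim", "/interim": "/new-ish", "/new-ish": "/final"}
print(collapse_redirects(chain))
# {'/old': '/final', '/interim': '/final', '/new-ish': '/final'}
```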
Your sitemap should contain every canonical URL you want indexed, and nothing else. No noindex pages, no redirect URLs, no broken links. Submit it to Google Search Console and check indexed vs submitted. A large gap indicates crawling problems worth investigating.
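A sketch of the sitemap hygiene check using Python's standard `xml.etree` parser. The sitemap content and the excluded set are illustrative; in a real audit the excluded set would come from your crawl data (noindexed and redirecting URLs):

```python
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

sitemap = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/guide</loc></url>
  <url><loc>https://example.com/old-page</loc></url>
</urlset>"""

urls = [loc.text for loc in ET.fromstring(sitemap).findall("sm:url/sm:loc", NS)]

# URLs known (from crawl data) to be noindexed or redirected.
excluded = {"https://example.com/old-page"}

offenders = [u for u in urls if u in excluded]
print(offenders)  # ['https://example.com/old-page']
```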
Once Google can reach the page, it needs to understand what it's about and whether it matches the searcher's intent.
Exactly one H1 per page. It should contain your primary keyword naturally. Multiple H1s confuse Google's understanding of the primary topic. A missing H1 means Google is reading your content without a clear signal of what this page is fundamentally about.
The meta title is your primary CTR driver in the SERP. Put the primary keyword in the first 3–4 words. Keep it under roughly 60 characters; Google truncates by pixel width, and longer titles get cut off. Make it unique across every page: duplicate titles signal cannibalisation to Google.
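These title rules are easy to automate. A sketch, with the caveat that the 60-character limit only approximates Google's pixel-width truncation:

```python
def audit_title(title: str, keyword: str, max_len: int = 60) -> list[str]:
    """Flags the title-tag issues described above. The character limit is
    an approximation; Google actually truncates by pixel width."""
    issues = []
    if len(title) > max_len:
        issues.append(f"too long ({len(title)} chars)")
    first_words = " ".join(title.lower().split()[:4])
    if keyword.lower() not in first_words:
        issues.append("keyword not in first 3-4 words")
    return issues

print(audit_title("SEO Audit Checklist: 60+ Checks for 2025", "seo audit"))
# []
print(audit_title("The Complete and Definitive Guide to Auditing SEO", "seo audit"))
# ['keyword not in first 3-4 words']
```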
Not a ranking factor — absolutely a CTR factor. A well-written meta description that matches the searcher's intent can double your click-through rate at the same ranking position, effectively doubling organic traffic without moving up a single position. Write it as a response to the search intent, not a description of the page.
Skipping heading levels (H1 to H3 with no H2) creates a broken document structure that both screen readers and Google's content parser depend on. Use H2 for major sections, H3 for subsections. Every long-form page should have at least 3 H2s — it signals depth and organisation.
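A regex-based sketch of the hierarchy check, good enough for an audit pass over rendered HTML (a real HTML parser is more robust against edge cases):

```python
import re

def find_heading_issues(html: str) -> list[str]:
    """Flags multiple/missing H1s and skipped heading levels (e.g. H1 -> H3)."""
    levels = [int(m) for m in re.findall(r"<h([1-6])", html, re.IGNORECASE)]
    issues = []
    if levels.count(1) != 1:
        issues.append(f"expected exactly one H1, found {levels.count(1)}")
    for prev, cur in zip(levels, levels[1:]):
        if cur > prev + 1:
            issues.append(f"skipped level: H{prev} -> H{cur}")
    return issues

html = "<h1>Guide</h1><h3>Subsection</h3><h2>Section</h2>"
print(find_heading_issues(html))  # ['skipped level: H1 -> H3']
```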
Every important page needs internal links pointing to it from related pages. Orphaned pages receive no PageRank and are crawled less efficiently. Each page should link out to at least 3 related pages using descriptive anchor text — not "click here" or "read more."
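Orphan detection falls out of crawl data directly. A sketch, assuming you already have the full page set and (source, destination) internal-link pairs from a crawl:

```python
def find_orphans(pages: set[str], links: list[tuple[str, str]]) -> set[str]:
    """Pages with no internal links pointing at them. The homepage is
    excluded because it is the crawl entry point."""
    linked_to = {dst for _, dst in links}
    return pages - linked_to - {"/"}

pages = {"/", "/pricing", "/blog/guide", "/blog/forgotten-post"}
links = [("/", "/pricing"), ("/pricing", "/blog/guide"), ("/blog/guide", "/pricing")]
print(find_orphans(pages, links))  # {'/blog/forgotten-post'}
```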
Google's Helpful Content system evaluates whether pages demonstrate genuine expertise and actually satisfy searcher intent. This is where the audit goes beyond checklists into judgment.
The most critical content check. Search "best project management tools" — Google returns listicles. Search "what is kanban" — Google returns explanatory articles. If your content format doesn't match what Google is already rewarding for your keyword, you won't rank regardless of technical optimisation. Check the top 3 results before writing.
Write the first 80–120 words as a direct, keyword-present answer to the search query. This helps with featured snippets, AI Overviews, and user experience. People who immediately find the answer they were looking for are more likely to stay — improving engagement signals.
A 4–6 question FAQ at the bottom of every guide targets People Also Ask results and allows FAQPage schema for rich results. Use the actual questions from the PAA box for your keyword.
Experience, Expertise, Authoritativeness, Trustworthiness — Google's quality rater guidelines use these to assess whether a page should be trusted.
Named authors with verifiable credentials outperform anonymous "Team" attributions. Add a byline with name, relevant experience, and links to the author's professional profile. Critical for health, legal, and financial content — but beneficial across all topics.
About page, Contact page, and Privacy Policy are the minimum. Sites missing these three score poorly in quality rater evaluations. The About page should include real information about who runs the site and why they're qualified.
ChatGPT, Perplexity, and Google's AI Overviews are increasingly used for research queries. Getting cited by these systems requires additional signals.
JSON-LD schema helps AI systems understand your content's structure and type. Minimum: Article schema on all blog posts, BreadcrumbList on all pages, FAQPage on all FAQ sections. Product, Review, HowTo, and Event schema where relevant.
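Generating the JSON-LD from structured data rather than hand-writing it keeps the markup valid. A minimal Article example; the headline, author, and date are placeholders, and property names follow schema.org:

```python
import json

# A minimal Article JSON-LD block; property names follow schema.org.
article_schema = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "The Complete SEO Audit Checklist",
    "author": {"@type": "Person", "name": "Jane Example"},
    "datePublished": "2025-02-22",
}

# Embed the serialised JSON in the page <head>:
snippet = f'<script type="application/ld+json">{json.dumps(article_schema)}</script>'
print(snippet)
```

The same pattern extends to BreadcrumbList and FAQPage: build a dict per schema.org's vocabulary, serialise it, and embed it in one script tag per entity.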
An emerging proposal for AI crawl control, roughly equivalent to robots.txt but aimed at large language models. Create a /llms.txt file describing your site's content and how AI systems may use it. Seotific's LLMs.txt generator builds this automatically.
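The format is still settling, but the llms.txt proposal uses plain markdown: an H1 with the site name, a blockquote summary, then sections linking to your most useful pages. A hedged sketch, with illustrative names and URLs:

```markdown
# Seotific

> SEO auditing tool that classifies 60+ checks by ranking impact.

## Guides

- [SEO Audit Checklist](https://example.com/blog/seo-audit-checklist): every check, organised by ranking impact
```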
After a 60-check audit you'll typically find 15–30 issues. The rule: fix crawlability problems first (canonical, noindex, redirect chains), then on-page signals for highest-priority pages, then content quality, then E-E-A-T, then AI signals. Technical problems block everything else — there's no point optimising a page Google can't crawl.
Seotific's Page Audit runs all 60+ checks and classifies every finding by ranking impact: Critical, High, Medium, or Cosmetic. You see immediately which 3–5 issues are worth fixing this week rather than working through a 200-item list with no context.
For active sites publishing regularly: monthly. For stable sites: quarterly. Every time you make significant structural changes (new CMS, site migration, URL restructure), run a full audit before and after. The before audit gives you a baseline. The after audit confirms nothing broke and shows what improved.
The most valuable insight from repeat audits isn't individual findings — it's the trend. A page that improves from 55/100 to 72/100 over three months while competitors stay static is moving toward the top 3. That's the signal that your strategy is working.
Seotific puts all of this analysis into action — from 60+ check page audits to AI-powered strategy recommendations, all in one tool.
Get Free Beta Access →