# SEO Audit Overview
What the SEO audit covers, how scoring works, and how to interpret the results summary.
## What the audit covers
The Surfaceable SEO audit runs 12 independent checks across your site. Each check is a standalone module that fetches, analyses, and scores one dimension of your site's SEO health.
The 12 checks are:
| Check | What it analyses |
|---|---|
| Page SEO | Title, meta description, headings, canonical, hreflang |
| Indexability | robots.txt, noindex tags, redirect chains, sitemap |
| Schema / Structured Data | JSON-LD presence, type detection, Schema.org validation |
| Content | Word count, readability score, keyword density, heading structure |
| Links | Internal/external link map, broken link detection |
| HTTP Headers | Cache-Control, compression, HSTS, response time |
| Social Meta | Open Graph tags, Twitter Card tags |
| Core Web Vitals | LCP, INP, CLS, TTFB via PageSpeed Insights API |
| Security | Mozilla Observatory grade, security header audit |
| DNS | A/AAAA records, SPF, DMARC, SSL certificate validity |
| HTML Validation | W3C HTML5 compliance |
| Accessibility | WCAG 2.1 AA: images, forms, viewport, lang attribute |
Each check is documented in detail in The 12 SEO Checks Explained.
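To illustrate the "standalone module that fetches, analyses, and scores" shape described above, here is a minimal sketch of a Page SEO style check using only the Python standard library. This is illustrative only, not the audit's actual implementation; the class name and issue tuples are invented for the example.

```python
from html.parser import HTMLParser

class PageSEOParser(HTMLParser):
    """Hypothetical sketch of a 'Page SEO' check: extract the <title>
    and meta description from raw HTML, then flag missing elements."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.meta_description = None

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True
        elif tag == "meta":
            a = dict(attrs)
            if a.get("name", "").lower() == "description":
                self.meta_description = a.get("content")

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

html = ('<html><head><title>Acme</title>'
        '<meta name="description" content="Widgets."></head></html>')
parser = PageSEOParser()
parser.feed(html)

# Severity labels mirror the audit's Critical/Warning levels (see below).
issues = []
if not parser.title.strip():
    issues.append(("critical", "missing <title>"))
if parser.meta_description is None:
    issues.append(("warning", "missing meta description"))
print(parser.title, parser.meta_description, issues)
```

A real check would fetch the page over HTTP and cover the other dimensions in the table (headings, canonical, hreflang); the point here is only the fetch-analyse-flag structure.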
## How scoring works
Every check returns a score from 0 to 100. The score is calculated by starting at 100 and deducting points based on issues found:
- Critical issue — deducts 20 points each
- Warning — deducts 10 points each
- Info — no deduction (informational only)
- Pass — confirms something is correct (no deduction)
The overall site score is the weighted average of all check scores across all crawled pages.
### Severity levels
| Severity | Meaning | Action |
|---|---|---|
| Critical | A definite problem that is likely harming rankings | Fix immediately |
| Warning | A potential issue or missed opportunity | Fix when possible |
| Info | Neutral data point — not good or bad | Awareness only |
| Pass | Check passed with no issues | Nothing to do |
## Crawl settings
By default, an audit crawls up to 25 pages to a depth of 3 levels. On paid plans, this limit can be raised to 500 pages.
The crawler:
- Follows canonical redirects
- Respects robots.txt (and reports disallowed paths as an issue)
- Uses a standard User-Agent string so it behaves like a real crawler
- Captures both desktop and mobile signals where applicable
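The page limit, depth limit, and robots.txt behaviour above can be sketched as a breadth-first crawl. This is an assumption-heavy illustration, not the actual crawler: the User-Agent string is a placeholder, and the link map is passed in memory so the example runs without network access.

```python
from collections import deque
from urllib.robotparser import RobotFileParser

MAX_PAGES = 25   # documented default page limit
MAX_DEPTH = 3    # documented default depth limit
USER_AGENT = "SurfaceableBot"  # placeholder; the real UA string is not documented

def crawl(start: str, links: dict[str, list[str]], robots_txt: str,
          max_pages: int = MAX_PAGES, max_depth: int = MAX_DEPTH):
    """Breadth-first crawl over an in-memory link map, honouring robots.txt.
    Disallowed paths are collected and reported, not silently skipped."""
    rp = RobotFileParser()
    rp.parse(robots_txt.splitlines())
    visited, disallowed = [], []
    queue = deque([(start, 0)])
    seen = {start}
    while queue and len(visited) < max_pages:
        url, depth = queue.popleft()
        if not rp.can_fetch(USER_AGENT, url):
            disallowed.append(url)  # surfaced as an audit issue
            continue
        visited.append(url)
        if depth < max_depth:
            for nxt in links.get(url, []):
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append((nxt, depth + 1))
    return visited, disallowed

links = {"/": ["/about", "/admin"], "/about": ["/team"]}
robots = "User-agent: *\nDisallow: /admin\n"
pages, blocked = crawl("/", links, robots)
print(pages, blocked)  # ['/', '/about', '/team'] ['/admin']
```

A production crawler would also follow canonical redirects and capture desktop and mobile signals, as noted above; this sketch covers only the traversal and robots.txt reporting.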