Most sites don’t have a content problem. They have a foundation problem.
You can publish the most thorough, well-researched article in your niche and still watch it sit on page four — not because the content is weak, but because Google’s crawlers can’t properly access, interpret, or trust your site. That’s a technical SEO failure, and it’s more common than most business owners realize.
This technical SEO checklist was built for 2026’s search landscape — one where Google’s crawl budget is tighter, Core Web Vitals are a confirmed ranking signal, and AI-driven search results are pulling from structured, well-organized data sources. Whether you’re doing your first audit or tightening up a site that’s already performing well, these 27 items give you a clear, action-oriented roadmap.
We’ve organized everything into six categories: crawlability, indexability, site speed and Core Web Vitals, security, structured data, and mobile optimization. For each item, you’ll know what to check, what tool to use, and how to fix what’s broken.
Technical SEO Checklist: Crawlability (Items 1–7)
Before Google can rank your pages, it has to find them. Crawlability is the most foundational layer of technical SEO — and the most commonly broken one.
1. Robots.txt Is Configured Correctly
Your robots.txt file tells search engine bots which parts of your site they can and can’t crawl. A misconfigured file can silently block your most important pages from ever being indexed.
How to check: Navigate to yourdomain.com/robots.txt in your browser. Look for Disallow: / rules that might be blocking key sections. For a more detailed view, use the robots.txt report in Google Search Console (under Settings), which replaced the retired robots.txt tester.
How to fix: Make sure you’re only disallowing directories that genuinely shouldn’t be indexed — admin panels, staging subfolders, duplicate parameter URLs. Never block your CSS and JavaScript files; Google needs to render your pages to evaluate them.
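If you’d rather script this check than eyeball the file, the short sketch below uses Python’s standard-library robots.txt parser to test a handful of important URLs against your live file. The domain and paths are placeholders, so swap in your own.

```python
from urllib.robotparser import RobotFileParser

SITE = "https://www.example.com"   # placeholder domain
PATHS = ["/", "/blog/", "/products/", "/wp-includes/js/script.js"]   # placeholder paths

parser = RobotFileParser()
parser.set_url(f"{SITE}/robots.txt")
parser.read()   # fetches and parses the live robots.txt

for path in PATHS:
    allowed = parser.can_fetch("Googlebot", SITE + path)
    print(("allowed" if allowed else "BLOCKED"), path)
```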
2. XML Sitemap Is Submitted and Error-Free
An XML sitemap is your direct communication to Google about which pages exist and when they were last updated. Without one, crawlers rely entirely on link discovery — slower, less complete, and more prone to missing important pages.
How to check: Go to Google Search Console → Sitemaps. If your sitemap isn’t listed, submit it. If it is, check for errors flagged in the report.
How to fix: Generate a clean sitemap using a plugin (Yoast, Rank Math) or tool like Screaming Frog. Exclude noindex pages, pagination URLs, and any pages returning non-200 status codes. Keep the sitemap updated automatically.
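Between full crawls, a short script can pull every URL from your sitemap and flag anything that doesn’t return a 200. This is a rough sketch: the sitemap URL is a placeholder, it uses the third-party requests library, and it doesn’t handle sitemap index files.

```python
import requests
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.com/sitemap.xml"   # placeholder
NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

root = ET.fromstring(requests.get(SITEMAP_URL, timeout=10).content)
for loc in root.iter(f"{NS}loc"):
    url = loc.text.strip()
    status = requests.get(url, timeout=10).status_code
    if status != 200:
        print(status, url)   # anything listed here should be fixed or removed from the sitemap
```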
3. No Orphan Pages
Orphan pages have no internal links pointing to them. Even if they’re in your sitemap, they’re invisible to crawlers following your site’s natural link graph — and invisible pages rarely rank.
How to check: Export your sitemap URLs and your crawl data from Screaming Frog, then cross-reference. Any URL in the sitemap that has zero inlinks is an orphan.
How to fix: Add internal links from relevant, high-traffic pages to any orphaned content. If a page has no logical place in your internal linking structure, evaluate whether it should exist at all.
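The cross-reference itself is a simple set difference. The sketch below assumes a plain-text list of sitemap URLs and a Screaming Frog inlinks export; the file names and the "Destination" column header are assumptions you’d match to your own exports.

```python
import csv

with open("sitemap_urls.txt") as f:              # one URL per line (placeholder file name)
    sitemap_urls = {line.strip() for line in f if line.strip()}

with open("all_inlinks.csv", newline="") as f:   # crawler inlinks export (placeholder file name)
    linked_urls = {row["Destination"] for row in csv.DictReader(f)}

orphans = sitemap_urls - linked_urls             # in the sitemap, but never linked internally
for url in sorted(orphans):
    print(url)
print(f"{len(orphans)} orphan candidates found")
```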
4. Crawl Depth Is Reasonable
Pages buried five or six clicks deep from the homepage receive significantly less crawl attention. In large sites, Google often won’t reach them at all within a given crawl cycle.
How to check: In Screaming Frog, the “Crawl Depth” column shows exactly how many clicks each URL is from the homepage.
How to fix: Flatten your site architecture. Important pages should be reachable within three clicks from the homepage. Use hub pages and breadcrumbs to create logical, shallow paths.
5. Redirect Chains Are Cleaned Up
A redirect chain occurs when URL A redirects to URL B, which redirects to URL C. Each hop wastes crawl budget and dilutes link equity. Chains longer than two hops are a signal of poor site hygiene.
How to check: Screaming Frog’s redirect report shows full chains. Ahrefs’ Site Audit also flags these automatically.
How to fix: Update redirect chains so that every redirect goes directly to the final destination URL in a single 301.
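If you want to trace a chain outside of a crawler, the sketch below (using the third-party requests library) follows one URL hop by hop and prints each redirect. The URL is a placeholder.

```python
import requests
from urllib.parse import urljoin

def trace(url, max_hops=10):
    """Follow redirects one hop at a time and return the chain."""
    hops = []
    for _ in range(max_hops):
        resp = requests.get(url, allow_redirects=False, timeout=10)
        if resp.status_code not in (301, 302, 307, 308):
            break
        url = urljoin(url, resp.headers["Location"])   # handles relative Location headers
        hops.append((resp.status_code, url))
    return hops

chain = trace("http://example.com/old-page")   # placeholder URL
for code, target in chain:
    print(code, "->", target)
if len(chain) > 1:
    print(f"{len(chain)} hops - point the original URL straight at the final destination")
```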
6. No Broken Internal Links (4xx Errors)
Internal links pointing to 404 or 410 pages are wasted crawl budget — and a bad user experience. Google notices both.
How to check: Run a Screaming Frog crawl and filter for response codes 4xx. Check the “Inlinks” tab to see which pages contain the broken links.
How to fix: Either update the internal link to point to the correct URL, or redirect the broken destination to the most relevant live page.
7. Crawl Budget Is Not Being Wasted
Crawl budget matters most for large sites (10,000+ pages), but smaller sites can waste it too — through URL parameters, faceted navigation, session IDs, and duplicate content generating thousands of unique-looking URLs.
How to check: Google Search Console’s “Crawl Stats” report shows how many pages Google is crawling per day and what it’s spending time on.
How to fix: Block parameter-generated duplicates with robots.txt rules or consolidate them with canonical tags. Google has retired the URL Parameters tool, so parameter handling now has to happen on your side: keep crawl-trap parameters out of internal links and your sitemap, and disallow faceted or session URLs that create no unique content.
Technical SEO Checklist: Indexability (Items 8–12)
Getting crawled and getting indexed are two different things. These items ensure Google doesn’t just find your pages — it actually adds them to the index.
8. No Unintentional Noindex Tags
A single <meta name="robots" content="noindex"> tag on a page you want ranked will keep it out of search results permanently. This happens more often than you’d expect — especially after migrations, staging environment configs, or plugin changes.
How to check: In Screaming Frog, filter by “Directives” → noindex. Cross-reference against pages you expect to be indexed.
How to fix: Remove the noindex tag from any page you want Google to index. If you’re using a CMS like WordPress, check the SEO plugin settings and the “Search Engine Visibility” toggle in Settings → Reading.
9. Canonical Tags Are Implemented Correctly
Canonical tags tell Google which version of a page is the “original” when multiple URLs return similar or identical content. Without them, you risk splitting ranking signals across duplicates.
How to check: Inspect the <head> of any page and look for <link rel="canonical" href="...">. Screaming Frog exports canonical data in bulk.
How to fix: Every indexable page should have a self-referencing canonical. For true duplicates (HTTP vs HTTPS, www vs non-www, trailing slash variations), canonicalize to one consistent version. Never canonicalize a page to a noindexed URL.
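A quick way to spot-check canonicals outside a full crawl is to fetch a page and compare its declared canonical to the URL you requested. The sketch below is a simplification: the regular expression assumes the rel attribute appears before href, it uses the third-party requests library, and the URLs are placeholders.

```python
import re
import requests

CANONICAL = re.compile(
    r'<link[^>]*rel=["\']canonical["\'][^>]*href=["\']([^"\']+)["\']',
    re.IGNORECASE,
)

def canonical_of(url):
    """Return the canonical URL declared in the page's HTML, or None."""
    match = CANONICAL.search(requests.get(url, timeout=10).text)
    return match.group(1) if match else None

for url in ["https://www.example.com/", "https://www.example.com/blog/some-post/"]:
    canon = canonical_of(url)
    if canon is None:
        print("no canonical:", url)
    elif canon == url:
        print("self-referencing:", url)
    else:
        print(f"{url} canonicalizes to {canon}")
```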
10. Hreflang Is Set Up Properly for Multilingual Sites
If your site serves multiple languages or regions, hreflang tags tell Google which version to show to which audience. Misconfigured hreflang is one of the most common international SEO mistakes.
How to check: Screaming Frog has a dedicated hreflang report. Ahrefs Site Audit flags hreflang errors including missing reciprocal links and incorrect language codes.
How to fix: Every hreflang implementation must be reciprocal — if page A points to page B as the German version, page B must point back to page A as the English version. Use ISO 639-1 language codes (e.g., en, de, fr) and ISO 3166-1 country codes where needed (e.g., en-US, en-GB).
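Reciprocity is the part that breaks most often, and it’s easy to spot-check. The sketch below is a simplified version that reads hreflang annotations from the HTML with a regular expression (it assumes the rel, hreflang, href attribute order), uses the third-party requests library, and flags alternates that don’t link back. The URL is a placeholder.

```python
import re
import requests

HREFLANG = re.compile(
    r'<link[^>]*rel=["\']alternate["\'][^>]*hreflang=["\']([^"\']+)["\'][^>]*href=["\']([^"\']+)["\']',
    re.IGNORECASE,
)

def alternates(url):
    """Map of alternate URL -> hreflang code declared on the page."""
    html = requests.get(url, timeout=10).text
    return {href: lang for lang, href in HREFLANG.findall(html)}

page = "https://www.example.com/en/pricing/"   # placeholder URL
for alt_url, lang in alternates(page).items():
    if alt_url != page and page not in alternates(alt_url):
        print(f"missing return link: {alt_url} ({lang}) does not point back to {page}")
```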
11. Duplicate Content Is Resolved
Duplicate content doesn’t just dilute your rankings — it confuses Google about which version to show, often resulting in neither version ranking well.
How to check: Siteliner scans for internal duplicate content. Copyscape handles external duplication. Screaming Frog flags exact and near-duplicate pages in its Content tab, along with duplicate page titles and meta descriptions.
How to fix: Consolidate duplicate pages with 301 redirects, canonical tags, or by rewriting content to differentiate them. For e-commerce, this is especially critical for product variants and filtered category pages.
12. Index Coverage Report Is Reviewed Regularly
Google Search Console’s page indexing report (formerly the Index Coverage report) shows exactly which pages are indexed, which are excluded, and why. It’s one of the most useful and most ignored tools in the technical SEO toolkit.
How to check: Google Search Console → Indexing → Pages. Review the “Why pages aren’t indexed” table carefully — look for patterns in why pages are being left out.
How to fix: Address errors (server errors, redirect errors, not found) first. Then evaluate the exclusion reasons: pages excluded by noindex are usually intentional; pages marked “Discovered — currently not indexed” may need more internal links or content improvements.
Technical SEO Checklist: Site Speed & Core Web Vitals (Items 13–19)
Site speed and Core Web Vitals are confirmed Google ranking factors. More importantly, they directly affect whether visitors stay on your site or leave. A slow site loses business — period.
13. Largest Contentful Paint (LCP) Under 2.5 Seconds
LCP measures how long it takes for the largest visible element on the page to load. For most sites, that’s a hero image or headline. Google’s threshold for “Good” is under 2.5 seconds.
How to check: Google PageSpeed Insights (free), Lighthouse in Chrome DevTools, or the Core Web Vitals report in Google Search Console.
How to fix: Optimize and compress your hero images, use a CDN, implement lazy loading for below-the-fold images, and preload your LCP element with <link rel="preload">. Eliminate render-blocking resources above the fold.
14. Cumulative Layout Shift (CLS) Under 0.1
CLS measures visual stability — how much elements shift around as the page loads. A button that jumps down the page because an ad loads in above it is a classic CLS offender.
How to check: Lighthouse’s CLS score or the CrUX data in Google Search Console.
How to fix: Always define explicit width and height attributes on images and video embeds. Reserve space for ads and dynamically injected content. Avoid inserting content above existing content after page load.
15. Interaction to Next Paint (INP) Under 200ms
INP replaced First Input Delay as a Core Web Vital in 2024 and measures how quickly your page responds to user interactions — clicks, taps, keyboard inputs. This is often overlooked, but it’s now a significant part of the Core Web Vitals scoring.
How to check: Chrome User Experience Report (CrUX), PageSpeed Insights, or the Web Vitals Chrome extension.
How to fix: Reduce JavaScript execution time, break up long tasks, defer non-critical JavaScript, and use a web worker for complex operations.
16. Images Are Optimized and in Next-Gen Formats
Unoptimized images are the single biggest cause of slow load times for most sites. Switching to WebP or AVIF and implementing proper compression can cut image payload by 30–70%.
How to check: PageSpeed Insights flags oversized images. Screaming Frog reports image file sizes across the whole site.
How to fix: Compress images before upload using Squoosh or ShortPixel. Convert to WebP format. Use srcset to serve appropriately sized images for different screen sizes.
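If you’d rather batch-convert images locally than rely on a plugin, the sketch below uses the Pillow imaging library to write WebP copies of every JPEG and PNG in a folder. The folder names and quality setting are assumptions to adjust for your own site.

```python
from pathlib import Path
from PIL import Image   # pip install Pillow

SRC = Path("images")          # placeholder source folder
OUT = Path("images_webp")     # converted copies land here
OUT.mkdir(exist_ok=True)

for path in list(SRC.glob("*.jpg")) + list(SRC.glob("*.jpeg")) + list(SRC.glob("*.png")):
    target = OUT / (path.stem + ".webp")
    Image.open(path).save(target, "WEBP", quality=80)   # quality 75-85 is a common range
    print(f"{path.name}: {path.stat().st_size // 1024} KB -> {target.stat().st_size // 1024} KB")
```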
17. Render-Blocking Resources Are Eliminated
CSS and JavaScript files in the <head> that block rendering delay everything the user sees. Every millisecond of render-blocking time directly impacts your LCP score.
How to check: Lighthouse’s “Eliminate render-blocking resources” opportunity shows exactly which files are causing delays and how much time they’re costing.
How to fix: Defer non-critical JavaScript with the defer or async attribute. Inline critical CSS and load the rest asynchronously. Audit and remove unused CSS and JavaScript.
18. Server Response Time (TTFB) Is Under 600ms
Time to First Byte (TTFB) is how long it takes the server to respond to a request. Poor TTFB drags down every other speed metric.
How to check: GTmetrix provides a clear TTFB reading. WebPageTest gives deep-dive server timing analysis.
How to fix: Upgrade to a quality hosting provider, implement server-side caching, use a CDN to serve assets from geographically closer locations, and optimize database queries on dynamic sites.
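For a quick reading without opening GTmetrix, the sketch below uses the requests library’s elapsed timer, which measures the time from sending the request until response headers arrive — close to, though not exactly, TTFB. The URL is a placeholder.

```python
import requests

URL = "https://www.example.com/"   # placeholder

for attempt in range(3):   # take several readings; a single one is noisy
    resp = requests.get(URL, timeout=10)
    ms = resp.elapsed.total_seconds() * 1000   # time until response headers arrived
    print(f"attempt {attempt + 1}: {ms:.0f} ms" + (" (over 600 ms)" if ms > 600 else ""))
```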
19. Caching Is Properly Configured
Browser caching tells returning visitors’ browsers to store static assets locally, dramatically reducing load times on repeat visits.
How to check: Lighthouse flags missing or misconfigured cache policies. GTmetrix shows each asset’s cache expiry.
How to fix: Set long cache-control headers for static assets (CSS, JS, images) — typically one year. For assets that change frequently, use cache-busting through versioned filenames or query strings.
Technical SEO Checklist: Security (Items 20–22)
Security signals affect both rankings and user trust. These are non-negotiables.
20. HTTPS Is Enabled Site-Wide
Google has used HTTPS as a ranking signal since 2014. Any page still serving over HTTP is sending a clear trust signal in the wrong direction — and most modern browsers will flag it as “Not Secure” to visitors.
How to check: Navigate to http://yourdomain.com (without the S) and confirm it redirects to HTTPS. Check for mixed content warnings in Chrome DevTools (Security tab).
How to fix: Install an SSL certificate (free via Let’s Encrypt, or provided by most hosting providers). Set up 301 redirects from all HTTP URLs to HTTPS equivalents. Update internal links and canonical tags to use HTTPS.
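Once the redirects are in place, you can verify them in a few lines. The sketch below (placeholder domain, third-party requests library) confirms that the HTTP version answers with a single 301 straight to HTTPS.

```python
import requests

resp = requests.get("http://www.example.com/", allow_redirects=False, timeout=10)   # placeholder domain
location = resp.headers.get("Location", "")

if resp.status_code == 301 and location.startswith("https://"):
    print("OK: single 301 to", location)
else:
    print(f"check needed: status {resp.status_code}, Location: {location or 'none'}")
```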
21. No Mixed Content Warnings
Even on an HTTPS site, if any resources (images, scripts, stylesheets) are loaded over HTTP, browsers will flag a mixed content warning — and Google takes note.
How to check: Chrome DevTools → Security tab. Screaming Frog’s Security tab also flags mixed content and any resources still loading over HTTP.
How to fix: Update all hardcoded HTTP URLs in your content and templates to HTTPS. Use a plugin like Really Simple SSL to catch remaining references.
22. Security Headers Are Configured
HTTP security headers (Content-Security-Policy, X-Frame-Options, Strict-Transport-Security) don’t directly boost rankings, but they signal a well-maintained, secure site — and they protect your users.
How to check: SecurityHeaders.com provides a free audit and letter grade.
How to fix: Add the appropriate headers at the server level (Apache .htaccess, Nginx config) or via your CDN’s settings panel. HSTS (Strict-Transport-Security) should be a priority if you’re fully committed to HTTPS.
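Beyond the letter grade, a small script makes it easy to re-test after each server change. The sketch below checks only for the presence of common security headers, not whether their values are sensible; the domain is a placeholder and requests is a third-party library.

```python
import requests

EXPECTED = [
    "Strict-Transport-Security",
    "Content-Security-Policy",
    "X-Frame-Options",
    "X-Content-Type-Options",
    "Referrer-Policy",
]

headers = requests.get("https://www.example.com/", timeout=10).headers   # placeholder domain
for name in EXPECTED:
    # requests exposes headers as a case-insensitive mapping
    print(f"{'present' if name in headers else 'MISSING':>7}  {name}")
```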
Technical SEO Checklist: Structured Data (Items 23–25)
Structured data helps Google understand what your content is about and unlocks rich results in SERPs — review stars, FAQ dropdowns, event listings, product prices. It won’t boost your core rankings directly, but richer listings can meaningfully lift click-through rates.
23. Schema Markup Is Implemented for Key Page Types
Every site should have Organization and WebSite schema at the site level. Beyond that, the schema you add depends on your content: Article, Product, FAQ, HowTo, LocalBusiness, Review — each unlocks different rich result types.
How to check: Google’s Rich Results Test (search.google.com/test/rich-results) validates schema on any URL. Ahrefs Site Audit has a dedicated structured data report.
How to fix: Implement schema in JSON-LD format (Google’s preferred method). Use Google’s Schema Markup Helper to generate the code if you’re starting from scratch. Place the JSON-LD block in the <head> of each relevant page.
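If you generate schema programmatically, building it as a dictionary and serializing it with json.dumps keeps the markup syntactically valid by construction. Every value in the sketch below is a placeholder Article example; paste the output into a script type="application/ld+json" tag and validate it with the Rich Results Test.

```python
import json

article_schema = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Your Article Title",                      # placeholder values throughout
    "datePublished": "2026-01-15",
    "author": {"@type": "Organization", "name": "Example Co"},
    "image": "https://www.example.com/images/hero.webp",
}

# Paste the output into a <script type="application/ld+json"> block in the page <head>.
print(json.dumps(article_schema, indent=2))
```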
24. No Schema Validation Errors
Broken schema is worse than no schema: validation errors keep you out of rich results, and markup that misrepresents the page content can trigger a manual action.
How to check: Google Search Console → Enhancements shows schema errors and warnings across your entire site, broken down by schema type.
How to fix: Address required property errors first (these prevent rich results). Then fix recommended property warnings (these improve the quality of your rich results). Revalidate with the Rich Results Test after each fix.
25. Breadcrumb Schema Is in Place
Breadcrumb schema improves how your URL appears in search results — instead of showing the raw URL, Google shows a clean breadcrumb trail. It also reinforces your site’s hierarchy to crawlers.
How to check: Rich Results Test → look for BreadcrumbList in the schema detected.
How to fix: Add BreadcrumbList schema to every page below the homepage. Most SEO plugins (Yoast, Rank Math) handle this automatically if breadcrumbs are enabled in your theme.
Technical SEO Checklist: Mobile Optimization (Items 26–27)
Google uses mobile-first indexing for all sites. Your mobile experience is your primary ranking experience.
26. Site Is Mobile-Friendly
Pages with text too small to read, clickable elements too close together, or content wider than the screen are flagged as mobile-unfriendly — a clear negative ranking signal.
How to check: Google has retired the standalone Mobile-Friendly Test and the Mobile Usability report in Search Console, so run Lighthouse in Chrome DevTools with device emulation set to mobile, and spot-check your key templates on a real phone.
How to fix: Use a responsive design framework. Set the viewport meta tag: <meta name="viewport" content="width=device-width, initial-scale=1">. Test tap target sizes — Google recommends at least 48×48 pixels with 8px between targets.
27. Core Web Vitals Pass on Mobile Separately
Desktop and mobile CWV scores are evaluated independently. A site that passes on desktop may still fail on mobile — and since Google indexes mobile first, mobile performance is what actually counts.
How to check: PageSpeed Insights and Google Search Console’s Core Web Vitals report both show separate mobile and desktop scores.
How to fix: Mobile performance is usually hurt by larger JavaScript payloads, unoptimized images for smaller screens, and slower network conditions. Use srcset for responsive images, aggressively defer JavaScript on mobile, and test on real mobile devices using Lighthouse in Chrome DevTools (set to “Mobile” mode).
Putting It Together: How to Run Your Technical SEO Audit
Running this checklist effectively requires a systematic approach, not random spot-checks. Here’s the workflow that professional auditors use:
Start with a full site crawl in Screaming Frog (or Sitebulb for larger sites). Export all URLs, response codes, title tags, meta descriptions, canonical tags, and internal link data. This is your baseline — the foundation every other check builds on.
Next, cross-reference your crawl data with Google Search Console. The page indexing report and Core Web Vitals report give you data your crawler can’t replicate — how Google has actually crawled and indexed your pages, and how real users experience them. Pay particular attention to the gap between what your crawler finds and what Google has indexed.
Then layer in PageSpeed Insights and Lighthouse for speed analysis, the Rich Results Test for schema validation, and SecurityHeaders.com for your security posture. Each tool illuminates a different dimension of your site’s technical health.
Document every finding in a prioritized audit spreadsheet: impact (high/medium/low), effort to fix (high/medium/low), and owner (developer/marketer/designer). Group quick wins separately from structural changes that require development resources. That prioritization step is what separates an audit that drives results from one that gets filed away and forgotten.
FAQ: Technical SEO Questions Answered
What is technical SEO, and how do you fix it?
Technical SEO is the process of optimizing a website’s infrastructure so that search engines can efficiently crawl, render, index, and rank its pages. Unlike content SEO (which focuses on keywords and writing) or link building (which focuses on external authority), technical SEO deals with the underlying code and configuration of your site. Fixing it involves a systematic audit — using tools like Screaming Frog, Google Search Console, and Ahrefs — to identify and resolve issues with crawlability, site speed, security, structured data, and mobile optimization. The fixes range from editing your robots.txt file to implementing schema markup to compressing images.
How often should you run a technical SEO audit?
For most sites, a full technical SEO audit once per quarter is reasonable. For larger sites with frequent content changes, monthly crawls are better. Always run an audit after a site migration, CMS update, major redesign, or any time you notice an unexplained traffic drop in Google Search Console.
Which tools do you need for a technical SEO audit?
The gold standard stack is: Screaming Frog for site crawling, Google Search Console for index and performance data, Ahrefs Site Audit for backlink-aware technical issues, PageSpeed Insights and Lighthouse for Core Web Vitals, and the Rich Results Test for structured data. Screaming Frog has a free version that covers up to 500 URLs — enough for most small business sites.
Does technical SEO still matter now that AI is reshaping search?
More than ever. As AI-driven search features (like Google’s AI Overviews) pull content directly from indexed pages, structured data and crawlability become prerequisites for appearing in those results. Sites with technical issues are invisible to both traditional rankings and AI-generated search features.
How long does it take to see results from technical SEO fixes?
Basic fixes — resolving indexability errors, fixing redirect chains, submitting a clean sitemap — can show results within weeks as Google recrawls your site. Speed improvements typically show ranking movement within one to three months. Structural changes (site architecture, internal linking overhauls) may take three to six months to fully reflect in rankings.
Conclusion
A strong technical SEO checklist isn’t a one-and-done project — it’s ongoing site hygiene. The 27 items above cover every major area Google evaluates when deciding whether your site deserves to rank: from how easily bots can crawl it, to how fast it loads, to whether your schema markup is valid, to how it performs on a mobile device.
Work through these items systematically, prioritize by impact, and revisit them quarterly. The sites that consistently outrank their competition aren’t always the ones with the best content — they’re the ones whose technical foundation never gives Google a reason to look elsewhere.
If you’d rather have a professional do this right the first time, a full technical SEO audit will map every one of these issues against your specific site, prioritize them by revenue impact, and give your development team a clear remediation plan. That’s the difference between a checklist and a strategy.