⚙️ Day 4 of 7

Technical SEO

Make sure Google can actually crawl, access, and understand your website — because great content means nothing if Google can't reach it.

📚 5 lessons
🤖 3 AI prompts
🧠 5-question quiz
1 Fundamentals · 2 Keywords · 3 On-Page · 4 Technical · 5 Content · 6 Off-Page · 7 Audit
💡

Today's Big Idea

Technical SEO is the foundation everything else rests on. If Google can't crawl your site, your keywords and content don't matter. Think of it like building a house — technical SEO is the plumbing and electrical. You don't see it, but without it, nothing works.

Most small business websites have basic technical issues that hold them back. Broken links, slow loading times, mobile problems. Today you'll audit and fix them.

🔧

Core Concepts: Technical SEO Foundations

Site Speed & Core Web Vitals

Google's Core Web Vitals measure how fast and stable your pages feel. These directly impact rankings. The three metrics are:

  • LCP (Largest Contentful Paint): How long until the main content loads. Target: under 2.5 seconds. Common causes of slow LCP: unoptimized images, slow server, too many third-party scripts.
  • INP (Interaction to Next Paint): How fast your page responds when someone clicks or types. Target: under 200ms. (INP replaced the older FID metric in March 2024.) Usually caused by heavy JavaScript running in the background.
  • CLS (Cumulative Layout Shift): How much content jumps around while loading. Target: under 0.1. Common cause: images without height/width attributes, ads/embeds loading dynamically.

How to check: Use PageSpeed Insights (pagespeed.web.dev). Enter your URL and get your score (aim for 90+). It shows you the exact issues.
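One of the easiest Core Web Vitals wins is reserving space for images so the layout doesn't shift while the page loads. A minimal sketch (the file path, alt text, and dimensions are placeholders for your own):

```html
<!-- width and height let the browser reserve space before the image
     loads, preventing layout shift (CLS). Use loading="lazy" only on
     images below the fold — lazy-loading your main hero image can
     delay LCP instead of helping it. -->
<img src="/images/storefront.jpg" alt="Our storefront"
     width="1200" height="630" loading="lazy">
```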

Mobile-First Indexing

Google primarily uses the mobile version of your site to determine rankings for all users — even desktop users. If your mobile site is missing content or features, you'll lose rankings.

What to check:

  • Does your mobile site show the same content as desktop?
  • Are images loading properly on mobile?
  • Is text readable without zooming?
  • Are buttons large enough to tap easily?

Test by viewing your site on a phone, or run a Lighthouse audit in Chrome DevTools. (Google retired its standalone Mobile-Friendly Test tool in late 2023.)

HTTPS & SSL Certificate

All websites should use HTTPS (the "S" = Secure). If your site shows a "Not Secure" warning, it directly hurts rankings and scares users away.

Good news: Most hosting providers offer free SSL certificates via Let's Encrypt. If you're on WordPress, Shopify, Wix, etc., it's usually free and automatic. Check your browser — you should see a padlock icon next to your URL.

XML Sitemap

A sitemap.xml file tells Google all the pages on your site. It's like giving Google a map instead of making them find pages one by one.

  • For WordPress: Use Yoast SEO or RankMath plugin — they auto-generate sitemaps
  • For Shopify: Automatically created at yourdomain.com/sitemap.xml
  • For custom sites: Use a free tool like XML-Sitemaps.com to generate
  • Submit to Google: Go to Google Search Console, submit your sitemap.xml URL
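For reference, a minimal sitemap.xml looks like this; the URLs and dates below are placeholders for your own pages:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want Google to know about -->
  <url>
    <loc>https://yourdomain.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://yourdomain.com/services</loc>
    <lastmod>2024-01-10</lastmod>
  </url>
</urlset>
```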

Robots.txt File

The robots.txt file (at yourdomain.com/robots.txt) tells crawlers which pages to crawl and which to skip. Common use: block admin pages, login pages, or duplicate pages from being crawled.

Be careful: Never accidentally block important pages. Many sites accidentally block /blog or /products in robots.txt and wonder why they don't rank.
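Here's what a typical small-site robots.txt looks like, assuming a WordPress-style admin path (adjust the paths for your platform):

```
# Applies to all crawlers
User-agent: *
# Block the admin area from being crawled...
Disallow: /wp-admin/
# ...but allow the AJAX endpoint some plugins rely on
Allow: /wp-admin/admin-ajax.php

# Point crawlers at your sitemap
Sitemap: https://yourdomain.com/sitemap.xml
```

Note there is no `Disallow: /` line anywhere: that single rule would block your entire site.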

Canonical Tags

Duplicate content is a problem. If the same page exists at multiple URLs (e.g., yourdomain.com/page and yourdomain.com/page/), Google may treat them as separate pages and split ranking signals between them.

Use a canonical tag to tell Google which version is the "official" one:

<link rel="canonical" href="https://yourdomain.com/preferred-url">

Put this in the <head> of duplicate pages. Google will consolidate ranking signals to the canonical URL.

Broken Links (404 Errors)

Broken links hurt user experience and waste Google's crawl budget. Find them using Google Search Console:

  • Go to Google Search Console → Indexing → Pages (formerly the Coverage report)
  • Look for pages listed as "Not found (404)" or with other error reasons
  • Fix by either updating the link to point to a working URL, or creating a 301 redirect to a relevant working page
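If your site runs on Apache, a 301 redirect can be added to the .htaccess file in your site root; other hosts and platforms (Shopify, Wix, etc.) have their own redirect settings in the dashboard. The paths below are placeholders:

```apache
# Permanently redirect an old, broken URL to its replacement.
# 301 tells Google the move is permanent, so ranking signals transfer.
Redirect 301 /old-page https://yourdomain.com/new-page
```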

Schema Markup (Structured Data)

Schema markup is code that tells Google detailed information about your content. This can trigger "rich results" in search results — like star ratings, FAQ snippets, or event information.

Example: A recipe page with schema markup shows stars, cooking time, and ingredients in the search result. Users can preview before clicking. Higher click-through rate = better rankings.

Types of schema: Article, Product, Recipe, FAQ, LocalBusiness, JobPosting, Event, etc.

Use Schema.org to learn the format. Most platforms (WordPress, Shopify) have plugins that auto-generate schema.
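As a sketch, here's what LocalBusiness schema looks like in JSON-LD format, placed in your page's head; every value below is a placeholder you'd replace with your real business details:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Bakery",
  "url": "https://yourdomain.com",
  "telephone": "+1-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "Springfield",
    "postalCode": "12345"
  },
  "openingHours": "Mo-Sa 08:00-18:00"
}
</script>
```

After adding it, paste your URL into Google's Rich Results Test to confirm the markup is valid.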

Page Indexability

Just because you built a page doesn't mean Google indexed it. Check what's indexed:

  • Type site:yourdomain.com into Google
  • Count the results
  • Compare to your actual page count
  • If the numbers are far apart, pages are missing from the index (note: site: counts are approximate, so small differences are normal)

Common reasons for non-indexation: robots.txt blocking, noindex tag, nofollow links, new site (give Google time to crawl).
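A stray noindex tag is one of the easiest culprits to miss. View your page source (right-click → View Page Source) and search for this line; if it's present, Google will drop the page from its index:

```html
<meta name="robots" content="noindex">
```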

Crawl Budget

Google has a limited budget for crawling your site. Large sites with thousands of pages need to be strategic. Tips:

  • Remove duplicate pages
  • Delete thin or low-value pages
  • Fix broken links (Google wastes crawl budget on 404s)
  • Improve site speed (faster sites get crawled more)
🤖

Prompt Lab: 3 AI Prompts for Technical SEO

Prompt #1: Explain a technical SEO error

I found this error in Google Search Console: '[PASTE THE EXACT ERROR MESSAGE]'. My website is built on [PLATFORM — e.g., WordPress, Shopify, Squarespace]. Explain: 1) What this error means in plain English, 2) Why it matters for SEO, 3) Step-by-step instructions to fix it on [PLATFORM], 4) How to verify the fix worked.

Paste any Google Search Console error for an instant fix guide

Prompt #2: Create a robots.txt file

Create a robots.txt file for a [PLATFORM — WordPress/Shopify/etc.] website. The site has: [DESCRIBE YOUR SITE STRUCTURE — e.g., 'a blog, a shop section, and a members area I don't want indexed']. Make sure to: allow Googlebot access to important pages, block admin and login pages, reference the sitemap location, and explain each line with a comment.

Get a ready-to-use robots.txt file for your site

Prompt #3: Technical SEO audit checklist for my site

Generate a complete technical SEO audit checklist for a [TYPE OF WEBSITE — e.g., 'small business service site with 20 pages']. Organize it by priority (critical / important / nice to have). For each item, include: what to check, the free tool to use, and what to do if there's a problem. Focus on issues that affect Google's ability to crawl and rank the site.

Get a custom audit checklist tailored to your site type

Practice Task

⚡ Today's Action

Run a Technical Health Check

Use these 3 free tools on your website today:

  • Google PageSpeed Insights (pagespeed.web.dev) — Check your Core Web Vitals score. Aim for 90+ on mobile. Note your LCP, INP, and CLS scores.
  • Google Search Console → Indexing → Pages report — Look for any pages with error reasons (like "Not found (404)") and make a list of them
  • Site command: Type "site:yourdomain.com" in Google — How many pages are indexed? Does it match your expectation? Document the number.

Then use Prompt #3 to get a full technical audit checklist tailored to your site type.

⚠️

Common Beginner Mistakes

Technical SEO mistakes are particularly painful because they're invisible: your content looks fine to you, but Google is quietly blocked or confused. These are the most common ones beginners make.

🚫
Accidentally blocking pages in robots.txt

A misplaced Disallow: / in robots.txt blocks Google from crawling your entire site. This is one of the most catastrophic beginner errors and it can happen silently during a CMS update or plugin install. Always use Google Search Console to verify your key pages are crawlable.

🚫
Not submitting a sitemap to Google

Without a sitemap, Google finds your pages by following links. New pages with few or no internal links pointing to them may never get discovered. Submitting your sitemap.xml in Google Search Console is a 5-minute task that tells Google exactly what you have and ensures nothing gets missed.

🚫
Running HTTP instead of HTTPS

Google confirmed HTTPS as a ranking signal in 2014. Sites still on HTTP in 2026 show a "Not Secure" warning in Chrome, which tanks click-through rates from search results in addition to hurting rankings. Free SSL certificates via Let's Encrypt are available on virtually every host.

🚫
Ignoring Core Web Vitals until traffic disappears

Most beginners only check PageSpeed after their rankings drop. By then the damage is done. Run PageSpeed Insights on your key pages today and fix the top 2-3 recommendations immediately. A 5-second load time on mobile actively suppresses your rankings against faster competitors.

🚫
Leaving duplicate pages without canonical tags

E-commerce sites are especially prone to this: the same product page accessible at /products/shoes and /products/shoes?color=red looks like duplicate content to Google. Without canonical tags, you split your ranking authority across URLs and Google may pick the wrong version to index. Most CMS platforms handle this automatically; verify yours does.

🧠

Day 4 Quiz

5 questions · Instant feedback · Pass at 60% to unlock your Day 4 badge

📋

Day 4 Checklist

Click each item to mark it done. Your progress is saved automatically.


  • ✅ Ran PageSpeed Insights on my homepage (noted the score)
  • ✅ Checked the Google Search Console Pages report for indexing errors
  • ✅ Verified my site has HTTPS (padlock in browser)
  • ✅ Confirmed my sitemap exists and is submitted in GSC
  • ✅ Passed the Day 4 quiz