The Five Pillars of SEO, Explained

A practitioner's breakdown of technical, on-page, content, off-page, and local SEO: what each does, why it matters, and how they work together.

SEO Is Not One Discipline

Most people talk about SEO as if it were a single activity. It is not. SEO is a collection of distinct, interdependent practices, each targeting a different layer of how search engines discover, evaluate, and rank web content. Ignore one layer and you create a bottleneck that no amount of effort in the others will fix. A site with brilliant content but a three-second load time will lose to a faster competitor with merely adequate copy. A technically flawless site with no backlink profile will struggle to outrank established authorities.

The five core categories are technical, on-page, content, off-page, and local SEO. Sitting across all of them is Google’s E-E-A-T framework, which functions less as a category and more as a quality lens applied to everything else. Here is what each actually involves at a practical level, and why skipping any one of them is a strategic mistake.

Technical SEO: The Infrastructure Layer

Technical SEO governs whether search engines can physically access, render, and index your pages. None of your other work matters if Googlebot cannot crawl your site or if pages take too long to paint. Think of it as the foundation of a building: invisible to visitors, but the reason everything above it stays standing.

Core Web Vitals

Google measures three specific user experience metrics under the Core Web Vitals umbrella, and each addresses a different dimension of page performance.

Largest Contentful Paint (LCP) tracks how quickly the largest visible element in the viewport finishes loading. This is typically a hero image, a video poster frame, or a large block of text. Google’s threshold is 2.5 seconds for a “good” score. Anything between 2.5 and 4 seconds needs improvement. Anything beyond 4 seconds is rated poor. Common LCP killers include unoptimised images served without modern formats like WebP or AVIF, render-blocking JavaScript that delays the main thread, and slow server response times (TTFB above 800ms).

Interaction to Next Paint (INP) replaced First Input Delay in March 2024 as the responsiveness metric. Where FID only measured the delay of the first interaction, INP measures the latency of all interactions throughout the page lifecycle and reports the worst one. This is a harder bar to clear. A page might respond instantly to the first click but choke when a user interacts with a complex dropdown menu or a dynamically loaded section further down the page. Heavy JavaScript frameworks, long main-thread tasks, and excessive DOM size are the usual culprits. The threshold for a good INP score is 200 milliseconds.

Cumulative Layout Shift (CLS) quantifies visual instability: those frustrating moments when a button moves just as you reach for it, or a paragraph jumps down because an ad loaded above it. The threshold for a good score is 0.1. CLS problems almost always stem from images and iframes without explicit width and height attributes, dynamically injected content above the fold, or web fonts whose fallbacks have different metrics, so the text reflows when the real font swaps in. The fix is often straightforward: reserve space for every element before it loads, and pair font-display: swap with metric-compatible fallback fonts.
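As a rough illustration, the three metrics share the same three-band structure, and Google's published thresholds can be expressed as a small classifier (the "poor" cut-offs for INP and CLS, 500ms and 0.25, come from Google's documentation; the sample values are made up):

```python
# Google's published Core Web Vitals thresholds:
# LCP in seconds, INP in milliseconds, CLS unitless.
THRESHOLDS = {
    "lcp": (2.5, 4.0),   # good <= 2.5s, poor > 4.0s
    "inp": (200, 500),   # good <= 200ms, poor > 500ms
    "cls": (0.1, 0.25),  # good <= 0.1, poor > 0.25
}

def rate(metric: str, value: float) -> str:
    """Classify a field-data value into Google's three bands."""
    good, poor = THRESHOLDS[metric]
    if value <= good:
        return "good"
    if value <= poor:
        return "needs improvement"
    return "poor"

print(rate("lcp", 2.1))   # good
print(rate("inp", 350))   # needs improvement
print(rate("cls", 0.31))  # poor
```

Note that real assessments use the 75th percentile of field data per metric, not a single page load.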

These are not vanity metrics. Google confirmed in 2021 that Core Web Vitals are a ranking signal, and the data from real-user monitoring via the Chrome User Experience Report (CrUX) feeds directly into search rankings. You can audit your scores in Google Search Console under the Core Web Vitals report, or use PageSpeed Insights for page-level diagnostics.

Crawlability and Indexation

Your robots.txt file tells crawlers which URL paths to avoid. Your XML sitemap tells them which pages exist, when they were last modified, and how frequently they change. These two files form the basic contract between your site and search engine bots.
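A minimal, sane version of that contract might look like this (the blocked paths and domain are illustrative, not a recommendation for every site):

```txt
# robots.txt — keep crawlers out of low-value URL spaces,
# and point them at the sitemap.
User-agent: *
Disallow: /search/
Disallow: /cart/

Sitemap: https://www.example.com/sitemap.xml
```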

Misconfigure either and the consequences cascade. A misplaced Disallow: / in robots.txt blocks crawlers from your entire site (blocked pages can still appear in the index, but as bare URLs with no content). A sitemap that includes noindexed pages, redirects, or 404s wastes crawl budget and sends conflicting signals. For smaller sites with a few hundred pages, these issues are minor annoyances. For larger sites with tens of thousands of URLs, crawl budget becomes a genuine engineering concern. Googlebot allocates a finite number of requests per crawl session based on your site’s perceived importance and server capacity. If half those requests are spent on paginated archives, faceted navigation URLs, or session-ID parameters, your most valuable pages may go weeks without being re-crawled.

Canonicalisation is another critical piece. Duplicate or near-duplicate content (www vs. non-www, HTTP vs. HTTPS, trailing slashes, URL parameters) confuses crawlers and dilutes ranking signals. The rel="canonical" tag tells Google which version of a page is the authoritative one. Without it, Google guesses, and it does not always guess correctly.
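The tag itself is a one-liner in the page head (the URL here is illustrative); every duplicate variant should point at the same authoritative address:

```html
<!-- In the <head> of every duplicate variant, declare the
     authoritative URL explicitly: -->
<link rel="canonical" href="https://www.example.com/guide/seo-pillars/" />
```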

Rendering is the final consideration. Google renders JavaScript, but it does so in a two-phase process: it crawls the HTML first, then queues the page for rendering. That queue can take days or weeks. Sites built entirely with client-side JavaScript (single-page applications with no server-side rendering) face significant indexation delays. Server-side rendering (SSR) or static site generation (SSG), which is what Astro provides by default, eliminates this problem entirely.

HTTPS and Security

HTTPS has been a confirmed ranking signal since 2014. Beyond rankings, browsers now flag HTTP sites with visible “Not Secure” warnings in the address bar, which craters user trust before they even read your content. The cost of an SSL certificate is effectively zero through services like Let’s Encrypt, and every major hosting provider supports automatic certificate provisioning.

Beyond the certificate itself, technical SEO audits should check for mixed content warnings (HTTP resources loaded on an HTTPS page), proper 301 redirects from HTTP to HTTPS versions, and HSTS headers that prevent protocol downgrade attacks. These are not edge cases; they are baseline hygiene.
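As a sketch of what that hygiene looks like in practice, assuming an nginx server (directives and domain are illustrative):

```nginx
# Force HTTPS with a permanent redirect on the plain-HTTP block...
server {
    listen 80;
    server_name example.com www.example.com;
    return 301 https://$host$request_uri;
}

# ...and assert HSTS on the secure block so browsers refuse
# to downgrade on subsequent visits.
server {
    listen 443 ssl;
    server_name example.com www.example.com;
    add_header Strict-Transport-Security "max-age=31536000; includeSubDomains" always;
    # ... ssl_certificate, ssl_certificate_key, location blocks ...
}
```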

Structured Data and Schema Markup

Schema markup, implemented via JSON-LD (Google’s preferred format), lets you explicitly declare what your content represents: a product, a recipe, an FAQ, a local business, a how-to guide, an event. This does not directly boost rankings, but it enables rich results in SERPs. A product listing with star ratings, price, and stock availability visible directly in search results will outperform a plain blue link nearly every time. Eligibility does change, though: in 2023 Google restricted FAQ rich results to authoritative government and health sites and retired how-to rich results entirely, so check the current Search Gallery documentation before investing heavily in any one schema type.

The practical implementation is straightforward: add a <script type="application/ld+json"> block to the <head> of each page with the appropriate schema type and properties. Google’s Rich Results Test and Schema Markup Validator are the two tools for validating your implementation. The most common mistakes are referencing schema types that Google does not actually support for rich results, or marking up content that does not exist on the visible page (which Google treats as a spam signal).
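For example, a product page's block might look like this (the product, prices, and review counts are invented for illustration; every property here is a documented schema.org type that Google supports for product rich results):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Trailrunner 3 Running Shoe",
  "image": "https://www.example.com/img/trailrunner-3.jpg",
  "offers": {
    "@type": "Offer",
    "price": "129.00",
    "priceCurrency": "GBP",
    "availability": "https://schema.org/InStock"
  },
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.6",
    "reviewCount": "214"
  }
}
</script>
```

Critically, every value declared here must match what is visible on the page itself.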

On-Page SEO: Precision at the Page Level

On-page SEO is the practice of optimising individual pages so that both users and crawlers can immediately understand what the page is about and why it deserves to rank. Where technical SEO deals with site-wide infrastructure, on-page SEO operates at the granularity of a single URL.

Title Tags and Meta Descriptions

The title tag remains one of the strongest on-page ranking signals. Keep it under 60 characters to avoid truncation in SERPs, front-load the primary keyword, and make it specific. “Best Running Shoes for Flat Feet (2026 Guide)” outperforms “Running Shoes” because it addresses a concrete query with a clear modifier and a freshness indicator. Avoid keyword stuffing in title tags; Google will often rewrite titles it considers misleading or spammy, and you lose control of how your page appears in results.

Meta descriptions do not directly affect rankings. Google has said this explicitly and repeatedly. However, they control the snippet text displayed beneath your title in search results. A well-crafted description with a clear value proposition and a soft call to action can double your click-through rate compared to whatever Google auto-generates from on-page content. Keep it under 155 characters. Write it as a pitch, not a summary.
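Both limits are easy to enforce as a pre-publish check. A sketch (pixel width is what actually determines truncation, but character counts are a workable proxy; the sample strings are illustrative):

```python
# Character-count proxies for SERP truncation limits.
TITLE_MAX = 60
DESCRIPTION_MAX = 155

def snippet_warnings(title: str, description: str) -> list[str]:
    """Return human-readable warnings for likely-truncated snippets."""
    warnings = []
    if len(title) > TITLE_MAX:
        warnings.append(f"title is {len(title)} chars (> {TITLE_MAX})")
    if len(description) > DESCRIPTION_MAX:
        warnings.append(f"description is {len(description)} chars (> {DESCRIPTION_MAX})")
    return warnings

print(snippet_warnings(
    "Best Running Shoes for Flat Feet (2026 Guide)",
    "Tested picks for overpronators, with sizing advice.",
))  # → []
```

Wiring a check like this into your CMS or build pipeline catches truncation before Google's rewriter does.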

Header Hierarchy and Semantic Structure

Your H1 tag should appear exactly once per page and clearly state the page’s primary topic. H2 tags define major sections. H3 tags break those sections into subsections. This hierarchy is not just a stylistic choice: it gives crawlers a semantic map of your content. Screen readers also rely on header hierarchy for accessibility, which means proper heading structure serves both SEO and WCAG compliance simultaneously.
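Using this article's own structure as the example, that semantic map looks like this in markup (indentation is purely for readability; the nesting is expressed by the tag levels, not whitespace):

```html
<h1>The Five Pillars of SEO, Explained</h1>
  <h2>Technical SEO: The Infrastructure Layer</h2>
    <h3>Core Web Vitals</h3>
    <h3>Crawlability and Indexation</h3>
  <h2>On-Page SEO: Precision at the Page Level</h2>
```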

A common mistake is using header tags for visual styling rather than semantic meaning. If you want large, bold text that is not a section heading, use CSS. Misusing H2 or H3 tags for non-heading content muddies the semantic structure and confuses crawlers about your page’s topical organisation.

Internal Linking Architecture

Internal links distribute PageRank (link equity) across your site and establish topical relationships between pages. A strong internal linking structure accomplishes two things: it helps crawlers discover new pages faster, and it signals to Google which pages you consider most authoritative. The pages with the most internal links pointing to them receive the most equity, which is why homepages tend to rank well even for competitive terms.

The common mistake is treating internal linking as an afterthought, something you tack on when you remember. In practice, it should be as deliberate as your site’s navigation. Orphan pages (those with zero internal links pointing to them) are effectively invisible to crawlers that rely on link-following rather than sitemap-only discovery. Topic clusters, where a central “pillar” page links to and from a set of related subtopic pages, remain one of the most effective internal linking strategies for building topical authority.

Anchor text matters here too. Internal link anchor text should be descriptive and keyword-relevant, not generic phrases like “click here” or “read more.” Google uses anchor text to understand what the linked page is about.
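Orphan detection is mechanical once you have a crawl: take every URL you know about (from the sitemap), subtract every URL that receives at least one internal link, and what remains is invisible to link-following crawlers. A sketch (all URLs are made up):

```python
def find_orphans(all_pages: set[str], links: dict[str, set[str]]) -> set[str]:
    """Pages in the sitemap that no internal link points to.

    `links` maps each crawled page to the set of internal URLs it links out to.
    The homepage is excluded: it is the crawl entry point, not an orphan.
    """
    linked_to = set().union(*links.values()) if links else set()
    return all_pages - linked_to - {"/"}

pages = {"/", "/pillar/seo", "/guide/lcp", "/guide/old-draft"}
links = {
    "/": {"/pillar/seo"},
    "/pillar/seo": {"/guide/lcp"},
    "/guide/lcp": {"/pillar/seo"},
}
print(sorted(find_orphans(pages, links)))  # ['/guide/old-draft']
```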

Search Intent Alignment

This is where many SEO strategies fail. Google’s algorithms have become remarkably good at classifying the intent behind a query, and they reward pages that match that intent precisely.

The four primary intent types are informational (the user wants to learn something), navigational (the user wants to reach a specific site or page), transactional (the user wants to buy or take a specific action), and commercial investigation (the user is comparing options before committing). If your page targets a transactional keyword like “buy standing desk” but delivers a 2,000-word informational essay about the history of standing desks, you will not rank. Full stop.

Before writing a single word of content, search your target keyword and study the SERP. Look at the format of the top results. Are they listicles? Product pages? Video tutorials? Long-form guides? That tells you exactly what intent Google has assigned to that query. Your page needs to match that format and satisfy that intent better than the current results. Trying to rank an informational blog post for a query where Google is showing product carousels and shopping ads is a misallocation of effort.

Content SEO: What You Publish and How

Content SEO is where keyword research, editorial strategy, and production quality converge. The content itself is what users came for. Everything else, the technical infrastructure, the on-page signals, the backlinks, exists to deliver that content to the right audience at the right time.

Keyword Research and Targeting

Keyword research is not about finding the highest-volume term and stuffing it into a page. It is about identifying the specific terms your target audience uses at each stage of their journey, then building content that satisfies those queries better than the current top results.

The process starts with seed keywords derived from your product, service, or topic area. From there, tools like Ahrefs, Semrush, or Google’s Keyword Planner expand those seeds into clusters of related terms, each with data on monthly search volume, keyword difficulty, and click-through potential. Pay close attention to the click data: many high-volume keywords produce zero clicks because Google answers the query directly in a featured snippet or knowledge panel.

Long-tail keywords (three or more words, lower volume, higher specificity) convert at significantly higher rates than head terms because the intent is clearer. A user searching “best project management tool for remote teams under 50 people” knows exactly what they want. That specificity makes the content easier to write and the conversion path shorter.

Google Search Console is an underused resource here. The Performance report shows which queries are driving impressions but not clicks. That gap, high impressions with a low click-through rate, represents your most immediate opportunity: pages that Google already considers relevant but that users are not choosing. Often, a better title tag or a more compelling meta description is enough to close that gap.
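Surfacing that gap from a Search Console export is a simple filter. A sketch, where the field names mirror a typical query-level export but the rows and thresholds are invented:

```python
def ctr_opportunities(rows, min_impressions=1000, max_ctr=0.02):
    """Queries with real visibility (impressions) but weak snippet pull (CTR)."""
    return [
        r["query"]
        for r in rows
        if r["impressions"] >= min_impressions
        and r["clicks"] / r["impressions"] <= max_ctr
    ]

rows = [
    {"query": "standing desk guide", "impressions": 5400, "clicks": 40},
    {"query": "buy standing desk", "impressions": 800, "clicks": 60},
    {"query": "desk height calculator", "impressions": 2100, "clicks": 150},
]
print(ctr_opportunities(rows))  # ['standing desk guide']
```

The queries this returns are the ones where a rewritten title or description pays off fastest.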

Content Freshness and Decay

Google rewards recently updated content for queries where recency matters, a capability built on the Caffeine indexing infrastructure launched in 2010 and formalised in the 2011 freshness update, then refined continuously since. This is query-dependent: “how to tie a bowline knot” does not need freshness, but “best laptops” absolutely does. A laptops page from 2023 will lose ground to an equivalent page updated in 2026, because Google knows the product landscape has shifted.

Content decay is the natural process by which pages lose rankings over time as competitors publish newer, better content and the information on your page becomes outdated. The solution is not rewriting everything monthly. It is building an audit cadence into your editorial workflow. Quarterly, review your top-performing pages: update statistics, remove dead links, replace outdated screenshots, and add new sections where the topic has evolved. Google’s crawlers notice when a page’s last-modified date changes and the content has materially shifted, and they re-evaluate the page accordingly.

Depth, Completeness, and the Helpful Content System

Thin content rarely ranks for competitive terms. Google’s helpful content system, initially rolled out in August 2022, updated multiple times since, and folded into the core ranking systems in March 2024, specifically targets pages that exist purely for search engines rather than for users. The system operates as a site-wide signal: if a significant portion of your site is deemed unhelpful, it can drag down the rankings of your genuinely useful pages too.

The standard is straightforward: does this page give the reader everything they need, or will they hit the back button and try another result? That “pogo-sticking” behaviour (clicking a result, bouncing back, clicking the next result) is a strong negative signal.

Comprehensive does not mean long. A 600-word page that directly and completely answers a narrow question can outrank a 3,000-word article that buries the answer in filler. The goal is completeness relative to the query, not word count for its own sake. Cover all the subtopics a searcher would reasonably expect, link out to authoritative sources where appropriate, and stop when you have said what needs to be said.

Off-Page SEO: Signals from the Wider Web

Off-page SEO encompasses every signal that originates outside your own site. The dominant factor here is backlinks, but brand mentions, social signals, and digital PR all contribute to how search engines perceive your authority and trustworthiness.

A backlink is a vote of confidence from one site to another. Not all votes are equal. Google’s algorithms evaluate links based on several factors: the linking site’s own authority and topical relevance, the placement of the link within the page (editorial links within body content carry more weight than footer or sidebar links), the anchor text used, whether the link is followed or carries a rel="nofollow" attribute, and the overall link velocity (the rate at which new links are acquired).

The shift over the past decade has been decisively away from link quantity and toward link quality. One editorial link from a domain with genuine authority in your niche, a trade publication, a university research page, a respected industry blog, is worth more than a hundred links from low-quality guest post farms or generic directories. Google’s algorithms can now assess the topical relevance of the linking page to the linked page, which means a link from a site in your industry carries more weight than a link from an unrelated domain with the same authority score.

Link building as a practice has matured accordingly. The most effective strategies today are content-driven: publishing original research that others cite, creating tools or calculators that earn natural links, and building genuine relationships with journalists and editors through expert commentary and data-backed insights.

Brand Mentions and Digital PR

Google’s patents reference the concept of “implied links,” where a brand mention without a hyperlink still functions as a trust signal. When your brand appears in news articles, industry reports, podcast show notes, or expert roundups, it builds topical association even without a direct link. This is why digital PR has become a core component of off-page strategy. A well-placed media mention delivers on both fronts: the link itself passes authority, and the brand mention reinforces your presence in Google’s entity graph.

The practical execution of digital PR for SEO involves identifying journalists and publications that cover your industry, monitoring for relevant source requests (platforms like HARO, Qwoted, and Connectively facilitate this), and proactively pitching data, expert commentary, or original research that editors find genuinely useful. The key distinction from traditional PR is the focus on earning links from high-authority domains, not just coverage for brand awareness.

What Does Not Work Anymore

Link schemes, paid links disguised as editorial content, private blog networks (PBNs), mass directory submissions, and reciprocal link exchanges are all explicitly against Google’s spam policies. The Penguin algorithm, which has been part of Google’s core algorithm since 2016, detects and devalues manipulative link patterns in real time. More recently, Google’s December 2022 link spam update incorporated SpamBrain (their AI-based spam detection system) to identify and neutralise purchased links and link networks at scale.

The risk-reward calculation on these tactics is no longer favourable. A manual penalty from Google’s webspam team can remove your site from search results entirely, and recovery is a months-long process with no guaranteed outcome.

Local SEO: Ranking in the Map Pack

Local SEO applies to any business that serves a geographic area, whether that is a single storefront, a service area business, or a multi-location franchise. The goal is to appear in Google’s local pack (the map results with three listings that appear above organic results for location-based queries) and in Google Maps.

Google Business Profile

Your Google Business Profile (GBP) is the single most influential factor in local pack rankings. It needs to be fully completed: business name (exactly as it appears on your signage, no keyword stuffing), address, phone number, hours of operation, primary and secondary categories, a detailed business description, and high-quality photos of your premises, products, and team.

Regular activity on your GBP signals to Google that the business is active and engaged. This means posting updates (Google Posts), responding to every review (positive and negative), adding new photos on a regular cadence, and answering questions in the Q&A section. Businesses that treat their GBP as a static listing rather than a living profile consistently underperform in local rankings.

Categories deserve special attention. Your primary category has the strongest influence on which queries trigger your listing. Secondary categories expand your visibility for related searches. Choosing the wrong primary category, or selecting too broad a category when a more specific one exists, is one of the most common and easily fixable local SEO mistakes.

NAP Consistency

NAP stands for Name, Address, Phone number. These three data points must be identical across every directory, citation source, social profile, and web page where your business appears. Inconsistencies create ambiguity for Google’s local algorithm. “123 Main St” on your website, “123 Main Street” on Yelp, and “123 Main St, Suite 4” on the Better Business Bureau are three different signals, and Google cannot confidently reconcile them.
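Citation tools reconcile these variants by normalising before comparing. A deliberately tiny sketch of the idea (the abbreviation map is illustrative; production tools use far richer address parsing):

```python
import re

# Minimal abbreviation map for illustration only.
ABBREVIATIONS = {"street": "st", "avenue": "ave", "road": "rd", "suite": "ste"}

def normalise(address: str) -> str:
    """Lowercase, strip punctuation, and collapse common abbreviations."""
    words = re.sub(r"[^\w\s]", "", address.lower()).split()
    return " ".join(ABBREVIATIONS.get(w, w) for w in words)

# "St" vs "Street" reconcile; a suite number is a genuine mismatch.
print(normalise("123 Main Street") == normalise("123 Main St"))       # True
print(normalise("123 Main St, Suite 4") == normalise("123 Main St"))  # False
```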

This problem compounds for businesses that have moved, rebranded, or changed phone numbers. Old listings with outdated information persist on directories for years, and they actively harm your local rankings. Tools like BrightLocal, Moz Local, or Whitespark can audit your citation landscape, identify inconsistencies, and submit corrections at scale.

Local Citations and Reviews

Citations are mentions of your business on other websites: directories like Yelp and Yellow Pages, industry-specific listings (TripAdvisor for hospitality, Avvo for legal), local chamber of commerce pages, and community blogs. The volume, accuracy, and quality of your citations remain a ranking factor for local search, though their relative importance has decreased as Google has become better at verifying business information through other signals.

Reviews are where the real leverage is. Google reviews influence local pack rankings through three dimensions: star rating, review volume, and recency. A business with 200 reviews and a 4.6 rating will typically outrank a competitor with 15 reviews and a 4.9 rating, because Google trusts the larger sample size. Recency matters too: a business whose last review is six months old signals lower engagement than one receiving reviews weekly.

The operational playbook is simple but requires discipline. Ask every satisfied customer for a review. Make it easy by providing a direct link to your Google review form. Respond to every review, especially negative ones, with professionalism and specificity. Google has stated that review responses signal business engagement, and users read them when deciding between competitors.

E-E-A-T: The Quality Overlay

E-E-A-T stands for Experience, Expertise, Authoritativeness, and Trustworthiness. It is not a ranking algorithm or a score that Google computes. It is a set of criteria from Google’s Search Quality Rater Guidelines that human quality raters use to evaluate search results. Those evaluations inform the machine learning models that drive Google’s ranking algorithms. Understanding E-E-A-T is understanding what Google is training its systems to reward.

Experience means the content creator has first-hand involvement with the subject. A product review written by someone who actually purchased and used the product carries more weight than one assembled from manufacturer specs and other reviews. Google looks for signals like original photography, specific details that only a user would know, and personal anecdotes that demonstrate real-world interaction with the topic.

Expertise refers to the creator’s qualifications. For “Your Money or Your Life” (YMYL) topics, which include health, finance, legal advice, safety, and news, Google expects content authored or reviewed by credentialed professionals. A medical article reviewed by a licensed physician and carrying their byline sends a stronger expertise signal than the same content published anonymously. For non-YMYL topics, “everyday expertise” suffices: a hobbyist photographer writing about camera settings has adequate expertise for that context.

Authoritativeness is about recognition within your field. Does this person or brand get cited by others? Do authoritative publications link to or reference this content? Are they invited to speak at industry events, quoted in news articles, or referenced in academic work? Authoritativeness is built over time through consistently publishing quality content and earning recognition from peers.

Trust is the foundation of the entire framework. Google’s Search Quality Rater Guidelines explicitly state that trust is the most important E-E-A-T factor. A site with deceptive practices (fake reviews, hidden affiliate relationships, misleading headlines), missing contact information, no clear editorial policy, or a history of publishing inaccurate information will fail the trust test regardless of how strong the other signals are. For e-commerce sites, trust signals include clear return policies, visible customer service contact information, secure payment processing, and transparent pricing.

Practically, implementing E-E-A-T means adding detailed author bios with credentials and links to professional profiles, displaying clear editorial standards, citing authoritative sources, keeping content accurate and updated, maintaining transparent business information, and ensuring your site’s security and privacy practices are beyond reproach.

The sites that consistently win in organic search are the ones that treat SEO not as a marketing tactic, but as a product discipline: every page engineered to serve both the user and the crawler, with no shortcuts on either side.

Making It Operational

These five categories are not a checklist to complete once. They are ongoing operational concerns that require different cadences and different skill sets.

Technical SEO needs quarterly audits at minimum, with continuous monitoring of Core Web Vitals and crawl errors through Google Search Console. On-page SEO should be baked into your content creation workflow: every page published should have an optimised title, proper heading structure, deliberate internal links, and a clear intent match. Content SEO requires an editorial calendar with planned audits of existing content alongside new production. Off-page SEO demands consistent outreach and relationship building, not sporadic bursts of link acquisition. Local SEO, for businesses that need it, requires weekly GBP management and an ongoing review generation process.

The sites that maintain strong organic visibility over years, not months, are the ones that treat each of these layers as a permanent operational function. Start with the layer that is weakest, fix the fundamentals, then compound your gains across the others. There are no shortcuts here, but the compounding returns from sustained, disciplined execution are substantial.