The Complete Content Audit Guide

Content Decay Analysis: A Data-Driven Framework

Content decay has four distinct patterns, each with different causes and different fixes. This framework uses GA4 + GSC data to detect, diagnose, and act on each one.

Published April 15, 2026
10 min read

Content decay is not a mystery. It's a measurable phenomenon with identifiable patterns, diagnosable causes, and predictable fixes. The problem isn't that people don't know content decays — everyone who's watched a blog post slowly lose traffic over 18 months understands decay intuitively. The problem is that most teams detect decay too late, diagnose it too vaguely ("the content is outdated"), and respond too generically ("let's refresh it").

This guide provides a systematic analytical methodology for content decay. It defines four distinct decay patterns, explains how to detect each using GA4 and GSC data in combination, and maps each pattern to a specific response. If you've read ten articles that explain what content decay "is," this is the one that tells you what to do about it with your actual data.

The methodology assumes you have GA4 and GSC connected to your site with at least 6 months of data. If you're working with less data, the trend analysis will be noisy — wait for the data to accumulate before drawing conclusions.

Content decay, defined briefly

Content decay is the gradual or sudden decline in a page's organic performance over time. Performance means traffic (GA4 sessions), visibility (GSC impressions), and rankings (GSC average position). A page that ranked position 3 and received 500 organic sessions per month in January but now ranks position 11 and receives 80 sessions is decaying.

Decay is different from content that never performed. A page that has received zero traffic since publication isn't decaying — it never worked. Decay applies to content that once performed and has since declined.

That's enough definition. The rest of this guide is methodology.

The four decay patterns

Not all decay looks the same. Treating it as one phenomenon leads to one-size-fits-all responses ("update the content") that often miss the actual problem. These four patterns cover the vast majority of content decay in the wild.

Pattern 1: Gradual erosion

What it looks like. A slow, steady decline over 6-18 months. No single month shows a dramatic drop. Each month is slightly worse than the previous one. It's easy to miss in monthly reporting because the month-over-month change is small, but the quarter-over-quarter change is significant.

What causes it. Competitors publish fresher, more comprehensive content. The SERP evolves — Google adds new SERP features, the intent interpretation shifts, or the ranking algorithm weighs freshness more heavily. The page's information becomes incrementally less current without becoming factually wrong.

How to detect it with data. Compare rolling 90-day traffic averages across three consecutive periods. If each period shows a decline of 10% or more from the prior period, and the GSC average position number is climbing (i.e., rankings are worsening), you're looking at gradual erosion.

  • GA4 signal: Organic sessions trending down 10-30% quarter-over-quarter
  • GSC signal: Average position worsening by 2-5 positions over 6 months
  • GSC impressions: Stable or slightly declining (the queries still exist; you're just ranking lower)
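To make this check repeatable across a large site, here is a minimal Python sketch. It assumes a daily per-page sessions export with date and sessions columns; those column names and the detect_gradual_erosion helper are illustrative, not GA4 API calls, and the GSC position drift would be checked separately.

```python
import pandas as pd

def detect_gradual_erosion(daily: pd.DataFrame, drop_threshold: float = 0.10) -> bool:
    """Flag gradual erosion: three consecutive 90-day windows, each one
    down by `drop_threshold` (10%) or more from the window before it.

    `daily` is assumed to hold one row per day for a single page with
    `date` and `sessions` columns (an illustrative schema, not a GA4
    export format). Pair a True result with a worsening GSC average
    position before concluding the page is eroding.
    """
    daily = daily.sort_values("date")
    sessions = daily.tail(270)["sessions"]
    if len(sessions) < 270:
        return False  # not enough history for three 90-day windows

    oldest = sessions.iloc[0:90].sum()
    middle = sessions.iloc[90:180].sum()
    newest = sessions.iloc[180:270].sum()

    return (
        middle <= oldest * (1 - drop_threshold)
        and newest <= middle * (1 - drop_threshold)
    )
```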

The fix. Content refresh. Update the page with current data, expanded sections, better structure, and fresher examples. Don't rewrite from scratch — the page's URL equity is valuable. Update in place. Check what the current top-3 ranking pages include that yours doesn't, and close the gaps.

Pattern 2: The cliff

What it looks like. A sudden, sharp drop — 50% or more of traffic disappearing within a 1-2 week window. Unmistakable in any reporting view.

What causes it. Google algorithm updates (core updates, helpful content updates, spam updates). A site-wide penalty or quality demotion. A technical change that broke the page's indexability (accidental noindex, canonical change, redirect). A competitor publishing a definitively better resource that captures the featured snippet or top position.

How to detect it with data. Compare week-over-week traffic in GA4. A drop of 40%+ in a single week that isn't explained by seasonality is a cliff event. Cross-reference the timing with Google's published algorithm update timeline.

  • GA4 signal: Abrupt traffic drop of 40%+ within 1-2 weeks
  • GSC signal: Position worsening by 5+ places in the same timeframe
  • GSC impressions: May remain stable (if it's a ranking drop) or drop sharply (if the query landscape changed)
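A cliff is simple to flag programmatically. The sketch below (same illustrative date/sessions schema as above) resamples daily sessions to weekly totals and looks for a 40%+ week-over-week drop; whether the cause is an algorithm update, a technical break, or competitive displacement still has to be diagnosed by hand.

```python
import pandas as pd

def detect_cliff(daily: pd.DataFrame, drop_threshold: float = 0.40) -> bool:
    """Flag a cliff: any week whose organic sessions fall `drop_threshold`
    (40%) or more versus the previous week.

    Assumes `date` and `sessions` columns for a single page (illustrative
    schema). Seasonality is not controlled for here, so cross-check any
    hit against last year's data and Google's update timeline.
    """
    weekly = (
        daily.assign(date=pd.to_datetime(daily["date"]))
        .set_index("date")["sessions"]
        .resample("W")
        .sum()
    )
    week_over_week = weekly.pct_change()  # first value is NaN, which never triggers
    return bool((week_over_week <= -drop_threshold).any())
```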

The fix. Depends on the cause. For algorithm updates, analyze what changed in the SERP — did the intent interpretation shift? Did Google start favoring a different content format? For technical changes, check indexation status and crawl data. For competitive displacement, study the page that replaced you and identify what it does better. Cliff events often require structural changes (new sections, different angle, format change), not just content refreshes.

Pattern 3: Seasonal decay

What it looks like. Traffic follows a recurring annual pattern — peaks during certain months, troughs during others. The page appears to be "decaying" during off-peak months but recovers when the seasonal cycle repeats.

What causes it. The underlying search query has seasonal demand. "Tax filing checklist" peaks in January-April. "Summer marketing campaign ideas" peaks in May-July. "Black Friday deals" peaks in November. The content itself hasn't changed — demand has.

How to detect it with data. Compare the same month year-over-year rather than month-over-month. If January 2026 traffic is roughly equal to January 2025 traffic, but July 2025 was much higher, you're seeing seasonality, not decay.

  • GA4 signal: Traffic follows a repeating annual curve when viewed over 12+ months
  • GSC signal: Impressions follow the same seasonal curve (proving the query demand is seasonal)
  • Google Trends: The keyword shows a recurring annual pattern
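The year-over-year comparison is also easy to script. This sketch (again assuming illustrative date/sessions columns) totals sessions by calendar month and compares a month against the same month one year earlier; a small change here alongside a big swing within the year points to seasonality, not decay.

```python
import pandas as pd

def year_over_year_change(daily: pd.DataFrame, month: str) -> float:
    """Fractional change for a calendar month versus the same month one
    year earlier, e.g. month="2026-01" compared against "2025-01".

    Assumes `date` and `sessions` columns (illustrative schema).
    """
    monthly = (
        daily.assign(month=pd.to_datetime(daily["date"]).dt.strftime("%Y-%m"))
        .groupby("month")["sessions"]
        .sum()
    )
    this_year = monthly[month]
    last_year = monthly[f"{int(month[:4]) - 1}-{month[5:]}"]
    return (this_year - last_year) / last_year
```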

The fix. Usually none. Seasonal content performing at similar levels year-over-year isn't decaying. The mistake is refreshing seasonal content during its off-peak period and interpreting the continued low traffic as "the refresh didn't work." Update seasonal content 4-6 weeks before its peak period, not during the trough. If year-over-year performance during the peak period is declining, then you have genuine decay overlaid on seasonality — apply Pattern 1 or Pattern 2 analysis to the peak-period data.

Pattern 4: Algorithmic displacement

What it looks like. Traffic doesn't decline gradually or suddenly — it fluctuates. The page bounces between position 4 and position 14, between 300 sessions/week and 40 sessions/week, with no clear trend direction. The volatility itself is the signal.

What causes it. Google is testing different SERP configurations. The page is on the boundary between ranking well and not ranking at all. Small algorithm adjustments push it above or below the fold. The page may also be competing with another page on your own site (cannibalization) and Google is alternating which one it ranks.

How to detect it with data. Plot daily GSC position data for the page's primary keyword. If the position oscillates by 5+ places on a weekly basis, you're seeing algorithmic displacement.

  • GA4 signal: High variance in weekly organic sessions (standard deviation > 50% of the mean)
  • GSC signal: Position oscillating across 5+ places within a single month
  • GSC click-through rate: Volatile, because CTR changes dramatically between position 4 and position 14
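Volatility can be quantified with the two thresholds listed above. The sketch below assumes the same illustrative date/sessions schema plus a date-indexed series of daily GSC average positions for the page's primary query; both inputs and the detect_displacement name are assumptions for illustration.

```python
import pandas as pd

def detect_displacement(daily: pd.DataFrame,
                        positions: pd.Series,
                        cv_threshold: float = 0.50,
                        swing_threshold: float = 5.0) -> bool:
    """Flag algorithmic displacement: weekly sessions whose standard
    deviation exceeds 50% of the mean, combined with the primary query's
    daily position swinging 5+ places over the last 30 days.

    `daily` has `date`/`sessions` columns; `positions` is a date-indexed
    Series of daily average position (illustrative schemas).
    """
    weekly = (
        daily.assign(date=pd.to_datetime(daily["date"]))
        .set_index("date")["sessions"]
        .resample("W")
        .sum()
    )
    traffic_volatile = weekly.std() > cv_threshold * weekly.mean()

    recent = positions.sort_index().tail(30)
    position_swinging = (recent.max() - recent.min()) >= swing_threshold

    return bool(traffic_volatile and position_swinging)
```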

The fix. Strengthen the page's authority and relevance signals so Google commits to the higher position. Add depth, improve internal linking, earn external links, and check for cannibalization with other pages on your site. If another page on your site is competing for the same query, consolidate — one strong page ranks better than two mediocre ones.

How to build the detection system

Detecting decay requires comparing data across time periods. Here's how to set it up.

Step 1: Define your baseline period

Choose a 90-day window when the page was performing at or near its peak. This becomes the benchmark against which you measure decline. For most content, the baseline period is 3-6 months after publication, when the page has settled into its natural ranking position.
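If you'd rather pick the baseline programmatically than eyeball a chart, one option is to take the 90-day window with the highest total sessions. A sketch under the same illustrative date/sessions assumptions:

```python
import pandas as pd

def pick_baseline_window(daily: pd.DataFrame):
    """Return (window_end_date, average_daily_sessions) for the 90-day
    window with the highest total sessions.

    Assumes `date` and `sessions` columns (illustrative schema). Early
    windows contain fewer than 90 days of data, which is usually fine
    because the peak almost always falls in a full window.
    """
    sessions = (
        daily.assign(date=pd.to_datetime(daily["date"]))
        .set_index("date")["sessions"]
        .sort_index()
    )
    rolling_total = sessions.rolling("90D").sum()
    return rolling_total.idxmax(), rolling_total.max() / 90
```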

Step 2: Set decline thresholds

Not every traffic fluctuation is decay. Define thresholds that trigger investigation:

  • Yellow alert (investigate): 20% decline in organic sessions compared to baseline, sustained for 30+ days
  • Red alert (act): 40% decline in organic sessions compared to baseline, sustained for 30+ days
  • Cliff alert (urgent): 50%+ decline within a 14-day window

These thresholds filter out normal variance and surface genuine decay patterns.
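Encoded as a function, the thresholds might look like the sketch below. The inputs are average daily sessions for the baseline window, the last 30 days, and the last 14 days; the classify_alert name is an assumption, while the cut-offs mirror the list above.

```python
def classify_alert(baseline_daily: float,
                   recent_30d_daily: float,
                   recent_14d_daily: float) -> str:
    """Map a page's decline against the yellow / red / cliff thresholds.

    All inputs are average organic sessions per day; `baseline_daily`
    must be greater than zero (pages that never performed aren't decay
    candidates in the first place).
    """
    drop_30d = 1 - recent_30d_daily / baseline_daily
    drop_14d = 1 - recent_14d_daily / baseline_daily

    if drop_14d >= 0.50:
        return "cliff"    # urgent: 50%+ down within a 14-day window
    if drop_30d >= 0.40:
        return "red"      # act: 40%+ down, sustained
    if drop_30d >= 0.20:
        return "yellow"   # investigate: 20%+ down, sustained
    return "ok"
```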

Step 3: Blend GA4 and GSC data

The most revealing analysis comes from correlating GA4 and GSC data at the page level. Traffic can decline for three distinct reasons, and each requires a different fix:

  • Fewer impressions (GSC impressions down): Google is showing your page to fewer people. The query landscape may have changed, or your page may have dropped out of the results for queries it used to rank for. This is a visibility problem.

  • Lower CTR (GSC impressions stable, clicks down): Google is still showing your page, but fewer people are clicking. Your SERP snippet (title, description) may be less compelling than competitors'. This is a presentation problem.

  • Lower position (GSC position worsening): Competitors have overtaken you. This is a relevance or authority problem.

Each diagnosis points to a different fix. Impressions decline → investigate query changes and indexation. CTR decline → update title tag and meta description. Position decline → refresh content and build links.
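Done by hand, the blend is a page-level join of the two exports followed by a simple labeling rule. The sketch below assumes pre-computed change columns (sessions_change, impressions_change, ctr_change, position_change, with a positive position_change meaning places lost); the column names, cut-offs, and diagnose helper are all illustrative.

```python
import pandas as pd

def diagnose(ga4: pd.DataFrame, gsc: pd.DataFrame) -> pd.DataFrame:
    """Join GA4 and GSC page-level data and label the dominant problem.

    `ga4` is assumed to have `page` and `sessions_change`; `gsc` is
    assumed to have `page`, `impressions_change`, `ctr_change`, and
    `position_change` (fractional changes vs. baseline, except
    position_change, which is places lost). Illustrative schemas.
    """
    merged = ga4.merge(gsc, on="page", how="inner")

    def label(row):
        if row["impressions_change"] <= -0.20:
            return "visibility: investigate query changes and indexation"
        if row["position_change"] >= 2:
            return "relevance/authority: refresh content and build links"
        if row["ctr_change"] <= -0.20:
            return "presentation: rewrite title tag and meta description"
        return "no clear driver"

    merged["diagnosis"] = merged.apply(label, axis=1)
    return merged
```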

Evergreen's GA4 and GSC integrations merge this data automatically at the page level, making the blended analysis available in the content audit table without spreadsheet merges.

Step 4: Prioritize by traffic impact

Not all decaying pages deserve equal attention. Prioritize by the absolute traffic at stake:

  • A page declining from 5,000 sessions/month to 3,000 sessions/month is a higher priority than a page declining from 50 to 30, even though both are 40% drops.
  • A page targeting a high-intent keyword (product comparison, buying guide) matters more per session than a page targeting an informational keyword, because intent correlates with conversion probability.

Sort your decaying pages by estimated traffic loss (baseline sessions minus current sessions) and work from the top.
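As a sketch, the prioritization is a subtraction and a sort. The snippet below assumes per-page baseline_sessions and current_sessions columns, plus an optional intent_weight column to nudge high-intent pages up the list; all three names are illustrative.

```python
import pandas as pd

def prioritize(pages: pd.DataFrame) -> pd.DataFrame:
    """Rank decaying pages by estimated monthly traffic lost, optionally
    weighted by intent.

    Assumes `baseline_sessions` and `current_sessions` columns (monthly
    totals per page) and an optional `intent_weight` column, e.g. 2.0
    for high-intent commercial pages and 1.0 otherwise. Illustrative
    schema, not an export default.
    """
    pages = pages.copy()
    pages["traffic_loss"] = pages["baseline_sessions"] - pages["current_sessions"]
    weight = pages.get("intent_weight", 1.0)
    pages["priority_score"] = pages["traffic_loss"] * weight
    return pages.sort_values("priority_score", ascending=False)
```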

The decision framework: refresh, retire, or consolidate

Every decaying page gets one of three responses.

Refresh

When to use it. The page's topic is still relevant. The search query still has volume. The page has earned links or has strong internal link equity. Competitors have newer, better content on the same topic.

What to do. Update the content with current data, examples, and structure. Don't change the URL. Don't change the publication date (update the "last modified" date instead). Add new sections that cover gaps in the current version. Remove sections that are no longer relevant. Improve the opening — if the first two paragraphs feel dated, the reader won't trust the rest.

Expected timeline. 2-8 weeks for ranking recovery after a meaningful refresh. This varies by keyword competitiveness and the extent of the update.

Retire

When to use it. The topic is no longer relevant to your audience. The search query has dried up (volume dropped to near zero). The content doesn't align with your current positioning.

What to do. 301 redirect to the most relevant remaining page. If no relevant page exists, redirect to the parent category or a related guide. Do not leave the page live — a low-quality page that ranks poorly can drag down the site's overall quality signals.

Consolidate

When to use it. Multiple pages on your site cover similar topics and compete for similar queries. GSC shows the same keywords triggering multiple URLs from your site.
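Cannibalization candidates can be pulled straight from a GSC query export. The sketch below assumes query, page, and clicks columns (illustrative names) and lists the queries where two or more URLs each earn a meaningful number of clicks.

```python
import pandas as pd

def find_cannibalization(gsc_queries: pd.DataFrame, min_clicks: int = 10) -> pd.DataFrame:
    """List queries for which two or more URLs on the same site earn
    at least `min_clicks` clicks each.

    Assumes a GSC export with `query`, `page`, and `clicks` columns
    (illustrative schema). The result is a consolidation shortlist,
    not a verdict; check intent overlap before merging pages.
    """
    per_pair = gsc_queries.groupby(["query", "page"], as_index=False)["clicks"].sum()
    per_pair = per_pair[per_pair["clicks"] >= min_clicks]

    contested = per_pair.groupby("query")["page"].nunique()
    contested = contested[contested >= 2].index

    return per_pair[per_pair["query"].isin(contested)].sort_values(
        ["query", "clicks"], ascending=[True, False]
    )
```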

What to do. Choose the strongest URL (most links, highest historical traffic, best position). Merge the best content from all competing pages into that URL. 301 redirect the other URLs to the consolidated page. The result is one comprehensive page with combined link equity instead of three thin pages splitting it.

For a broader view of how consolidation fits into the full content audit process, see the content audit template.

How Evergreen surfaces content decay

The content audit table in Evergreen shows traffic trends directly alongside content attributes. With GA4 and GSC connected, every page in the audit table displays its traffic trajectory — growing, stable, or declining — based on rolling 90-day comparisons.

Sorting the audit table by traffic decline over 90 days puts your most rapidly decaying pages at the top. Filtering to pages with declining traffic and no update in the past 6 months isolates the highest-priority refresh candidates. The blended GA4 + GSC view shows whether the decline is driven by impressions, position, or CTR — so you know whether the fix is content, technical, or presentational.

For agencies managing multiple client sites, the same analysis runs across every project in the dashboard. Decay detection across 10 client sites takes minutes instead of the hours required to pull, merge, and analyze data in separate spreadsheets for each client.

Detect decay before it costs you traffic → Start free

Related Topics in The Complete Content Audit Guide

The Only Content Audit Template You'll Ever Need

A ready-to-use content audit template with scoring frameworks, action categories, and step-by-step instructions for auditing any website.

What is a Content Gap Analysis? (And How to Do One)

Learn what content gap analysis is, why it matters for SEO, and how to identify missing content opportunities on your website.

Content Audit Without Spreadsheets: A Visual Approach

Spreadsheets are where content audits go to die. Here's how to run a content audit visually — with sortable views, filters, and a sitemap overlay instead of row 847.

How to Find Pages Missing Meta Descriptions (Site-Wide)

Find every page on your site that's missing a meta description — and decide which ones actually need one. A short, practical guide with a site-wide auditing workflow.

Bulk Meta Description Checker: Audit Every Page at Once

Checking meta descriptions one page at a time doesn't scale past a dozen URLs. Here's how to audit every meta description on your site in a single pass — catching missing, duplicate, and truncated descriptions before they cost you clicks.

Combine GA4 + Search Console Data for Page-Level Insights

GA4 tells you what visitors do. Search Console tells you how they found you. Combining them per page — without Looker Studio or Python — is how you find the pages worth fixing.

Content Audit Template 2026 (No Spreadsheet Required)

A complete content audit template updated for 2026 — covering AI content, LLM visibility, and content decay. Use the downloadable spreadsheet or skip it entirely with a visual alternative.

Content Audit Checklist Built for Agencies

A 20-point content audit checklist designed for agencies managing multiple client sites. Covers discovery, audit execution, deliverable creation, and ongoing monitoring — with pricing math.

How Often Should You Run a Content Audit?

The answer is continuously, with monthly reviews. Here's why the annual audit cycle is obsolete, when event-driven audits are necessary, and how continuous monitoring replaces the spreadsheet ritual.