Combine GA4 + Search Console Data for Page-Level Insights
GA4 tells you what visitors do. Search Console tells you how they found you. Combining them per page — without Looker Studio or Python — is how you find the pages worth fixing.
GA4 tells you what people do on your site. Search Console tells you how they found it. Neither tells the full story alone.
A page with 5,000 GSC impressions and 200 clicks but a 90% bounce rate in GA4 is a page that wins the click and loses the visitor. A page with 50 sessions per month in GA4 but zero GSC impressions is getting all its traffic from non-search channels — which changes how you think about its SEO potential entirely. You need both datasets, joined at the page level, to make decisions that account for the full lifecycle from impression to engagement.
The problem is that combining them has historically been annoying. GA4 and Search Console use different URL formats, different date granularities, and different attribution models. The standard approach involves exporting CSVs from both, wrestling with VLOOKUP in a spreadsheet, or building a Looker Studio dashboard that breaks whenever Google changes its API schema.
This guide covers why page-level data blending matters, the traditional way to do it, and the path that skips the manual work entirely.
Why GA4 and Search Console data belong together
Each data source answers a different question. Alone, each answer is incomplete.
What Search Console tells you
Search Console reports on the discovery phase — the moment between a user's query and their decision to click (or not). For each page, you get:
- Impressions: How often the page appeared in search results
- Clicks: How often someone chose your result
- CTR: The ratio of clicks to impressions
- Average position: Where you ranked (averaged over the reporting period)
This tells you which pages are visible in search and how compelling your titles and descriptions are. It does not tell you what happens after the click.
What GA4 tells you
GA4 reports on the engagement phase — what users do once they arrive. For each page, you get:
- Sessions: How many visits the page received (from all channels, not just organic)
- Engaged sessions: Sessions that lasted longer than 10 seconds, triggered a conversion event, or included at least two page views
- Engagement rate: The percentage of sessions that qualified as engaged
- Average engagement time: How long visitors actually spent
- Conversions: Events you've defined as goals (signups, downloads, purchases)
This tells you whether people find value after arriving. It does not tell you how they found the page or how visible it is in search.
The insight gap
The useful questions live at the intersection:
- High impressions, low clicks, high engagement rate: Your page ranks but your title tag and meta description are weak. The content is good — the SERP snippet is the bottleneck.
- High clicks, low engagement rate: Your SERP snippet overpromises. People click expecting one thing and find another. Rewrite the content to match the query intent or rewrite the meta description to set accurate expectations.
- Low impressions, high engagement rate: The page converts well but has no search visibility. It's a candidate for SEO investment — better keyword targeting, internal linking, or content expansion.
- Declining impressions + declining engagement: Content decay. The page is losing search visibility and the content is no longer resonating. Time to refresh or consolidate.
- High impressions, high clicks, zero conversions: The page attracts search traffic but doesn't contribute to business goals. Consider adding CTAs, restructuring the content flow, or adjusting what you define as a conversion.
None of these patterns are visible in either dataset alone.
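Once the two datasets are joined per page, the patterns above can be expressed as a simple classifier. This is an illustrative sketch: the thresholds (100 impressions, 2% CTR, 60% engagement, and so on) are assumptions to tune against your own baselines, not GA4 or GSC standards.

```python
def classify_page(impressions, clicks, engagement_rate, conversions):
    """Map joined GSC + GA4 metrics for one page to a diagnostic pattern.

    All thresholds are illustrative; tune them to your site's baselines.
    """
    ctr = clicks / impressions if impressions else 0.0
    if impressions > 1000 and ctr < 0.02 and engagement_rate > 0.6:
        return "weak SERP snippet"         # ranks and engages, but loses the click
    if ctr >= 0.05 and engagement_rate < 0.3:
        return "snippet overpromises"      # wins the click, loses the visitor
    if impressions < 100 and engagement_rate > 0.6:
        return "SEO investment candidate"  # converts, but invisible in search
    if impressions > 1000 and clicks > 50 and conversions == 0:
        return "no business impact"        # search traffic without conversions
    return "no clear pattern"

# A page with 5,000 impressions but a 1.2% CTR and strong engagement:
print(classify_page(impressions=5000, clicks=60, engagement_rate=0.7, conversions=3))
```

Detecting the decay pattern additionally needs period-over-period deltas, which a single-snapshot function like this cannot see.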
The traditional approach: Looker Studio or spreadsheets
Credit where due — the most common solution works, and many teams rely on it.
The Looker Studio path
Looker Studio (formerly Google Data Studio) connects to both GA4 and Search Console natively. You create two data sources, add them to a report, and blend them using the page URL as the join key.
Where it works well. If you already live in Looker Studio and have a reporting workflow built around it, adding a blended data source is a reasonable extension. The visualization options are decent, and the reports update automatically.
Where it gets difficult. URL matching is the first pain point. Search Console uses the full canonical URL (including trailing slashes, query parameters, and protocol). GA4 uses the page path (without domain, often without trailing slash). A page that's https://example.com/blog/my-post/ in GSC might be /blog/my-post in GA4. Your blend silently drops rows whenever the formats diverge.
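Whatever tool you blend in, the fix is the same: reduce both URL formats to one canonical join key before matching. A minimal sketch of that normalization in Python (this is one reasonable convention, not Looker Studio's internal behavior):

```python
from urllib.parse import urlsplit

def normalize_url(url: str) -> str:
    """Reduce a GSC canonical URL or a GA4 page path to a common join key.

    Drops protocol, domain, query string, and trailing slash, and lowercases.
    """
    parts = urlsplit(url)
    path = parts.path or "/"
    path = path.rstrip("/") or "/"  # keep "/" for the homepage itself
    return path.lower()

# The GSC and GA4 forms of the same page collapse to one key:
print(normalize_url("https://example.com/Blog/My-Post/?utm_source=x"))  # /blog/my-post
print(normalize_url("/blog/my-post"))                                   # /blog/my-post
```

Note the homepage edge case: stripping the trailing slash from `/` would leave an empty key, so it is special-cased.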
Date alignment is the second problem. Search Console data lags by 2–3 days. GA4 data is near-real-time. If you blend on a daily granularity, the most recent days show GA4 data with no corresponding GSC data, which skews any rate calculations.
The third issue is scale. Looker Studio handles a few hundred rows well. When you have a 5,000-page site and want to analyze every page, the blending step slows dramatically, and the interface becomes difficult to navigate.
The spreadsheet path
Export GA4 landing page data as CSV. Export Search Console performance data as CSV. Import both into Google Sheets or Excel. Use VLOOKUP or INDEX/MATCH to join on URL. Manually normalize URLs first (lowercase, remove trailing slashes, strip query strings).
This works for one-time analysis. It does not work for ongoing monitoring because you'd need to re-export, re-import, and re-normalize every time you want fresh data. Most teams do this once, discover three interesting insights, and never update the spreadsheet again.
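For teams comfortable with a little Python, a pandas merge replaces the VLOOKUP step and makes re-running the join trivial. The column names below are assumptions for illustration, not the exact headers of the GA4 and GSC CSV exports:

```python
import pandas as pd

# Toy stand-ins for the two CSV exports
ga4 = pd.DataFrame({
    "pagePath": ["/blog/my-post", "/pricing"],
    "sessions": [420, 180],
    "engagementRate": [0.55, 0.71],
})
gsc = pd.DataFrame({
    "page": ["https://example.com/blog/my-post/", "https://example.com/pricing"],
    "impressions": [5000, 900],
    "clicks": [200, 60],
})

# Normalize both URL formats to a shared join key before merging
gsc["key"] = (gsc["page"]
              .str.replace(r"^https?://[^/]+", "", regex=True)
              .str.rstrip("/").str.lower())
ga4["key"] = ga4["pagePath"].str.rstrip("/").str.lower()

# An outer join keeps pages that exist in only one source, which a
# VLOOKUP-style exact match would silently drop
combined = ga4.merge(gsc, on="key", how="outer")
print(combined[["key", "sessions", "engagementRate", "impressions", "clicks"]])
```

If you join daily rows instead of period totals, drop the most recent 2 to 3 days from both frames first, so GSC's reporting lag doesn't skew any rate calculations.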
The automatic approach: native integration
Some tools — Evergreen among them — integrate GA4 and Search Console natively and handle the URL matching, date alignment, and ongoing sync automatically. The tradeoff is straightforward: instead of building and maintaining the data pipeline yourself, you let the tool do it.
The result is a unified view where every page row in your audit table includes both GA4 engagement data and GSC search performance data, updated on every sync. No export steps, no URL normalization, no broken VLOOKUPs.
The question is whether the data you get from the unified view is worth the shift in tooling. For teams running content audits, the answer is almost always yes — because the decisions you need to make require both datasets, and the manual join is the step where most audit workflows stall.
What to do with combined data
Having the data in one place is the prerequisite. Using it to make decisions is the point. Here are the five highest-leverage analyses you can run once GA4 and Search Console data are joined per page.
1. Find your best SEO candidates
Filter to pages with:
- High engagement rate (above 60%)
- Low GSC impressions (below 100/month)
- Existing content (word count above 500)
These are pages that convert visitors well but aren't visible in search. They're your best candidates for SEO improvement because the content already works — it just needs more traffic. Add internal links, optimize the title and description, expand the content around target keywords, and monitor impressions over the following weeks.
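With the joined data in a table, this filter is one expression. The thresholds below are the ones from the checklist above; treat them as starting points:

```python
import pandas as pd

# Joined per-page data (illustrative values)
pages = pd.DataFrame({
    "url": ["/blog/deep-dive", "/blog/news", "/guides/setup"],
    "engagement_rate": [0.72, 0.35, 0.65],
    "gsc_impressions": [40, 2500, 90],
    "word_count": [1800, 300, 1200],
})

# High engagement, low search visibility, substantial existing content
candidates = pages[
    (pages["engagement_rate"] > 0.60)
    & (pages["gsc_impressions"] < 100)
    & (pages["word_count"] > 500)
]
print(candidates["url"].tolist())  # pages worth SEO investment
```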
2. Fix your SERP snippet mismatches
Filter to pages with:
- High GSC impressions (above 1,000/month)
- Below-average CTR for their position range
- Low engagement rate (below 30%)
These pages are visible in search but failing twice: first at the SERP (low CTR) and then on the page (low engagement). The title tag, meta description, and content are all misaligned with user intent. Start by researching what query the page actually ranks for (GSC provides this), then rewrite both the SERP snippet and the page content to match that intent.
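"Below-average CTR for their position range" needs a baseline to compare against. One sketch: bucket pages by rounded average position and compare each page's CTR to its bucket's mean (the data and thresholds are illustrative):

```python
import pandas as pd

pages = pd.DataFrame({
    "url": ["/a", "/b", "/c", "/d"],
    "impressions": [5000, 3000, 1200, 400],
    "clicks": [60, 150, 55, 30],
    "avg_position": [4.2, 3.8, 4.9, 6.1],
    "engagement_rate": [0.25, 0.62, 0.70, 0.50],
})
pages["ctr"] = pages["clicks"] / pages["impressions"]

# Bucket by rounded position, then compare each page to its bucket's mean CTR
pages["pos_bucket"] = pages["avg_position"].round()
bucket_avg = pages.groupby("pos_bucket")["ctr"].transform("mean")

# Visible in search, underperforming at the SERP, and losing visitors on-page
mismatches = pages[
    (pages["impressions"] > 1000)
    & (pages["ctr"] < bucket_avg)
    & (pages["engagement_rate"] < 0.30)
]
print(mismatches["url"].tolist())
```

A larger site would support finer baselines, such as per-query or per-device CTR curves.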
3. Detect content decay early
Sort pages by the delta between current-period and previous-period metrics. Look for pages where:
- GSC impressions declined more than 20%
- GA4 sessions declined more than 20%
- Both trends are negative over the same period
A decline in both metrics simultaneously is a strong signal of content decay — the page is losing search visibility and the remaining traffic is less engaged. This is different from a seasonality dip (where traffic drops but engagement stays stable) or a ranking fluctuation (where impressions drop but bounce back within a week).
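The delta check can be sketched as two period-over-period ratios; the 20% threshold comes from the checklist above, and the columns are assumed names for the two periods' exports:

```python
import pandas as pd

pages = pd.DataFrame({
    "url": ["/guide-a", "/guide-b", "/guide-c"],
    "impressions_prev": [8000, 3000, 1000],
    "impressions_curr": [5600, 2900, 1300],
    "sessions_prev": [600, 250, 90],
    "sessions_curr": [420, 260, 100],
})

# Period-over-period change; both dropping beyond -20% flags decay
pages["imp_delta"] = pages["impressions_curr"] / pages["impressions_prev"] - 1
pages["ses_delta"] = pages["sessions_curr"] / pages["sessions_prev"] - 1

decaying = pages[(pages["imp_delta"] < -0.20) & (pages["ses_delta"] < -0.20)]
print(decaying["url"].tolist())
```

Comparing the same month year-over-year, rather than consecutive periods, helps rule out the seasonality dip described above.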
4. Prioritize by business impact
If you have conversion events configured in GA4, you can calculate a rough "SEO revenue potential" for each page:
- Current conversion rate (from GA4 conversions data)
- Current search visibility (from GSC impressions and position)
- Potential traffic uplift if position improves (estimated from CTR curves)
- Projected conversion uplift = potential traffic × current conversion rate
This isn't a precise forecast, but it's a better prioritization framework than "fix all the pages with missing meta descriptions" — because it focuses effort on pages where improved rankings translate to measurable business outcomes.
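A back-of-the-envelope version of that calculation, assuming a CTR-by-position curve. The curve values below are placeholders; real curves vary by industry and by which SERP features appear for the query:

```python
# Illustrative CTR-by-position curve; treat these numbers as placeholders
CTR_CURVE = {1: 0.28, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05, 6: 0.04, 7: 0.03}

def projected_conversion_uplift(impressions, current_position,
                                target_position, conversion_rate):
    """Estimate extra conversions if the page moves up the rankings."""
    current_ctr = CTR_CURVE.get(round(current_position), 0.02)
    target_ctr = CTR_CURVE.get(round(target_position), 0.02)
    extra_clicks = impressions * max(target_ctr - current_ctr, 0)
    return extra_clicks * conversion_rate

# 10k monthly impressions, moving from position 6 to position 3,
# with a 2% conversion rate:
print(projected_conversion_uplift(10_000, 6, 3, 0.02))
```

Ranking the whole site by this estimate turns the audit backlog into a rough priority order, even though the absolute numbers are soft.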
5. Audit by section, not by page
Group pages by URL directory (e.g., /blog/, /products/, /docs/) and compare the aggregated metrics:
| Section | Avg. GSC impressions | Avg. CTR | Avg. engagement rate | Pages |
|---|---|---|---|---|
| /blog/ | 2,400 | 3.2% | 42% | 120 |
| /products/ | 800 | 5.1% | 68% | 35 |
| /docs/ | 1,200 | 4.8% | 71% | 85 |
| /resources/ | 150 | 1.8% | 28% | 45 |
This view reveals section-level patterns. In the example above, /resources/ has low visibility, low CTR, and low engagement — it's a candidate for restructuring or consolidation. /docs/ has strong engagement but moderate impressions — it might benefit from SEO optimization to increase its search visibility.
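The section rollup is a groupby on the first path segment. A sketch with illustrative data and assumed column names:

```python
import pandas as pd

pages = pd.DataFrame({
    "url": ["/blog/a", "/blog/b", "/docs/x", "/products/p"],
    "impressions": [3000, 1800, 1200, 800],
    "ctr": [0.030, 0.034, 0.048, 0.051],
    "engagement_rate": [0.40, 0.44, 0.71, 0.68],
})

# First path segment as the section key, e.g. "/blog/a" -> "/blog/"
pages["section"] = "/" + pages["url"].str.split("/").str[1] + "/"

summary = pages.groupby("section").agg(
    avg_impressions=("impressions", "mean"),
    avg_ctr=("ctr", "mean"),
    avg_engagement=("engagement_rate", "mean"),
    pages=("url", "count"),
)
print(summary)
```

Means hide outliers, so for large sections it is worth also checking medians or the spread within each directory.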
How Evergreen handles GA4 + Search Console data
Evergreen connects to GA4 and Google Search Console via native OAuth integrations. After you authorize both, the audit table automatically includes columns for GSC impressions, clicks, CTR, average position, GA4 sessions, engagement rate, and engagement time — alongside the standard crawl data (title, meta description, H1, word count, Lighthouse score, internal links).
The URL matching is handled during ingestion. Evergreen normalizes both GA4 page paths and GSC canonical URLs to a common format, so the data joins correctly even when trailing slashes, query parameters, or protocol differences exist.
On the Pro plan, daily syncs refresh both GA4 and GSC data alongside the crawl data. You can sort and filter across all columns — "show me pages with more than 500 GSC impressions and an engagement rate below 40%" is a single filter operation.
The combined data flows into shareable reports. If you're an agency sending a content audit to a client, the report includes search performance and user engagement data in the same view — no separate Looker Studio link, no attached spreadsheet.
Connect GA4 and GSC in one place → Start free
Frequently asked questions
Do I need GA4 and Search Console on the same Google account?
No. Evergreen connects to each service independently via OAuth. You can use different Google accounts for GA4 and Search Console — you just need read access to the relevant property in each.
How far back does the combined data go?
Search Console retains 16 months of data. GA4's event-level data retention defaults to 2 months and can be extended to 14 months; the setting affects explorations, while standard aggregated reports are not limited by it. The combined view in Evergreen reflects whatever date range both sources can provide.
Can I combine data from GA4 and Search Console without a third-party tool?
Yes. Looker Studio blends them natively, and manual CSV export plus spreadsheet joining works for one-time analysis. The limitation is ongoing maintenance — manual approaches require re-exporting and re-joining every time you want fresh data, and URL-matching issues silently drop rows. The tradeoff with a tool like Evergreen is automation and reliability versus the flexibility of building your own pipeline.
What about Universal Analytics data?
Universal Analytics stopped processing data in July 2023. If you need historical UA data combined with Search Console data, you'll need to work from exports — neither Google nor any third-party tool connects to UA properties anymore.
Your next step: see your GA4 and Search Console data in one audit table → Create free account
Related Topics in The Complete Content Audit Guide
The Only Content Audit Template You'll Ever Need
A ready-to-use content audit template with scoring frameworks, action categories, and step-by-step instructions for auditing any website.
What is a Content Gap Analysis? (And How to Do One)
Learn what content gap analysis is, why it matters for SEO, and how to identify missing content opportunities on your website.
Content Audit Without Spreadsheets: A Visual Approach
Spreadsheets are where content audits go to die. Here's how to run a content audit visually — with sortable views, filters, and a sitemap overlay instead of row 847.
How to Find Pages Missing Meta Descriptions (Site-Wide)
Find every page on your site that's missing a meta description — and decide which ones actually need one. A short, practical guide with a site-wide auditing workflow.
Bulk Meta Description Checker: Audit Every Page at Once
Checking meta descriptions one page at a time doesn't scale past a dozen URLs. Here's how to audit every meta description on your site in a single pass — catching missing, duplicate, and truncated descriptions before they cost you clicks.
Content Audit Template 2026 (No Spreadsheet Required)
A complete content audit template updated for 2026 — covering AI content, LLM visibility, and content decay. Use the downloadable spreadsheet or skip it entirely with a visual alternative.
Content Audit Checklist Built for Agencies
A 20-point content audit checklist designed for agencies managing multiple client sites. Covers discovery, audit execution, deliverable creation, and ongoing monitoring — with pricing math.
Content Decay Analysis: A Data-Driven Framework
Content decay has four distinct patterns, each with different causes and different fixes. This framework uses GA4 + GSC data to detect, diagnose, and act on each one.
How Often Should You Run a Content Audit?
The answer is continuously, with monthly reviews. Here's why the annual audit cycle is obsolete, when event-driven audits are necessary, and how continuous monitoring replaces the spreadsheet ritual.
