
/ GSC + AI

Your AI is only as good as the GSC data you paste into it.

Google Search Console caps what you can see: 1,000 rows in the UI, 16 months of history, and nearly half of all queries permanently withheld for privacy reasons. Every time you ask an AI to help with SEO, you start by copying truncated data into a chat window and hoping it's enough. That SEO Agent connects your AI directly to the Search Console API — so it queries your live data, filters by dimension, paginates past the UI limits, and works with what's actually there.

Data sources: Google Search Console API · Google Analytics 4 · PageSpeed Insights · Live crawler
~47%
of clicks come from anonymized queries Google permanently withholds
Ahrefs / Patrick Stox — 22B clicks, 887K properties (2025)
1,000
row cap in the GSC UI — the API exposes up to 50,000 rows per day
Google Search Console API documentation
50K
rows/day via API — 50× more than the UI allows
Google Search Console API — Usage Limits documentation
16mo
data retention limit — older data is permanently deleted by Google
Google Search Console Help — Performance report

01

What GSC data actually tells you — and what it doesn't

Google Search Console's Performance report exposes four metrics for organic search: clicks, impressions, click-through rate, and average position. These can be filtered and grouped across four dimensions: query, page, country, and device. The Search Console API extends this with RE2 regex filtering, Discover and Google News data, and the ability to paginate up to 50,000 rows per day per search type — well beyond what the UI shows. This is the official record of how Googlebot sees and indexes your site: not estimated, not modeled, drawn directly from Google's index and clickstream data.
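The shape of such an API request can be sketched in a few lines. This is an illustrative helper, not official client code: the `build_query_body` name is ours, and the dates and filter expression are placeholders, but the field names and the `includingRegex` operator follow Google's documented `searchanalytics.query` request body.

```python
def build_query_body(start_date, end_date, dimensions,
                     regex_filter=None, row_limit=25_000, start_row=0):
    """Build a request body for the Search Console searchanalytics.query
    endpoint. rowLimit is capped at 25,000 per request; startRow is what
    makes pagination past that cap possible."""
    body = {
        "startDate": start_date,      # "YYYY-MM-DD"
        "endDate": end_date,
        "dimensions": dimensions,     # e.g. ["query", "page", "device"]
        "rowLimit": row_limit,
        "startRow": start_row,
    }
    if regex_filter:
        # RE2 regex filtering on the query dimension, as the API supports
        body["dimensionFilterGroups"] = [{
            "filters": [{
                "dimension": "query",
                "operator": "includingRegex",
                "expression": regex_filter,
            }]
        }]
    return body
```

For example, `build_query_body("2025-01-01", "2025-03-31", ["query"], regex_filter="^how to")` builds a quarter-long query report restricted to question-style searches.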

Average position is the topmost position your page occupied in a given search result, averaged across all queries that generated an impression. The critical constraint, documented by Google, is that a position is only recorded when a user actually sees your result. If your page ranks 35th and a user never scrolls that far, that query contributes zero to your average. This means average position skews toward queries where you rank high enough to be seen — a drop in average position can mean you started ranking for more competitive terms, not that you lost ground on existing ones.
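The skew is easy to make concrete. A minimal sketch of an impression-weighted average (Google's exact internal accounting is its own; this only illustrates the visibility effect):

```python
def average_position(records):
    """records: (position, impressions) pairs. A result the user never
    scrolls to logs zero impressions, so it contributes nothing."""
    impressions = sum(imp for _, imp in records)
    if impressions == 0:
        return None
    return sum(pos * imp for pos, imp in records) / impressions

# Page ranks #3 for one query; its #35 ranking is never seen:
average_position([(3, 100), (35, 0)])   # -> 3.0

# The #35 query starts getting seen -- the average "worsens"
# to ~13.7 with no actual ranking loss on the original query:
average_position([(3, 100), (35, 50)])
```

The second call illustrates the trap in the paragraph above: more visibility on a competitive term drags the average down even though nothing was lost.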

Google withholds queries that were not searched by "more than a few dozen users" over any two-to-three month period. These anonymized queries still count in your total click and impression figures — meaning your aggregate numbers look complete — but the specific phrases are never disclosed. An Ahrefs analysis of 22 billion clicks across 887,534 GSC properties in 2025 found that 46.77% of all traffic came from anonymized queries. In 2021, when researchers first surfaced the scale of this, Google updated its own help documentation to remove the word "very rare" from its description of anonymized queries — an acknowledgment that the term was not accurate.

GSC retains performance data for 16 months. Data older than that is deleted from Google's databases and cannot be recovered by any method. The API explicitly documents that even at full scale it "does not guarantee to return all data rows but rather top ones" due to internal system constraints. Data freshness adds another variable: new data typically becomes available two to three days after it is generated.

46.77%
of all organic clicks come from queries Google permanently withholds from GSC
Ahrefs — 22B clicks, 887K GSC properties (2025)
The traffic is counted in your aggregate totals. The query strings driving it are permanently unavailable — regardless of what tool you use to access GSC.

Stop analyzing GSC samples. Connect your AI to the full dataset.

02

Why manual GSC analysis breaks at scale

The standard workflow for using AI to analyze GSC data is: open GSC, apply filters, export up to 1,000 rows as CSV, open a spreadsheet or upload to a chat window, write a prompt that explains what the data represents, wait for analysis, then repeat with a different filter set if the first export was insufficient. Every session starts from scratch. The AI has no memory of the last export, no awareness of what filters were applied, and no ability to follow up by querying additional dimensions.

At 1,000 rows, a medium-sized site with 2,000 indexed pages cannot see all its page-level data in a single UI export. A site ranking for 50,000 queries cannot see the full query distribution. The workaround — segmenting by subfolder and running separate exports — multiplies the manual work linearly with site size. The API allows up to 25,000 rows per individual request and 50,000 rows per day per search type, with pagination for larger sets. This is not a marginal improvement: it is the difference between sampling and full analysis.
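Pagination itself is mechanical: keep advancing `startRow` until the API returns fewer rows than requested. A sketch with the network call abstracted out (`fetch_page` stands in for whatever client wrapper issues the actual `searchanalytics.query` request):

```python
def paginate(fetch_page, page_size=25_000):
    """Yield every row by advancing startRow in page_size steps.
    fetch_page(start_row, row_limit) should return a list of rows."""
    start_row = 0
    while True:
        rows = fetch_page(start_row, page_size)
        if not rows:
            return
        yield from rows
        if len(rows) < page_size:  # short page -> no more data
            return
        start_row += page_size
```

The same loop works for any page size, so the 25,000-row per-request cap stops being something the analyst has to think about.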

The most actionable GSC analyses require combining dimensions that the UI cannot combine in a single view: pages ranked 7–15 with high impression and low CTR; queries driving impressions to multiple pages simultaneously; page-level data matched against Core Web Vitals. Each requires exporting multiple filtered data sets and rejoining them in a spreadsheet, or writing API queries directly. Manual cross-referencing of even two dimension combinations across a large site takes 45–60 minutes to produce one analysis frame.
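The first of those frames, often called a striking-distance report, is a simple filter once the full row set is in hand. A sketch over rows shaped like the API's response (the key names match the `searchanalytics.query` output; the thresholds are illustrative defaults, not recommendations):

```python
def striking_distance(rows, min_position=7, max_position=15,
                      min_impressions=500, max_ctr=0.02):
    """Pages ranking just off page one with real demand (impressions)
    but poor click-through: prime title/snippet rework targets."""
    return [
        r for r in rows
        if min_position <= r["position"] <= max_position
        and r["impressions"] >= min_impressions
        and r["ctr"] < max_ctr
    ]
```

The point is not the five lines of code; it is that they run against the full paginated row set rather than a 1,000-row export.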

Every time an SEO professional pastes data into a chat tool, they spend time on context-setting that has nothing to do with analysis: explaining what the columns mean, specifying the date range, clarifying what filter was applied, noting what data is missing because of the row limit. This overhead compounds with site complexity — and because the AI tool has no connection to the data source, it resets completely between sessions.

50×
API vs. UI row access
The GSC interface caps exports at 1,000 rows. The Search Console API exposes up to 50,000 rows per day per search type with full pagination support. Connecting your AI directly to the API is not a convenience improvement — it changes the analytical surface entirely.
Source: Google Search Console API — Usage Limits (developers.google.com)

03

Before and after: manual briefing vs. live MCP access

Before — manual export cycle
  1. Open GSC Performance report. Apply filters for date range, device, and search type.
  2. Export up to 1,000 rows as CSV. If the site has more pages or queries, decide which segment to prioritize. The rest is invisible.
  3. Open the CSV in a spreadsheet. Sort, filter, or pivot as needed.
  4. Copy the relevant rows. Open a chat session with Claude or ChatGPT.
  5. Write a prompt explaining: what this data is, what date range it covers, what filter was applied, what is missing because of the row limit.
  6. Receive analysis. If the answer requires a different slice, return to step 1.
Result: analysis of a sample, manually assembled, no follow-up without re-exporting.
After — That SEO Agent via MCP
  1. Connect your GSC property once via OAuth in the dashboard. Generate an API key.
  2. Add the MCP server to your AI client — one JSON config block. Works with Claude Desktop, ChatGPT, Cursor, and any MCP-compatible client.
  3. Ask your question directly. 'Which pages rank 8–15 for queries with more than 500 monthly impressions and a CTR under 2%?' The agent queries the API live, applies the filters, paginates if needed.
  4. Follow up without re-exporting. 'Now cross-reference those with their Core Web Vitals scores.' The session context stays intact.
Result: full row set, live data, persistent context. The CSV export step disappears.

04

How to set it up in three steps

01
Connect your GSC property
Sign in to That SEO Agent with Google. Add your site and authorize the Google Search Console OAuth scope. The connection is read-only — it cannot modify your GSC settings, submit URLs, or alter any data. This grants access to the Search Analytics API, URL Inspection API, and sitemap status for the connected property.
02
Generate an API key and configure your client
In dashboard settings, generate an API key (prefixed sea_). Add the That SEO Agent MCP server to your client configuration — a single JSON block pointing to the server endpoint with your Bearer token. The full setup guide is at /dashboard/mcp. Supported clients: Claude Desktop, Cursor, ChatGPT, Windsurf, and any other MCP-compatible tool.
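For orientation, a typical MCP client config block looks like the following. This is a hedged sketch only: the server URL shown is a placeholder and the real endpoint comes from the /dashboard/mcp setup guide; only the `sea_` key prefix is taken from the product's own documentation.

```json
{
  "mcpServers": {
    "that-seo-agent": {
      "url": "https://example.com/mcp",
      "headers": {
        "Authorization": "Bearer sea_YOUR_API_KEY"
      }
    }
  }
}
```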
03
Ask your AI about your site
With the server connected, your AI has access to 44 SEO tools covering Search Console, GA4, PageSpeed, and live crawl data. Ask in natural language. The agent queries live data, not a snapshot. Note: anonymized queries remain withheld — that constraint exists at Google's data layer, not at the API or tool level.
Sources
Official documentation
Search Analytics: query — Search Console API (Google for Developers) — developers.google.com
Getting your performance data — Search Console API (Google for Developers) — developers.google.com
Usage Limits — Search Console API (Google for Developers) — developers.google.com
What are impressions, position, and clicks? — Google Search Console Help — support.google.com
Performance report (Search results) — Google Search Console Help — support.google.com
Search Analytics API: Discover, Google News & Regex — Google Search Central Blog (Oct 2021) — developers.google.com
Performance data deep dive — Google Search Central Blog (Oct 2022) — developers.google.com
Model Context Protocol — Anthropic documentation — docs.anthropic.com
Primary research & industry
Anonymized Queries Make Up Nearly Half of GSC Traffic — Ahrefs / Patrick Stox (2025). Methodology: 22 billion clicks, 887,534 GSC properties. — ahrefs.com
Almost Half of GSC Clicks Go to Anonymous Queries — Ahrefs (original 2022 study) — ahrefs.com
Google removes 'very rare' language from hidden query documentation — Search Engine Land — searchengineland.com
How We Built an SEO AI Agent: One Tab, Zero Copy-Paste, 28% More Clicks — Seer Interactive (single test, unverified at scale) — seerinteractive.com
FAQ

Frequently asked questions


Does connecting That SEO Agent give it write access to my Search Console account?

No. The OAuth scope requested is read-only for Search Console data. That SEO Agent cannot submit sitemaps, remove URLs, or modify any GSC settings. The connection provides access to the Search Analytics API (performance data), URL Inspection API, and sitemap status reads only.

Will I see the ~47% of queries that are anonymized?

No. Anonymized queries are withheld at Google's data layer before they reach the API. No tool — including That SEO Agent — can access queries that Google has classified as private due to low search volume. The traffic driven by those queries is still counted in your aggregate click and impression totals, so your site-level numbers are complete. The individual query strings for low-volume searches are permanently unavailable.

The GSC API only returns 25,000 rows per request. What happens if my site has more queries than that?

The API supports pagination via the startRow parameter. That SEO Agent handles pagination automatically for queries that exceed 25,000 rows, up to the daily limit of 50,000 rows per search type per property. For sites operating at that scale, Google also supports a BigQuery bulk export directly from GSC settings — that pipeline handles full historical data warehousing but requires BigQuery setup on your end.

How is this different from exporting GSC data to a spreadsheet and using AI on it?

Three differences. First, the API exposes up to 50,000 rows per day versus 1,000 in the UI — the full distribution, not a top-N sample. Second, the AI can issue follow-up queries in the same session without a new export — context stays intact. Third, it can combine GSC data with GA4, PageSpeed, and crawl data in one session without manual assembly. The spreadsheet method gives you one frozen snapshot. The MCP method gives you a live data interface.

/ Get Access

Your AI already knows what to ask. Give it your actual data.

Connect your Search Console, GA4, and PageSpeed in one place. Ask anything.

Invite-only alpha. Join the waitlist.

THAT SEO AGENT

44 SEO tools for Claude, ChatGPT & Cursor. Connect GSC, GA4, and PageSpeed. Stop briefing AI about your own site.

© 2026 THATSEOAGENT.COM · ALL RIGHTS RESERVED · BUILT WITH ♥ FOR SEO PROFESSIONALS