/ GSC + AI
Your AI is only as good as the GSC data you paste into it.
Google Search Console caps what you can see: 1,000 rows in the UI, 16 months of history, and nearly half of all queries permanently withheld for privacy reasons. Every time you ask an AI to help with SEO, you start by copying truncated data into a chat window and hoping it's enough. That SEO Agent connects your AI directly to the Search Console API — so it queries your live data, filters by dimension, paginates past the UI limits, and works with what's actually there.
01
What GSC data actually tells you — and what it doesn't
Google Search Console's Performance report exposes four metrics for organic search: clicks, impressions, click-through rate, and average position. These can be filtered and grouped across four dimensions: query, page, country, and device. The Search Console API extends this with RE2 regex filtering, Discover and Google News data, and the ability to paginate up to 50,000 rows per day per search type — well beyond what the UI shows. This is the official record of how Googlebot sees and indexes your site: not estimated, not modeled, drawn directly from Google's index and clickstream data.
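As a sketch of what a single API call can express, here is a minimal query using Google's google-api-python-client. The property URL, dates, and regex are placeholders, and token.json is assumed to hold an already-authorized credential:

```python
from google.oauth2.credentials import Credentials
from googleapiclient.discovery import build

# Assumes token.json holds an already-authorized read-only credential.
creds = Credentials.from_authorized_user_file("token.json")
service = build("searchconsole", "v1", credentials=creds)

# One request: query + page dimensions, an RE2 regex filter, and the API's
# 25,000-row per-request maximum (vs. 1,000 rows in the UI).
body = {
    "startDate": "2025-01-01",
    "endDate": "2025-03-31",
    "dimensions": ["query", "page"],
    "type": "web",  # "discover" and "googleNews" are also valid here
    "dimensionFilterGroups": [{
        "filters": [{
            "dimension": "query",
            "operator": "includingRegex",  # RE2 syntax
            "expression": "^(how|what|why)\\b",
        }]
    }],
    "rowLimit": 25000,
}

response = service.searchanalytics().query(
    siteUrl="sc-domain:example.com",  # placeholder property
    body=body,
).execute()
print(len(response.get("rows", [])))
```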
Average position is the topmost position your page occupied in a given search result, averaged across all queries that generated an impression. The critical constraint, documented by Google, is that a position is only recorded when a user actually sees your result. If your page ranks 35th and a user never scrolls that far, that query contributes zero to your average. This means average position skews toward queries where you rank high enough to be seen: a drop in average position can mean you started ranking for more competitive terms, not that you lost ground on existing ones. A concrete case: a page averaging position 3 across 1,000 impressions that begins surfacing at position 9 for a new query with 500 impressions sees its average slide to 5 ((3 × 1,000 + 9 × 500) / 1,500), even though its total visibility grew.
Google withholds queries unless they were searched by "more than a few dozen users" over any two-to-three-month period. These anonymized queries still count in your total click and impression figures, so your aggregate numbers look complete, but the specific phrases are never disclosed. An Ahrefs analysis of 22 billion clicks across 887,534 GSC properties in 2025 found that 46.77% of all traffic came from anonymized queries. In 2021, when researchers first surfaced the scale of this, Google updated its own help documentation to remove the phrase "very rare" from its description of anonymized queries, an acknowledgment that the description was no longer accurate.
GSC retains performance data for 16 months. Data older than that is deleted from Google's databases and cannot be recovered by any method. The API explicitly documents that even at full scale it "does not guarantee to return all data rows but rather top ones" due to internal system constraints. Data freshness adds another variable: new data typically becomes available two to three days after it is generated.
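A small sketch of what those two boundaries mean in practice: clamping any requested date range to the window the API can actually serve. The helper name is ours, and python-dateutil is assumed for calendar-accurate month math:

```python
from datetime import date, timedelta
from dateutil.relativedelta import relativedelta  # pip install python-dateutil

def usable_window(start: date, end: date) -> tuple[date, date]:
    """Clamp a requested range to what GSC can return: nothing older than
    16 months, nothing newer than the ~3-day reporting lag."""
    today = date.today()
    earliest = today - relativedelta(months=16)  # older data is deleted
    freshest = today - timedelta(days=3)         # newest data not final yet
    return max(start, earliest), min(end, freshest)

print(usable_window(date(2022, 1, 1), date.today()))
```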
Stop analyzing GSC samples. Connect your AI to the full dataset.
02
Why manual GSC analysis breaks at scale
The standard workflow for using AI to analyze GSC data is: open GSC, apply filters, export up to 1,000 rows as CSV, open a spreadsheet or upload to a chat window, write a prompt that explains what the data represents, wait for analysis, then repeat with a different filter set if the first export was insufficient. Every session starts from scratch. The AI has no memory of the last export, no awareness of what filters were applied, and no ability to follow up by querying additional dimensions.
At 1,000 rows, a medium-sized site with 2,000 indexed pages cannot see all its page-level data in a single UI export. A site ranking for 50,000 queries cannot see the full query distribution. The workaround — segmenting by subfolder and running separate exports — multiplies the manual work linearly with site size. The API allows up to 25,000 rows per individual request and 50,000 rows per day per search type, with pagination for larger sets. This is not a marginal improvement: it is the difference between sampling and full analysis.
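A minimal sketch of that pagination, assuming a service client and request body like the ones above. startRow is the API's real paging parameter; the helper name is ours:

```python
def fetch_all_rows(service, site_url: str, body: dict) -> list[dict]:
    """Paginate past the 25,000-row per-request cap using startRow.

    Stops when a page comes back short or empty, meaning the API has
    no more rows to return for this query.
    """
    rows, start = [], 0
    while True:
        page = service.searchanalytics().query(
            siteUrl=site_url,
            body={**body, "rowLimit": 25000, "startRow": start},
        ).execute()
        batch = page.get("rows", [])
        rows.extend(batch)
        if len(batch) < 25000:  # short page: nothing left to fetch
            break
        start += 25000
    return rows
```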
The most actionable GSC analyses require combining dimensions that the UI cannot combine in a single view: pages ranked 7–15 with high impressions and low CTR; queries driving impressions to multiple pages simultaneously; page-level data matched against Core Web Vitals. Each requires exporting multiple filtered data sets and rejoining them in a spreadsheet, or writing API queries directly. Manually cross-referencing even two dimension combinations across a large site takes 45–60 minutes to produce a single analysis frame.
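As an illustration, the first of those frames reduces to a single filter pass over rows fetched with dimensions ["query", "page"], as in the earlier sketches. The thresholds here are illustrative, not canonical:

```python
def striking_distance(rows, min_impressions=500, max_ctr=0.02):
    """Filter Search Analytics rows for the 'striking distance' frame:
    ranked 7-15, plenty of impressions, weak CTR. Each row's `keys`
    holds the dimension values in request order."""
    return [
        {
            "query": row["keys"][0],   # first requested dimension
            "page": row["keys"][1],    # second requested dimension
            "impressions": row["impressions"],
            "ctr": row["ctr"],
            "position": row["position"],
        }
        for row in rows
        if 7 <= row["position"] <= 15
        and row["impressions"] >= min_impressions
        and row["ctr"] < max_ctr
    ]
```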
Every time an SEO professional pastes data into a chat tool, they spend time on context-setting that has nothing to do with analysis: explaining what the columns mean, specifying the date range, clarifying what filter was applied, noting what data is missing because of the row limit. This overhead compounds with site complexity — and because the AI tool has no connection to the data source, it resets completely between sessions.
03
Before and after: manual briefing vs. live MCP access
Before: manual briefing
- 01 Open the GSC Performance report. Apply filters for date range, device, and search type.
- 02 Export up to 1,000 rows as CSV. If the site has more pages or queries, decide which segment to prioritize. The rest is invisible.
- 03 Open the CSV in a spreadsheet. Sort, filter, or pivot as needed.
- 04 Copy the relevant rows. Open a chat session with Claude or ChatGPT.
- 05 Write a prompt explaining: what this data is, what date range it covers, what filter was applied, and what is missing because of the row limit.
- 06 Receive analysis. If the answer requires a different slice, return to step 1.
After: live MCP access
- 01 Connect your GSC property once via OAuth in the dashboard. Generate an API key.
- 02 Add the MCP server to your AI client with one JSON config block (a sample block follows these steps). Works with Claude Desktop, ChatGPT, Cursor, and any MCP-compatible client.
- 03 Ask your question directly: "Which pages rank 8–15 for queries with more than 500 monthly impressions and a CTR under 2%?" The agent queries the API live, applies the filters, and paginates if needed.
- 04 Follow up without re-exporting: "Now cross-reference those with their Core Web Vitals scores." The session context stays intact.
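For step 02, the block looks something like this in Claude Desktop's claude_desktop_config.json. The server name, command, and environment variable below are illustrative placeholders; the exact block ships from the dashboard along with your key:

```json
{
  "mcpServers": {
    "that-seo-agent": {
      "command": "npx",
      "args": ["-y", "that-seo-agent-mcp"],
      "env": {
        "SEO_AGENT_API_KEY": "<your-api-key>"
      }
    }
  }
}
```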
04
How to set it up in three steps
- 01 Connect your GSC property via OAuth in the dashboard and generate an API key.
- 02 Paste the JSON config block into your MCP-compatible client: Claude Desktop, ChatGPT, or Cursor.
- 03 Ask your first question. From then on, the agent queries your live Search Console data directly.
Frequently asked questions
Does connecting That SEO Agent give it write access to my Search Console account?
No. The OAuth scope requested is read-only for Search Console data. That SEO Agent cannot submit sitemaps, remove URLs, or modify any GSC settings. The connection provides access to the Search Analytics API (performance data), URL Inspection API, and sitemap status reads only.
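That boundary is visible in the scope itself: webmasters.readonly is Google's read-only Search Console scope. For comparison, this is roughly what requesting the same scope looks like if you authorize a client of your own with Google's Python OAuth library (client_secret.json is assumed to be downloaded from Google Cloud):

```python
from google_auth_oauthlib.flow import InstalledAppFlow

# webmasters.readonly grants read access only: no sitemap submission,
# no URL removal, no settings changes.
SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]

flow = InstalledAppFlow.from_client_secrets_file("client_secret.json", SCOPES)
creds = flow.run_local_server(port=0)
print(creds.to_json())  # persist for reuse, e.g. as token.json
```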
Will I see the ~47% of queries that are anonymized?
No. Anonymized queries are withheld at Google's data layer before they reach the API. No tool — including That SEO Agent — can access queries that Google has classified as private due to low search volume. The traffic driven by those queries is still counted in your aggregate click and impression totals, so your site-level numbers are complete. The individual query strings for low-volume searches are permanently unavailable.
The GSC API only returns 25,000 rows per request. What happens if my site has more queries than that?
The API supports pagination via the startRow parameter. That SEO Agent handles pagination automatically for queries that exceed 25,000 rows, up to the daily limit of 50,000 rows per search type per property. For sites operating at that scale, Google also supports a BigQuery bulk export directly from GSC settings — that pipeline handles full historical data warehousing but requires BigQuery setup on your end.
How is this different from exporting GSC data to a spreadsheet and using AI on it?
Three differences. First, the API exposes up to 50,000 rows per day versus 1,000 in the UI — the full distribution, not a top-N sample. Second, the AI can issue follow-up queries in the same session without a new export — context stays intact. Third, it can combine GSC data with GA4, PageSpeed, and crawl data in one session without manual assembly. The spreadsheet method gives you one frozen snapshot. The MCP method gives you a live data interface.
/ Get Access
Your AI already knows what to ask. Give it your actual data.
Connect your Search Console, GA4, and PageSpeed in one place. Ask anything.
Invite-only alpha. Join the waitlist.