Why Google Search Console Impressions Dropped: September 2025

by Garrett Nafzinger

Around September 10, your Google Search Console impressions probably dropped. For some sites, the decline was dramatic: 30% to 50% or more, particularly on desktop. Your average position may have improved sharply, from something like position 18 to position 11. If you’re wondering what happened and whether you should worry, here’s what to know.

Google Disables Data Collection Shortcut

Around September 10, 2025, Google removed support for a URL parameter called num=100. This parameter let anyone append &num=100 to a Google search URL and retrieve 100 results instead of the standard 10. Rank tracking tools like Semrush used it to gather search ranking data: one request pulled 100 results. Now those same tools must make 10 separate requests per keyword, roughly a tenfold increase in request volume and infrastructure cost.
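To make that cost concrete, here’s an illustrative sketch of the change from a scraper’s point of view. The num and start parameters are real Google search URL parameters; the keyword and the loop are hypothetical, not any particular tool’s code.

```python
from urllib.parse import urlencode

query = "best running shoes"  # hypothetical keyword, for illustration only

# Before September 2025: a single URL could return up to 100 results.
old_url = "https://www.google.com/search?" + urlencode({"q": query, "num": 100})

# After: num is no longer honored, so covering the top 100 positions takes
# ten paginated requests using the start offset (0, 10, ..., 90).
new_urls = [
    "https://www.google.com/search?" + urlencode({"q": query, "start": offset})
    for offset in range(0, 100, 10)
]

print(old_url)        # previously: 1 request per keyword
print(len(new_urls))  # now: 10 requests per keyword
```

Multiply that tenfold jump across the millions of keywords rank trackers check every day, and the infrastructure math changes quickly.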

The timing aligns almost perfectly with the impression drops site owners started seeing in Search Console. Many had assumed Google filtered this bot traffic (non-human views) out of Search Console reports; the correlation between the num=100 removal and the widespread impression declines suggests it didn’t.

GSC Impressions Were Inflated

An analysis by SEO consultant Tyler Gargula, who examined 319 websites, found that 87.7% of sites experienced impression drops after the change. Desktop impressions took the biggest hit, while mobile impressions were affected to a lesser extent. This makes sense because most rank tracking tools default to desktop tracking.

A popular health information website we work with saw monthly impressions drop from 38,000 to 24,000, yet its average position improved from 31 to 16 and its clicks stayed consistent.

We’ve always told clients that impressions represent opportunity: visibility in search results that could translate into clicks. That idea still holds, but the numbers were inflated by rank trackers and other automated software.

Why Google Made This Change

Google hasn’t issued a detailed explanation, but industry experts have several theories backed by circumstantial evidence.

Protecting competitive intelligence: Search results represent billions of dollars in R&D investment. When competitors, other search engines, and AI companies can efficiently scrape that data, Google loses control over valuable intelligence about which sites, content, and signals it values.

Fighting AI training data collection: Multiple sources in the SEO community noted that ChatGPT reportedly used services like SerpApi, which relied on bulk SERP data collection. AI companies need massive search data to train models and provide real-time search capabilities. The num=100 parameter was an open tap for that data collection.

A few days after disabling num=100, Google posted a job opening for a “Senior Engineering Analyst, Search, Anti-scraper” role. The job description explicitly mentioned analyzing search traffic patterns to identify scrapers, assessing their impact, and developing machine learning models to detect abusive behavior. The posting was quickly closed after gaining attention in the SEO community, but it signals Google’s intent to combat scraping at scale.

Improved Average Position Comes Down to Math

Your average position in Search Console might look significantly better now. This improvement is essentially a mathematical artifact, not a ranking boost.

Average position is an impression-weighted mean: the sum of the position recorded for every impression, divided by the total number of impressions. When you remove thousands of inflated impressions from positions 50-100, where bots were checking rankings, the remaining legitimate impressions from positions 1-20 pull the average toward the top of the results.

If your site appeared at position 12 for 1,000 real user impressions and position 75 for 8,000 bot impressions, your reported average was around 68. Remove the bot impressions, and suddenly you’re averaging position 12, where real searchers saw you.
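Here’s that arithmetic as a short, self-contained sketch. The two segments use the hypothetical figures from the example above, not data from a real property.

```python
# Impression-weighted average position, the way Search Console computes it.
def avg_position(segments):
    weighted_sum = sum(pos * imps for pos, imps in segments)
    impressions = sum(imps for _, imps in segments)
    return weighted_sum / impressions

real_users = (12, 1_000)   # position 12, shown to 1,000 real searchers
bot_checks = (75, 8_000)   # position 75, "shown" to 8,000 bot requests

print(avg_position([real_users, bot_checks]))  # 68.0 - inflated average
print(avg_position([real_users]))              # 12.0 - what real users saw
```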

Matthew Mellinger’s analysis of over 100 websites confirmed this pattern: average position improvements were most dramatic for positions 21 and beyond, exactly the range where deep-position bot impressions were removed.

What Business Owners Should Watch

Impressions were always a directional but incomplete metric. They indicated visibility but not whether that visibility drove business value. Now that the numbers are more accurate, focus on metrics that connect to revenue.

Clicks from organic search: Track month-over-month and year-over-year changes in Google Search Console. Clicks represent real people who chose to visit your site. Stable or growing click volume despite lower impressions means your real visibility hasn’t changed.

Conversion rates: Use Google Analytics to see how many organic visitors complete goal actions such as form submissions, phone calls, purchases, and bookings.

Revenue per channel: If you can attribute sales to organic search, track that revenue monthly and year-over-year. This is your bottom-line metric.

Landing page performance: Identify which pages drive the most valuable actions from organic traffic. Understanding what converts helps you prioritize content improvements and internal linking strategies.

Click-through rate for positions 1-10: For queries where you rank on page one, what percentage of searchers click through to your site? A low CTR despite a good position indicates that your title and meta description need work (see the sketch after this list).
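If you want to pull clicks, CTR, and position programmatically, here’s a minimal sketch using the Search Console API’s searchanalytics.query method. The service-account setup, the property URL, and the 2% CTR threshold are assumptions to adapt to your own site.

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

# Assumes a service account that has been added as a user on your Search
# Console property; the key path and site URL below are placeholders.
creds = service_account.Credentials.from_service_account_file(
    "service-account.json",
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
service = build("searchconsole", "v1", credentials=creds)

def top_queries(site, start, end, limit=250):
    """Clicks, impressions, CTR, and position by query for a date range."""
    body = {
        "startDate": start,
        "endDate": end,
        "dimensions": ["query"],
        "rowLimit": limit,
    }
    resp = service.searchanalytics().query(siteUrl=site, body=body).execute()
    return resp.get("rows", [])

rows = top_queries("https://www.example.com/", "2025-09-01", "2025-09-30")

# Surface page-one queries with weak CTR: a low CTR at a good position
# usually points to the title and meta description, not the ranking.
for row in rows:
    if row["position"] <= 10 and row["ctr"] < 0.02:  # 2% threshold: adjust
        print(f"{row['keys'][0]}: pos {row['position']:.1f}, ctr {row['ctr']:.1%}")
```

Swap the "query" dimension for "date" and the same call gives you the month-over-month and year-over-year click trends described above.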

The num=100 change exposed how inflated impression data had become. The numbers you’re seeing now reflect what was happening for real searchers. If your impressions dropped but clicks held steady, nothing meaningful changed. If both dropped proportionally, that warrants investigation, but even then, the question isn’t “How do we get impressions back?” It’s “How do we get more qualified traffic that converts?”

Your rank tracker showing different numbers or Search Console impressions shifting doesn’t determine whether customers find you. Keep your attention on clicks, conversions, and revenue. Those metrics pay your bills.

What Is Still Happening

Some scraping likely continues. Third-party API providers like SerpApi have already developed workarounds, including a new “Google Fast Light API” that claims to retrieve 100 results in a single request through alternative methods. How long these workarounds remain functional is unclear, especially given Google’s stated focus on anti-scraping measures.

The broader pattern suggests Google is tightening control over search result access. It discontinued continuous scroll in June 2024 and removed the num=100 parameter in September 2025, and it’s hiring specialists to identify and block scraping patterns. These aren’t isolated technical decisions; they represent a strategic shift toward protecting core product data.

This creates a significant obstacle for AI companies training models or providing real-time search alternatives. Access to comprehensive search result data at scale has become much more expensive, and easier for Google to detect and block.

Need help interpreting your Search Console data?

Contact Garrett Digital for an SEO strategy that prioritizes results over vanity metrics.