Around mid-September 2025, without any official prior announcement, Google disabled a long-standing URL parameter, &num=100, which allowed users and third-party tools to request 100 search results on a single page. This seemingly minor technical tweak has sparked intense discussion in the Search Engine Optimisation (SEO) community, disrupted multiple tools, skewed analytics data, and could force a fundamental re-evaluation of how success is measured.

The Immediate Fallout: Broken Tools and Soaring Costs

For years, the &num=100 parameter was the foundation of rank-tracking software, providing an efficient way to crawl Google's Search Engine Results Pages (SERPs) and report on keyword rankings 100 results at a time. The removal of this shortcut fundamentally alters the mechanics of data gathering.

What has changed is the raw maths of data collection, as broken down by Veronika Höller:

  • Before: 1 request = 100 results.
  • Now: 10 requests = 100 results.

This represents a "10x increase in crawl load", along with the server power and infrastructure SEO tool vendors need to gather the same dataset. The immediate consequences have been widespread, with multiple reports of broken dashboards, missing ranking data, error states, and incomplete SERP reports. As the company Keyword Insights stated, "Google has killed the n=100 SERP parameter. Instead of 1 request for 100 SERP results, it now takes 10 requests (10x the cost)".
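The change in request mechanics can be illustrated with a short sketch. This is an illustration only, not a scraping recipe: it assumes a tracker paginates with Google's public `start` query parameter at the default page size of roughly 10 results, and the URL shapes are simplified.

```python
# Illustrative sketch of the request-count maths behind the "10x" figure.
# The q, num and start parameters mirror Google's public query string;
# their exact behaviour is Google's to change at any time.

def urls_before(query: str) -> list[str]:
    # Previously, one request could return up to 100 results via &num=100.
    return [f"https://www.google.com/search?q={query}&num=100"]

def urls_after(query: str, depth: int = 100, page_size: int = 10) -> list[str]:
    # With &num=100 gone, each page returns ~10 results, so a tracker must
    # paginate with &start= to cover the same depth.
    return [
        f"https://www.google.com/search?q={query}&start={offset}"
        for offset in range(0, depth, page_size)
    ]

before = urls_before("example keyword")
after = urls_after("example keyword")
print(len(before), len(after))  # 1 request vs 10 requests for the same 100 results
```

The same depth now costs ten round trips instead of one, which is exactly the cost multiplier tool vendors are describing.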

This increased operational burden is expected to have financial repercussions. Tool providers face a potential surge in infrastructure costs, which will likely be passed on to customers through subscription price hikes in the coming quarters. Some providers, like AccuRanker, have already adapted by changing their default tracking depth to the top 20 results. AccuRanker explained its reasoning: "Retrieving more results from Google requires pulling additional pages, and each extra page significantly increases costs".

 

The Ripple Effect: Unravelling the "Great Decoupling" in Google Search Console

Beyond the disruption to third-party tools, the change has had an unexpected impact on first-party data within Google Search Console (GSC). Around September 10th, when the parameter change was first noted, many website owners observed a significant and sharp decline in desktop impressions, accompanied by a corresponding increase in average position. SEO consultant Brodie Clark noted, "... I’m seeing a noticeable decline in desktop impressions, resulting in a sharp increase in average position".

The leading theory within the community is that this drop reveals just how many impressions were being generated by bots from SEO and AI analytics tools. 

Here’s how it worked:
  • For an impression to be recorded in GSC, a link to a site simply needs to appear on the user's current page; it does not need to be scrolled into view to register.
  • When a rank-tracking bot loaded a page with 100 results using the parameter, a website ranking at position #99 would register an impression.
  • With the &num=100 parameter gone, these bots can no longer generate impressions for positions 11-100 on a single page load, causing the total impression count to drop dramatically.
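The mechanics above can be sketched in a few lines. This is a deliberately simplified model of the community's theory, assuming an impression registers whenever a site's position falls within the page of results a bot loads; real GSC recording logic is more nuanced.

```python
# Simplified model of the bot-impression theory: a link registers a GSC
# impression if it appears anywhere on the loaded results page, whether or
# not it is scrolled into view. Only the page size matters here.

def registers_impression(position: int, page_size: int) -> bool:
    # The link just needs to be present on the page the bot requested.
    return position <= page_size

# A site ranking at position #99:
print(registers_impression(99, page_size=100))  # True on a &num=100 page load
print(registers_impression(99, page_size=10))   # False on a default 10-result page
```

Under this model, every single-page bot load used to generate impressions for positions 11-100 as well, which is why the impression counts fell so sharply once those loads stopped.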

This discovery has cast doubt on a recent phenomenon known as "The Great Decoupling", the widely discussed trend of GSC impressions rising over the last year without a matching increase in clicks. While many, including Google, pushed back on the idea that this pattern was caused by the introduction of AI Overviews, this new evidence suggests that the data may have been significantly "polluted" by bot impressions.

 

Why Did Google Make This Change?

While Google has remained silent, SEO experts have several theories about the company's motivation. The prevailing belief is that this is a deliberate move to:

  • Limit Scraping: Make automated data extraction from SERPs more difficult and costly for third parties.
  • Protect Infrastructure: Reduce the strain on Google's servers from the heavy-hitting requests made by large-scale scraping operations.
  • Tighten Data Control: Push SEO professionals toward Google’s own data channels, such as GSC and official APIs, rather than relying on external tools. As Veronika Höller put it, the message from Google is clear: "This is our data. Access it on our terms".
  • Combat AI Competition: Some speculate the move is a defensive measure against competing Large Language Models (LLMs) like ChatGPT, which scrape Google's index to ground their results. As analyst Ryan Jones suggests, "People are scraping so much, so aggressively for AI that Google is fighting back, and breaking all the SEO rank checkers and SERP scrapers in the process". Ray Grieselhuber argues, "the index is the prize," and Google is likely trying to protect its most valuable data asset.

 

What Happens Next?

For those of us working in SEO, more data means more to analyse, strategise around, and act on for our clients. While this change disrupts keyword tracking as we know it in the short term, it is likely to be resolvable in the longer term.

As an agency, we’ll be monitoring how this evolves, and will adapt our approach where needed, ensuring our clients continue to receive insightful and meaningful reporting. Working alongside our keyword tracking and reporting partners, we will continue to proactively share updates and set expectations on what future reporting may look like.

This article was based on several resources, namely Were we wrong about “The Great Decoupling” after all? Analyzing the impact of &num=100 by Brodie Clark, Google Modifies Search Results Parameter, Affecting SEO Tools by Matt G. Southern, Google Removes &num=100: A Small Change With Massive Consequences by Veronika Höller, Google Search Console Reporting Change Since 100 Results Broke by Barry Schwartz, Google tests forced pagination on SERPs by Ray Grieselhuber & Important Update: SERP Tracking Depth by The AccuRanker Team.

If you need help with your keyword tracking after this change, reach out to our SEO team today to see how we can support you.
 


MEET THE AUTHOR

YORDAN DIMITROV

Yordan drives data-led SEO strategies at Reflect Digital, leveraging his previous experience across sectors such as leisure, e-commerce, and automotive to enhance visibility and ROI for clients. He is passionate about delivering measurable growth and combines technical expertise with creative insight to identify new ranking and traffic opportunities. Yordan’s aim is to ensure seamless project delivery by managing timelines, coordinating with teams across departments, and staying at the forefront of SEO trends to achieve exceptional results for clients.

