Google Drops the num=100 Parameter: What Does It Mean for the SEO Community?
Over the past few days, you’ve probably noticed the SEO community buzzing across all social platforms about Google’s sudden decision to drop support for the num=100 parameter.
And if, for some reason, you haven’t been paying attention to social media—especially LinkedIn and X—you’ve likely seen a drop in impressions for the sites you monitor in Search Console.
Well, that’s precisely due to the discontinued support. Yes, that little trick we used to get 100 results per page instead of the usual 10 is now gone.
It may seem minor, but if we think it through, this change has significant implications for those of us working in web positioning, competitor analysis, and site audits.
So let’s break down what this parameter was, how it was used, and—most importantly—how this change might affect us.
What is num=100?
The num=100 parameter was a simple way to tell Google: “Show me more results on a single page, please (or not so politely).” Technically, this parameter was added to the end of a Google search URL (e.g., https://www.google.com/search?q=sports+shoes&num=100), and the search engine would respond by displaying up to 100 organic results instead of the usual 10. A gem for those of us who wanted to save time or gather lots of data in one go.
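For readers who never used it manually, the URL in the example above can be built programmatically. This is a minimal sketch using only Python's standard library; the query string is illustrative, and note that Google now ignores the `num` value regardless of how the URL is constructed:

```python
from urllib.parse import urlencode

# Build a Google search URL with the (now ignored) num parameter.
base = "https://www.google.com/search"
params = {"q": "sports shoes", "num": 100}
url = f"{base}?{urlencode(params)}"

print(url)  # https://www.google.com/search?q=sports+shoes&num=100
```

Before the change, opening that URL in a browser rendered up to 100 organic results on a single page; today the same URL returns the standard 10.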
It was especially useful for SEOs, data analysts, and tool developers who needed access to a larger sample of SERP results without making multiple requests to Google.
This parameter worked both manually in the browser and through automated tools (though automated collection has always sat in tension with Google's terms of service).
Unfortunately, Google has decided to discontinue this functionality, and it appears all results are now capped at 10 per page, regardless of the num parameter value.
What was num=100 used for?
I used num=100 regularly in my competitive analysis routines. It was extremely helpful for getting a full view of the SEO landscape for a keyword without having to paginate through results.
For example, if I wanted to see all the domains ranking for a specific keyword, num=100 let me do it in one shot.
It was also ideal for spotting patterns, analyzing content diversity in the SERPs, or identifying opportunities in lower positions that would otherwise go unnoticed.
Beyond that, the parameter helped me track keywords associated with specific URLs—especially newly published ones—giving me a more accurate view of their organic behavior without relying on sometimes unreliable data from tools like SEMrush.
Additionally, many SEO tools used it systematically to optimize their scraping processes. Not having to load 10 pages to see 100 results meant significant savings in time and resources.
It also improved the experience for those collecting data manually, since exporting a single page yielded a long, useful list. Without num=100, we’re back to making multiple requests, with a higher risk of blocks or captchas (and wasted time).
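The cost of losing the parameter comes down to simple arithmetic. A back-of-the-envelope sketch of how many page fetches are needed to cover a given depth of results, before and after the change:

```python
import math

def requests_needed(total_results: int, per_page: int) -> int:
    """Number of result-page fetches required to cover total_results."""
    return math.ceil(total_results / per_page)

# With num=100, one request covered the top 100 results.
# At Google's default of 10 per page, the same coverage costs 10 requests.
print(requests_needed(100, 100))  # 1
print(requests_needed(100, 10))   # 10
```

Each extra request is another opportunity to hit a captcha or a block, which is exactly the friction described above.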
How does this affect websites?
From a webmaster or SEO consultant’s perspective, this change might seem small, but it has real implications. For starters, it makes it harder to monitor page visibility beyond the top 10 positions.
Sometimes a URL isn’t on the first page but still gets traffic from positions 11 to 30—or even lower. With num=100, you could easily confirm the exact ranking. Now it’ll take more time, more effort, and probably paid tools that may not be as accurate.
Another key issue is the loss of context. With a full list of 100 results, you could better understand search intent, competition level, and overall SERP structure. With only 10 results per page, the view is fragmented, which can lead to biased interpretations or decisions based on incomplete samples. For those of us doing deep SEO audits, this change adds another layer of friction to our daily work.
How does this affect tracking tools like SEMrush, Ahrefs, and Sistrix?
This is where the change really stings. Tools like SEMrush, Ahrefs, Sistrix, and others rely on scraping techniques (within legal bounds) to gather SERP data. Using num=100 allowed them to make fewer requests to Google’s servers and get more data per search.
Now, without that parameter, they need to make 10 times more requests to get the same amount of results. That means higher operational costs, increased risk of being blocked by Google, and likely less freshness or depth in the data.
If you use these tools frequently, you might start noticing limitations: fewer results in databases, less frequent updates, or even restrictions in analysis depth.
Some providers may choose to pass that extra cost on to users, which could mean higher prices or more restrictive plans. While platforms can still adapt, the process won’t be as efficient as before.
Conclusion
This change from Google, although seemingly technical and minor, has significant implications for the SEO community. Personally, I see it as another move toward closing off the ecosystem and making free, open access to SERP data more difficult.
We can no longer simply use num=100 to get a complete view of the results, which forces us to rethink our methods for analysis, scraping, and monitoring organic visibility. Google continues to shift toward a more controlled environment—which makes sense from their perspective—but it’s a hurdle for those of us who work with data.
Is it the end of the world? No. Is it annoying? Absolutely. The good news is that the SEO community has always been adaptable. The key is to stay informed about these changes, understand how they affect us, and adjust our strategies accordingly. If your day-to-day work depends on broad and reliable access to organic results, this change is a wake-up call to start exploring new solutions.
Frequently Asked Questions
1. What is the num=100 parameter in Google?
The num=100 parameter was a code added to the end of Google search URLs to display 100 results per page instead of the traditional 10. This made it easier to collect extensive SERP data without having to navigate through multiple pages. It was especially useful for SEOs, developers, and analytics tools that needed a broad view of results in a single step.
2. Why did Google remove support for num=100?
Google decided to remove support for num=100 as part of its efforts to control access to SERP data. By limiting the number of results per page to 10, the company aims to reduce server load and prevent mass scraping of search results. This also reflects a broader trend at Google toward tighter control over its search data ecosystem.
3. How does this change affect me if I work in SEO?
This change primarily impacts data collection. If you’re an SEO professional, you’ll no longer be able to easily access 100 results per page. This makes it harder to gather relevant information about page rankings beyond the first page of results. You’ll need to make more requests to Google or rely on specialized tools, which may have additional limitations in data collection.
4. How does this affect tracking tools like SEMrush, Ahrefs, and Sistrix?
Tracking tools like SEMrush, Ahrefs, and Sistrix, which used the num=100 parameter to retrieve more results at once, now face an added challenge. Without this parameter, they must make more requests to access the same amount of data, which can increase operational costs, the risk of being blocked, and the likelihood of less frequent data updates.
5. What alternatives exist to continue getting complete Google results?
Although this change makes data collection more difficult, alternatives still exist. Google’s official APIs can provide SERP data, though they often have limits on the number of queries or results. Additionally, many SEO tools have begun adapting their scraping processes to continue operating efficiently. Another option is to use scraping techniques cautiously, always respecting Google’s terms of service.
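As a concrete example of the official-API route, Google's Custom Search JSON API returns at most 10 results per request and paginates with a 1-based `start` parameter. The sketch below only builds the paginated request URLs (no HTTP calls are made); the API key and search engine ID are placeholders you would replace with your own credentials, and daily query quotas apply:

```python
from urllib.parse import urlencode

# Placeholder credentials: substitute your own API key and
# Programmable Search Engine ID before making real requests.
API_KEY = "YOUR_API_KEY"
CX = "YOUR_SEARCH_ENGINE_ID"

def page_urls(query: str, pages: int = 3) -> list[str]:
    """Build Custom Search JSON API URLs. The API caps results at 10
    per request, so deeper coverage means paging via `start` (1, 11, 21, ...)."""
    urls = []
    for page in range(pages):
        params = {
            "key": API_KEY,
            "cx": CX,
            "q": query,
            "num": 10,               # API maximum per request
            "start": page * 10 + 1,  # 1-based offset of the first result
        }
        urls.append("https://www.googleapis.com/customsearch/v1?" + urlencode(params))
    return urls

for u in page_urls("sports shoes"):
    print(u)
```

Even through the official API, covering the old 100-result window takes ten paginated calls, so the efficiency hit is structural, not tool-specific.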