September 2025 was yet another fascinating month in SEO that challenged some assumptions about AI search whilst reinforcing others. The August spam update finally completed after a lengthy 27-day rollout, Google made a technical change that broke numerous SEO tools, and research showed that traditional search remains far more dominant than the hype might suggest.
Let’s have a look through everything that happened and what it means for your business.
August Spam Update Completes After 27 Days
The August 2025 spam update, which began on 26th August, officially completed on 22nd September. That’s a total rollout time of 27 days – longer than the “few weeks” Google initially suggested, and one of the longer spam update rollouts we’ve seen.
Now that it’s complete, we can assess the full impact. The update indeed hit hard, particularly targeting sites that had been creating content at scale using AI without adding sufficient human value. Some sites lost significant rankings and traffic, whilst others that were affected by last year’s December 2024 spam update actually saw recovery.
The recovery stories were particularly interesting. They illustrated that sites can recover from spam penalties if they identify and address the issues. By removing low-quality content and focusing on genuine value, you can go to rehab in Google’s eyes.
The update reinforced Google’s message about AI content: generating and using AI content is OK, but lazy implementation absolutely is not. Websites using AI as a tool to enhance human-created content generally fared fine, whilst those using it to churn out hundreds of thin pages suffered. Yay!
What the spam update means for you: If you were negatively impacted, focus on recovery through quality improvement, not quick fixes. If you escaped unscathed, don’t become complacent – continue focusing on creating genuinely helpful content. The line between acceptable AI use and spam is becoming clearer: human expertise and value-add are non-negotiable.
Google Disables num=100 Parameter, Breaking SEO Tools
Around the 10th September, Google made a technical change that caused plenty of havoc in the SEO community: Google disabled the &num=100 results parameter, which had been used to display up to 100 search results on a single page instead of the standard 10.
Now this might sound like a minor technical detail, but it had massive implications. Many rank tracking tools relied on this parameter to gather data about where sites ranked beyond the first 10 positions. When Google removed it, these tools suddenly couldn’t function properly.
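To see why this mattered so much to the tools, here’s a minimal Python sketch of the request pattern (the query is a hypothetical example): one num=100 URL used to return positions 1–100 in a single page, whereas tools now have to paginate ten results at a time via the start parameter – roughly ten requests where one used to do.

```python
from urllib.parse import urlencode

base = "https://www.google.com/search"

# Before the change: one request could fetch positions 1-100.
deep_serp = f"{base}?{urlencode({'q': 'plumber farnborough', 'num': 100})}"

# After the change: num is ignored, so tools paginate ten results
# at a time with start=, needing ~10x the requests for the same depth.
paged = [f"{base}?{urlencode({'q': 'plumber farnborough', 'start': s})}"
         for s in range(0, 100, 10)]

print(deep_serp)
print(len(paged), "requests now needed for the same depth")
```

That tenfold jump in request volume is why so many rank trackers struggled or throttled their data collection in the days after the change.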
Tools like SEMrush, Ahrefs, and others experienced disruptions. Some rank tracking reports showed glitches, gaps in data, or misleading information. The SEMrush Sensor, for example, showed data problems from 10th September through 17th September.
Making matters more confusing, many SEO professionals also saw drops in Google Search Console impressions starting around 10th September, with reported average position improving at the same time. This wasn’t because rankings actually changed – it was because removing the num=100 parameter stripped out bot activity from SEO tools, which had been registering impressions deep in the results.
Google stated that the &num=100 URL parameter was “not something that we formally support,” suggesting they never intended it to be used by tools in this way.
What this means for you: If you saw unusual Search Console data around 10th-20th September, particularly drops in impressions or changes in average position, this technical change was likely the cause, not actual ranking problems.
For ongoing tracking, continue to use Search Console as your primary source of truth. Third-party tools are valuable, but they’re dependent on Google’s technical infrastructure. When Google makes changes like this, temporary data disruptions can occur. Most major tools have now adapted to work without the num=100 parameter.
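If you want to sanity-check your own Search Console export, the telltale sign is a desktop-only step change: the bot traffic removed by this change was overwhelmingly desktop, so an artefact drop shows up in desktop impressions while mobile (and clicks) stay flat. A minimal pandas sketch, with made-up numbers purely for illustration:

```python
import pandas as pd

# Illustrative (made-up) daily impressions, as exported from Search Console.
df = pd.DataFrame({
    "date": pd.date_range("2025-09-01", periods=20),
    "desktop": [900] * 9 + [600] * 11,  # step change around 10th September
    "mobile":  [800] * 20,              # mobile unaffected
})

cutoff = pd.Timestamp("2025-09-10")
before = df[df["date"] < cutoff]
after = df[df["date"] >= cutoff]

desktop_drop = 1 - after["desktop"].mean() / before["desktop"].mean()
mobile_drop = 1 - after["mobile"].mean() / before["mobile"].mean()

# A sharp desktop-only fall with stable mobile figures points to the
# bot-filtering artefact, not a genuine ranking loss.
print(f"desktop drop: {desktop_drop:.0%}, mobile drop: {mobile_drop:.0%}")
```

If both devices dropped together, or clicks fell too, you’d want to dig deeper – that pattern isn’t explained by the num=100 change.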
Traditional Search Still Dominates: The 210x Reality
September brought us some much-needed perspective amidst all the AI search hype. New research confirmed that Google still handles approximately 16.4 billion daily searches, compared to ChatGPT’s roughly 66 million daily search-like prompts (with only 21.3% being true search-intent queries).
So do the maths: Google is about 210 times bigger than ChatGPT in actual search volume.
Moreover, even DuckDuckGo, a relatively minor player in the search engine world, generates more referral traffic than ChatGPT currently does.
Studies from the Nielsen Norman Group showed that growth in AI adoption has been slowing, with no month since September 2024 seeing more than 10% month-over-month increase in usage. The average number of searches per user on traditional search engines has actually increased over time, suggesting AI search is complementing rather than replacing traditional search behaviour.
What this means for you: Don’t abandon your traditional SEO strategy to chase AI trends. Yes, AI search matters and is growing. Yes, you should optimise for it. But the vast majority of search traffic, we’re talking 99%+ of actual search volume, still comes through traditional search engines.
Your resource allocation should reflect this reality:
- Primary focus (80%+): Traditional Google SEO,
- Important secondary focus (10-15%): AI Overviews in Google,
- Experimental focus (5-10%): ChatGPT and other AI platforms.
Adjust these percentages based on your actual traffic sources and business results, but don’t let the hype distort your priorities.
ChatGPT Market Dominance in the UK
According to Statcounter data released in September, ChatGPT enjoys an 80.55% share of the UK AI chatbot market. Microsoft Copilot holds a distant second place at 10.41%, whilst Perplexity, Google Gemini, Claude, and DeepSeek combined account for less than 10% of market share.
This dominance is important for understanding where to focus any AI search optimisation efforts beyond Google. If you’re going to invest time understanding and optimising for AI platforms, ChatGPT should be your priority.
What this means for you: Make sure you haven’t blocked OpenAI’s crawlers in your robots.txt file unless you have specific reasons to do so. GPTBot is the training crawler and OAI-SearchBot is the crawler behind ChatGPT search; blocking them prevents your content from being used in ChatGPT’s training and from appearing in ChatGPT search results respectively.
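A quick way to check, using only Python’s standard library, is to parse your robots.txt and ask whether GPTBot is allowed to fetch a given page. The robots.txt body and paths below are hypothetical examples – substitute your own file:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt that allows GPTBot but blocks /admin/ for others.
robots_txt = """\
User-agent: GPTBot
Allow: /

User-agent: *
Disallow: /admin/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Check whether OpenAI's training crawler may fetch a page on your site.
print(rp.can_fetch("GPTBot", "/services/"))
```

The same check works for any other crawler user-agent string you care about – just swap the first argument to can_fetch.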
If you want visibility in AI search beyond Google, focus on ChatGPT first, followed by Gemini, then other platforms based on your audience and business type.
Search Quality Rater Guidelines Updated Again
Google updated its Search Quality Rater Guidelines in September, expanding the definition of YMYL (Your Money or Your Life) topics related to “society”. The updated language now includes more explicit coverage of topics that could negatively impact groups of people, issues of public interest, and trust in public institutions.
Additionally, Google added multiple examples of AI Overviews results throughout the documentation, complete with corresponding queries, user intent, and rating guidelines. This shows how seriously Google is taking the evaluation of AI-generated search features.
What the QRG updates mean for you: If you operate in YMYL categories like health, finance, legal, news, and civic issues, the standards for E-E-A-T (Experience, Expertise, Authoritativeness, Trust) remain extremely high. You need:
- Clear author credentials and expertise,
- Accurate, well-researched information,
- Transparent sourcing and citations,
- Regular updates to maintain accuracy,
- Demonstrable real-world experience.
For AI Overviews specifically, Google is clearly evaluating these features alongside traditional search results, suggesting they’re held to similar quality standards.
AI Search Adoption Slowing But Not Stopping
Research from Nielsen Norman Group published in September revealed some interesting patterns about AI search adoption:
- Growth in AI adoption has slowed, with no recent month showing more than 10% month-on-month (MoM) increase,
- Despite slower growth, usage hasn’t declined – it’s still increasing, just at a more moderate pace,
- AI search appears to be settling into a complementary role rather than a replacement role,
- Different demographics use AI search differently, with younger users more likely to adopt.
This data suggests we’re moving past the “shiny new toy” phase into a more mature understanding of when and why people use AI search versus traditional search.
What this means for you: AI search isn’t a flash in the pan, but it’s also not the immediate revolution some predicted. It’s a gradual evolution. This gives you time to adapt methodically rather than panic-react. Focus on building quality fundamentals that work across all platforms rather than chasing platform-specific hacks.
AI-Generated Images: No Direct Penalty, But Lower Value
Google’s Gary Illyes clarified in interviews during September that using AI-generated images doesn’t result in a direct SEO penalty. However, he confirmed that Google’s systems are developing ways to identify AI-generated images, and they’re likely treated similarly to stock photography – acceptable, but with less SEO value than original, unique images.
What AI-generated images mean for you: If you’re using them:
- They won’t get you penalised,
- They won’t help you rank in image search as well as original photos,
- Mix them with original photography for best results,
- Focus on image SEO basics: descriptive filenames, proper alt text, appropriate context.
For local businesses, original photos of your team, premises, work, and locality will always outperform AI-generated or stock images.
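For those image SEO basics, here’s what “descriptive filename, proper alt text, appropriate context” can look like in practice – the filename, dimensions, and alt text below are invented examples, not a prescription:

```html
<!-- Descriptive filename, specific alt text, and explicit dimensions
     to avoid layout shift; lazy-load images below the fold. -->
<img src="/images/boiler-installation-farnborough-team.jpg"
     alt="Two engineers installing a combi boiler in a Farnborough kitchen"
     width="1200" height="800" loading="lazy">
```

The surrounding page copy should reinforce the same subject, so the image sits in context rather than being decorative filler.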
“The Great Decoupling” Explained
Throughout September, SEO professionals discussed what’s being called “the great decoupling”—the phenomenon where search impressions rise but clicks remain flat or decline. Many had attributed this entirely to AI Overviews taking clicks.
However, the num=100 parameter change revealed that tool-related bot activity had been artificially inflating impression counts, particularly on desktop. This means some of the “decoupling” we thought we were seeing was actually a measurement issue rather than purely an AI Overview impact.
That said, AI Overviews are genuinely causing some click decline – we have plenty of data confirming this. But the situation is more complex than initially thought, and some of what we were measuring as “lost clicks” was actually just bot activity being removed from the data.
What this means for you: Be cautious about over-interpreting single metrics. Impressions, clicks, CTR, and rankings all tell part of the story, but you need to look at the whole picture. Focus on business outcomes – leads, sales, conversions – not just intermediate metrics that can be affected by technical changes or measurement issues.
The Big Picture: Perspective Over Panic
September 2025 provided some much-needed perspective on where we actually are in the evolution of search:
- Traditional search remains dominant: By a massive margin. Don’t abandon what works for experimental channels that may or may not scale,
- AI search is real but complementary: It’s growing, it matters, but it’s not replacing traditional search in the near term,
- Quality standards are intensifying: The spam update showed Google is serious about enforcing quality standards, particularly around AI content,
- Measurement complexity is increasing: Technical changes can affect your data. Don’t react to every fluctuation without understanding the cause,
- Fundamentals work across platforms: Good content, proper structure, genuine expertise—these work for traditional search and AI search alike.
What You Should Do Now
Based on September’s developments:
Regarding the spam update:
- If you recovered, maintain your quality standards,
- If you’re still impacted, continue improving content quality and be patient,
- Review your AI content use to ensure human expertise is prominent.
Regarding tracking and measurement:
- Use Google Search Console as your primary data source,
- Cross-reference multiple tools rather than relying on just one,
- Focus on trends over time rather than day-to-day changes,
- Track business outcomes, not just SEO metrics.
Regarding AI search:
- Keep it in perspective – traditional search is still your main opportunity,
- Don’t block AI crawlers unless you have specific reasons,
- Optimise for both traditional and AI search using the same quality fundamentals,
- Monitor ChatGPT visibility if you have the tools to do so.
Regarding content quality:
- Ensure AI-assisted content includes genuine human expertise,
- Use original images where possible, especially for local businesses,
- Demonstrate clear E-E-A-T signals if you’re in YMYL categories,
- Create content that genuinely helps users, not just search engines.
Need Help Making Sense of Your Data?
If you saw confusing data changes in September – drops in impressions, changes in rankings, or conflicting signals from different tools – and you’re not sure whether it’s a real problem or a measurement issue, I can help you make sense of it.
Sometimes the biggest challenge in SEO isn’t improving your site, it’s understanding what the data is actually telling you. With technical changes like the num=100 parameter removal, it’s easy to misinterpret what’s happening and make unnecessary changes.
For local businesses in Farnborough, Hampshire, and the surrounding areas, I offer comprehensive analytics reviews, SEO audits, and ongoing consulting to help you focus on what actually matters for your business.
Give me a ring on 01252 692 765 or drop me a message through my contact form. Let’s have a conversation about your website’s performance and ensure you’re focusing your efforts where they’ll have the greatest impact.
Paul Mackenzie-Ross is an SEO consultant based in Farnborough, Hampshire, specialising in helping local businesses improve their online visibility and attract more customers through search engines and AI platforms.