Another Fall in Organic Traffic Because People Get What They Need from Generative Summaries
Last November, I wrote about the impact generative AI was having on technology websites. Things have become tougher since then with the introduction of generative summaries. Take Figure 1 as an example. I asked Google a question and, instead of responding with a list of websites that might contain good answers, Google generated a summary overview of the available information. There’s no need to go anywhere near the article that I published on June 6 because the summary contains enough information to answer the question for most people.

Bing has its own take on generative summaries. I didn’t use it as an example because Bing search results are so horribly bad, especially when it comes to finding content on my site.
The result of the Google changes is a further decline in website traffic. And it’s not just me saying that this is the case. A recent Bain & Company survey found that “80% of US consumers rely on ‘zero-click’ search results, meaning they get the information they need from the search engine’s results page and don’t click through to another website.”
Bain attributes the change in user behavior to the effect of AI search engines and generative summaries, resulting in a 15% to 25% reduction in organic web traffic, or page views created by people who find a website through unpaid search engine results (the listings displayed by Google, Bing, and other search engines) rather than through paid advertising or other marketing channels.
Why Does Falling Organic Traffic Matter?
The thing about generative AI is that it can only generate based on knowledge that exists in its LLMs or that it can find on websites. Generative AI doesn’t create new knowledge. To some extent, generative AI steals and reuses the work done by many people to understand, analyze, document, and discuss information about all the different topics indexed by the search engines, and eventually creates those generative summaries from that work.
The model worked when search engines directed everyone to the source websites. Those who write are happy that the web views recorded for their site reflect interest in their work. They might also benefit from advertising on the site. Depending on the page views, the revenue from advertising might be enough to live on. More usually, it might cover the domain and hosting fees.
Sites run by commercial companies to publicize their offerings commonly publish information to attract people to the site. The quality of the information varies greatly. Some (CodeTwo Software is an example in the Microsoft 365 space) is well written and very useful. Other sites hype up the problems solved by their current product (the need to spend lots of money to manage Entra ID apps is a common theme today) or dramatically over-emphasize why their product is needed. One example in that category is a site that tells people to run the EDBUTIL utility to defragment Exchange Server databases (last needed with maybe Exchange 2003).
From what I can see from the data for several websites, new content still receives attention and high page views because it is often linked from notifications sent via email, Twitter, Bluesky, or other channels. A few days later, that material will be absorbed by AI and become less valuable in terms of driving the page views that search engines once sent to the host sites.
Writers Will Stop Sharing Content
The point is that if people and companies don’t see a return on their investment, they won’t write as many articles as they have in the past. A well-written and researched article might take four to six hours to put together, and longer if some PowerShell or other code examples are needed. Who wants to put in that effort, or pay writers to do that work, if page view numbers continue to fall month-over-month? Life is too short to throw away hours of effort for no reward, fiscal or just the pleasure of knowing that people read your content.
A real strength of technical communities focused on topics like Exchange, SharePoint, Teams, and development technologies has been the willingness of people to share their knowledge and expertise openly, rather than through paid subscriptions to Substack or Patreon sites, where exclusive access to content can be offered, perhaps for a period before open publication.
If open access to knowledge weakens, we will all be worse off. No amount of generative AI can guide people to a solution that hasn’t ever been documented. The information in the LLMs will gradually degrade because less new knowledge is being publicly shared. Over time, new knowledge might become less and less available to the LLMs and generative AI will become less valuable because it can only output old material.
Publishing the 2026 Edition
For now, the content shared on office365itpros.com will remain public and open to all. I have considered using Substack to host articles that aren’t related to book updates, with free subscriptions to that content for people who buy the Office 365 for IT Pros eBook. We might still go down that route, but for now we’re concentrating on publishing the 2026 edition on July 1, 2025.
I’m interested in hearing what people think about the effect AI has on content that many depend on to do their job. Please let us know your thoughts by posting a comment.
I totally agree with you, and that’s a real challenge for content creators: if search engines compile a zero-click answer, users are no longer sent to the source. It will affect the business model of ad-supported content sites. Fewer page views, fewer ads, less income. And it really is the question: will Google, Bing, and others give credit, or even pay for using that data, because they need “fresh authentic content” for the future? I’m not sure if an AI can learn and grow from other AI-generated content. Maybe Google and other AI companies will hire “human writers” as content creators for their AI? Who knows.
I have run http://www.msxfaq.de for nearly 30 years now and do not monetize it directly. It is a “proof of work,” and potential customers might like it and hire me. But even that depends on visitors.
But Tony, we are both 50+. Let’s step back and watch how the next generation solves this 🙂
I agree that the issues caused by website harvesting by AI LLMs won’t cause me too much bother simply because I’m at the back end of my career. But I do care about the future of the technical community and how the actions of some very large companies deprive some websites of the chance to grow and prosper as they share invaluable technical information.
Tony, you are a MVP. Can’t Microsoft pay you for your free-of-charge instructions, tips and information?
Nope. But I don’t write for Microsoft to pay me as part of the MVP program. I write because I like to, and because the activity supports the development of the Office 365 for IT Pros eBook. It is helpful to write an article about a topic and then take the topic to the book. The text in the book might be longer (more detailed) or shorter than an online version, and Office 365 for IT Pros is updated monthly unlike web articles, but writing the initial article helps to frame the topic and clarify what’s important.
MVPs are not paid for their contribution to the program, nor should they be. However, MVPs are assessed on their contribution to the technical community, and if publishing not-for-payment articles on websites becomes less attractive because of falling page views, then there will probably be less of this activity and less knowledge will be shared. That would be sad.
I wonder what’s in it for Google to stop people from going to sites and have them just read everything from the AI summary? Do they serve their own ads to users then? I don’t see any. Also, Google must realize the same thing: that content quality will go down over time. So they must have something in mind to continue with this approach, and it’s probably not pretty either.
By the way, your site is hard to find on Google too. On multiple occasions I knew I had read some news on your site at home, and then at work, trying to find that article, I would never get your site in the results and would have to just go to your site and scroll through recent articles. Maybe because the name uses general terms. I never remember it exactly, so I search for “office 365 pro” or something and get a lot of sites, but not yours 🙂
I think the issue with Google finding office365itpros.com is that each domain has a rating that conveys how authoritative it is. That rating takes years to build, and the lower the rating, the lower articles appear in Google’s list. For example, Microsoft’s domain (naturally) has a high rating, so its articles usually appear first or second in a list.
I am never satisfied with the response and summary from generative AI unless they are supported with links to credible websites like office365itpros, practical365, or petri, or Microsoft itself. I always make it a point to visit the source and read the section or the entire article to satisfy my query. I sometimes prefer the other sites over Microsoft because they are well written, as if the author is talking to the reader. That is also the idea behind generative AI. But it is not the written word of an expert who has observed the technology at work first-hand and writes about the science behind the technology based on experiments. Generative AI often reinforces whatever underlying assumption is embedded in a prompt. It is not a person with a complete understanding of a concept. It is combining its knowledge of language and grammar with what it can find on the web. It is not an expert, and so it can hallucinate or commit blasphemy (in technology terms). I use generative AI for its ability to search better. From experience, I have found Perplexity, Gemini, and ChatGPT do it better than Copilot. Especially Perplexity, for its citations.
Generative AI cannot create. It can only repeat. And what it repeats can be wrong – very wrong – if the text it chooses is wrong. Generative AI can put some lipstick on the pig by making sure that grammar and spelling are correct, but it doesn’t have the domain expertise or awareness of how things interact inside a complex ecosystem like Microsoft 365 to realize when problems exist in text that it finds. And that’s how generative AI can deliver poor, misleading, inaccurate, or just plain wrong information in its responses. Like any data processing project, crap in equals crap out. If AI has great input information (like the reference documents you can give to many chat apps), then it can do a good job, just like Copilot does when it processes Outlook message threads or Teams meeting transcripts.
I wholeheartedly agree. I too see my site visits decline. As a fellow-MVP I love sharing stuff but have become more reluctant to do so given the Search engine trend to display AI generated summaries. Why bother writing new content if it gets stolen, despite copyright notices on each and every web page?
That being said, I have successfully fixed one issue: my site traffic had skyrocketed when the first AI bots started to crawl. Not only did they steal my content, they also cost me extra hosting money! I fixed this by adapting my robots.txt, which has brought down my site data consumption by 75% (!).
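[For anyone who wants to try the same fix: a robots.txt that blocks the best-known AI training crawlers might look something like the sketch below. The user-agent tokens shown are the ones the respective vendors have published, but any list like this goes stale quickly and needs periodic review; blocking also relies on the crawlers honoring robots.txt.]

```
# Block common AI training crawlers (names current as of writing; review regularly)
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: ClaudeBot
Disallow: /

# Allow all other crawlers, including normal search indexing
User-agent: *
Disallow:
```

Note that blocking Google-Extended affects use of content for Google’s AI products but does not remove a site from normal Google Search indexing.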
Is there any initiative against this (IMO bad) trend? If not, we should start one.
Good luck fighting against Meta, Google, and Microsoft… Putting good content behind a paywall seems like the only way forward to me.