When AI eats itself, what’s the future of content?

The media landscape is in flux. Again. Only this time, it feels more existential.
Over the past year, we've watched familiar titles disappear, newsrooms shrink, and journalists pushed out of roles they love. It was sad to read the news of Business Insider's layoffs and the closure of TechCrunch's European operation, attributed to 'realignment and reinforcement'.
At the same time, we've seen a quiet but profound shift in how content is produced. AI is no longer just a helper; increasingly, it's the author. And that brings us to a new kind of crisis.
Some outlets have already been caught publishing entirely AI-generated articles, complete with fake quotes, made-up experts, and fabricated facts.
What’s more troubling is how easily these stories slipped through. They feel too real, and that’s exactly the problem.
Because if AI keeps training itself on content made by other AI tools, a downward quality spiral is inevitable. We'll end up with a self-referential loop where errors compound, context disappears, and truth becomes… well, less and less meaningful. And when generative models pull from these flawed sources to answer questions, make decisions, or even shape reputations, that's where things get dangerous.
Why brands should care
This is more than just a media industry issue; it's a brand reputation issue, and brands need to pay attention. What appears online about your company, your leaders, your values – whether it's on a news site, in an AI-generated summary, or in a chatbot result – can morph in ways you didn't intend and can't easily correct.
Think back to a time when you Googled your company and found outdated or misleading information. Now multiply that risk tenfold in a GenAI-powered world. The potential for hallucinations means brands could wake up to find their story rewritten, literally.
So, what now?
This is not a doom-and-gloom manifesto. This is a call to attention. The media still matters. Arguably, it matters more than ever.
Because for all the speed and efficiency AI promises, it lacks something fundamental: judgment. It can’t ask tough questions. It can’t challenge assumptions. It can’t sniff out the spin. That’s what journalists do. They are, and always have been, our societal bullshit detectors. And that role is critical if we want to keep trust, nuance, and truth in the equation.
Let’s not rush. Let’s think.
We don't need to hit pause on progress. But we do need to slow down the blind acceleration. Let's approach AI with critical thinking, not just excitement. Let's build systems that elevate credible sources, not just the most clickable ones. And let's invest in the media as a foundational pillar of the new digital landscape.
In short, the media must not be overlooked in the GenAI era. Because if we want to live in a world where truth still matters, we’re going to need it more than ever.