Real news, real insights – for small businesses who want to understand what’s happening and why it matters.

By Vicky Sidler | Published 4 February 2026 at 12:00 GMT+2
There’s something oddly familiar about the noise around AI right now. It’s loud. It’s everywhere. And it’s starting to smell a bit like crypto just before that one cousin you never hear from tried to sell you an NFT of a cartoon slug.
When a technology promises to change everything overnight, you can usually count on two things: someone’s making a fortune, and someone else is about to be blamed for it going wrong. AI is no exception. If you’ve felt a creeping sense of confusion while watching executives use the word “disruption” like it’s a tax deduction, you’re not imagining things.
Cory Doctorow, the author and longtime tech critic, recently laid it all out in a lecture adapted for The Guardian. His argument? AI isn’t the miracle we’re being sold. It’s a bubble—and when it bursts, the fallout won’t be philosophical. It’ll be economic.
Let’s walk through what that means for actual business owners, the ones who don’t get paid to use words like “synergy” with a straight face.
Key takeaways:
- Big tech is using AI to inflate stock prices, not help your business.
- Most AI tools can’t replace humans—they just shuffle the work and the blame.
- When the bubble bursts, some tools will still be useful. Most won’t.
- The goal isn’t to ban AI. It’s to stop bosses from using it to justify bad decisions.
👉 Need help getting your message right? Download the 5-Minute Marketing Fix.
The AI Bubble Will Burst—Here’s What Small Businesses Should Learn
In this article:
Why AI Is Being Pushed So Hard (Hint: It’s Not About You)
Reverse Centaurs and Other Nightmares
The AI Isn’t Smart. It’s Just Confident.
AI Art Isn’t the Product. It’s the Billboard.
Don’t Bet Your Business on Hype
Frequently Asked Questions About the AI Bubble and Small Business
1. What is the AI bubble and why does it matter for small businesses?
2. Can AI really replace employees in a small business?
3. Are there any AI tools that are actually useful for small business owners?
4. Why are big companies investing so much in AI if it doesn't work well?
5. What does "reverse centaur" mean and how does it apply to business?
6. Will AI-generated content affect copyright rights for my business?
7. Should I stop using AI altogether in my marketing or operations?
8. How do I decide whether an AI tool is worth using?
Why AI Is Being Pushed So Hard (Hint: It’s Not About You)

Big tech companies are terrified of becoming ordinary. The day Google or Meta stops growing is the day their stock prices take a nosedive—and no one wants to be the executive blamed for that particular $240 billion loss. So they invent new stories. First it was crypto, then the metaverse, then NFTs. Now, it’s AI.
The pitch goes like this: fire half your team, pay an AI company to do the same work, and pocket the difference. Investors love it. CEOs love it. And your cousin Graham, who once tried to sell you Dogecoin during a family funeral, really loves it.
But as Doctorow points out, this version of AI doesn’t actually exist. Most AI systems can’t do the job they’re being hyped to replace. What they can do is just enough to convince someone else—usually a boss—that it’s fine to pretend they can.
Reverse Centaurs and Other Nightmares

In theory, AI should help us do our jobs better. That’s the centaur model: human and machine, working together. But what’s really happening is the opposite. Humans are being turned into support staff for machines that aren’t as smart as we think.
Imagine being a radiologist asked to double-check every AI diagnosis. If the AI catches something, great. If it misses something, you take the fall. You’re not enhanced. You’re expendable. That’s what Doctorow calls a reverse centaur, and it’s not a future anyone should be cheering for.
This model doesn’t stop at healthcare. It’s everywhere: delivery drivers tracked for blinking too much, software engineers reviewing machine-written code whose mistakes look exactly like working code, artists told their jobs “shouldn’t have existed anyway.”
The AI Isn’t Smart. It’s Just Confident.

AI systems don’t understand context. They don’t think. They pattern-match. That’s it. Which means they’re prone to “hallucinations”—a polite way of saying they make things up that sound correct but aren’t.
In software, for example, AI will often invent code libraries that follow a predictable naming format—but don’t actually exist. Hackers know this. They create malicious versions with those fake names. The AI pulls them in, the program runs, and suddenly your customer data is being streamed to someone’s basement in Belarus.
The people most qualified to catch those bugs? Senior developers. The same ones getting laid off in bulk to justify the AI investment. It’s almost poetic, in a Greek tragedy kind of way.
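The simplest defence against this kind of attack is procedural, not clever: treat every AI-suggested dependency as unverified until a person has vetted the name. Here is a minimal sketch of that idea; the allowlist and the package names below are made up for illustration, and a real setup would pin versions and hashes too.

```python
# Illustrative guard against hallucinated dependencies: an AI-suggested
# package only gets installed if a human has already vetted the name.
# The allowlist and the suggested names are invented for this example.

VETTED_PACKAGES = {"requests", "numpy", "pandas"}  # maintained by a person

def safe_to_install(package_name: str) -> bool:
    """Return True only for packages on the human-approved allowlist."""
    return package_name.lower().strip() in VETTED_PACKAGES

for suggested in ["requests", "requests-auth-helper-pro"]:
    if safe_to_install(suggested):
        print(f"{suggested}: vetted, OK to install")
    else:
        print(f"{suggested}: unknown name - stop and check it by hand")
```

It looks almost too simple, but that is the point: the attack only works because nobody pauses between “the AI said so” and hitting install.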
AI Art Isn’t the Product. It’s the Billboard.

You’ve seen AI-generated images: wolves with six legs, eyeballs looking in twelve directions, everything tinted like someone spilled acid on a Pinterest board. These aren’t meant to replace artists. They’re meant to impress investors and frighten creative workers into saying, “Wow, this tech is amazing.”
Doctorow points out that illustrators aren’t exactly raking in millions. Replacing them doesn’t save companies money. It just sells the idea that AI is powerful and inevitable.
But AI art has no intent. No message. No real expression. It feels like something should be there—but there isn’t. As Doctorow puts it, it’s eerie. It pretends to be meaningful because it mimics what meaningful things look like.
This kind of performance is useful if you’re trying to inflate a valuation. It’s less useful if you’re, say, trying to market a business that doesn’t want to look like it was designed by a bored robot with a glitch.
Some people think the solution is new laws—making AI training illegal unless you get permission. Doctorow isn’t buying it.
Giving more copyright power to artists sounds good, until you realise who really owns the rights. Spoiler: not the artist. If anything, it just gives big media more control.
What actually works? Forcing companies to pay humans. Right now, work generated purely by AI can’t be copyrighted in the US. That means if Disney uses AI to make a film, anyone can legally copy and sell it. The only way the studio gets protection is by hiring humans. That’s where we hold the line.
Eventually, this bubble will pop. The data centres will be sold. The companies will shrink. The stock prices will fall. But it won’t all be a loss.
What we’ll likely keep are the smaller, practical tools: auto-captioning, image cleanup, transcription, and summarisation. These will run locally. They’ll be affordable. And they’ll be useful in the same way that spellcheck and screen recorders are useful—helpful, but not headline-worthy.
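To make “helpful, but not headline-worthy” concrete, here is a toy extractive summariser in plain Python: it just keeps the sentences whose words appear most often in the text. Real local tools are more sophisticated, but they run on this kind of boring, inspectable logic rather than magic.

```python
import re
from collections import Counter

def summarise(text: str, keep: int = 2) -> str:
    """Naive extractive summary: rank sentences by how frequent their
    words are across the whole text, then keep the top `keep`
    sentences in their original order."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    freq = Counter(re.findall(r"[a-z']+", text.lower()))

    def score(sentence: str) -> float:
        tokens = re.findall(r"[a-z']+", sentence.lower())
        return sum(freq[t] for t in tokens) / (len(tokens) or 1)

    top = sorted(sentences, key=score, reverse=True)[:keep]
    return " ".join(s for s in sentences if s in top)

article = ("AI tools can save time. AI tools can also waste time. "
           "Coffee is nice.")
print(summarise(article, keep=1))  # keeps the highest-scoring sentence
```

Twenty lines, no cloud account, no subscription, and you can read exactly what it does—which is more than can be said for most of what’s in the pitch decks.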
The real win is using these tools to support human work, not replace it. That’s what small businesses should be watching for. Not the flash, but the function.
Don’t Bet Your Business on Hype

If you’re a business owner wondering whether to jump on the AI bandwagon, ask yourself a simpler question: does this tool save me time without creating more problems?
If it does, great. Use it. If it doesn’t, or if it only sounds good in a pitch deck, move on. The last thing you need is to become the “human in the loop” blamed for an AI screw-up that never needed to happen.
And if the hype is making your messaging harder, not easier, it might be time to cut through the noise.
Download my 5-Minute Marketing Fix and get one clear sentence that explains what your business does, why it matters, and how you actually help.
Because you don’t need to sound futuristic. You just need to be understood.
Related reading:

This expands on the AI bubble article by showing how AI tools are already being rolled out with little regulation. If you’re in South Africa, or anywhere similar, you need to know what’s happening behind the scenes.
Once you’ve grasped the hype problem, this dives into the hidden cost: your data. Learn what you’re handing over when you use “free” AI tools—and what that means for your business.
You’ve read why most tools can’t replace humans—now see the test results. This article gives real-world examples of what AI gets right (and wrong), so you can make smarter decisions.
Knowing what not to do is only half the battle. This post lays out what actually works right now for marketing small businesses—no buzzwords, just results.
Doctorow mentioned copyright and AI. This post shows you how copyright actually plays out in real life when you get a threatening email. It’s the operational side of the policy conversation.
1. What is the AI bubble and why does it matter for small businesses?
The AI bubble refers to the inflated hype and investment around artificial intelligence—especially tools that are marketed as replacements for human workers. It matters because small businesses are being pushed to adopt tools that may not deliver on their promises.

2. Can AI really replace employees in a small business?
Not realistically. Most AI tools aren’t capable of replacing skilled human judgment. They might support basic tasks, but relying on them to fully replace people usually leads to poor outcomes and new problems.

3. Are there any AI tools that are actually useful for small business owners?
Yes. Tools that handle transcription, summarising, basic image editing, or admin automation can genuinely save time. The key is to use AI for support, not decision-making or client-facing work.

4. Why are big companies investing so much in AI if it doesn't work well?
Because stock prices rely on future growth. When large companies run out of new markets to dominate, they promote the next big thing—like AI—to keep investors excited, even if the tech isn’t ready.

5. What does "reverse centaur" mean and how does it apply to business?
A reverse centaur is when a human serves a machine, not the other way around. In business, it means workers are forced to supervise or clean up after AI tools that don’t work properly, while still being blamed for mistakes.

6. Will AI-generated content affect copyright rights for my business?
Yes. In the US, AI-generated works aren’t eligible for copyright. That means if your business uses AI to create content, others can legally copy and reuse it. To keep copyright protection, your content must show clear human authorship.

7. Should I stop using AI altogether in my marketing or operations?
Not necessarily. Use it for what it’s good at—repetitive tasks, drafts, data summaries. But avoid relying on it for final outputs, legal decisions, customer communication, or anything that requires critical thinking or nuance.

8. How do I decide whether an AI tool is worth using?
Ask yourself whether the tool saves time without adding risk or confusion. Test it on low-stakes tasks first. Check who owns the data you upload. And make sure it complements your team—not replaces the parts your customers actually value.

Created with clarity (and coffee)