Real news, real insights – for small businesses that want to understand what’s happening and why it matters.

By Vicky Sidler | Published 3 March 2026 at 12:00 GMT+2
If you have ever watched a teenager try to navigate their own neighborhood without a GPS app, you have seen the future of professional expertise. It is a world where we arrive at the correct destination but have absolutely no idea how we got there or what we passed along the way. In the world of higher education, a similar fog is rolling in.
While we have spent the last year panicking about students using ChatGPT to cheat on their midterms, we might be missing a much larger disaster. According to a recent study by researchers Nir Eisikovits and Jacob Burley from the Applied Ethics Center at UMass Boston, the real danger of AI in universities is not that students are getting unearned A's.
The real danger is that the very machinery of learning is being dismantled and replaced by a very efficient, very fast, and very hollow simulation of intelligence.
The focus on "cheating" ignores the bigger problem of how AI changes the way we learn.
Non-autonomous AI handles data but often hides bias and privacy risks behind a "black box."
Hybrid AI acts as a partner but can lead to "cognitive offloading" where we stop practicing hard skills.
Autonomous agents threaten the "ecosystem of practice" that turns beginners into experts.
Small businesses risk losing their "on-ramps" for new talent if AI does all the entry-level work.
👉 Need help getting your message right? Download the 5-Minute Marketing Fix.
Why AI is Quietly Eroding Your Team’s Ability to Think
In this article:
The Three Flavors of AI Automation
The Rise of the Hybrid Assistant
The Disappearing On-Ramp for Expertise
Why Marketing Shortcuts Are Actually Growth Killers
Protecting Your Business Ecosystem
The Three Flavors of AI Automation

To understand how this affects your business, we first have to look at how these systems currently operate inside our institutions. The researchers break AI down into three categories based on how much power we give the machines.
The first is non-autonomous AI, which is essentially high-speed administrative plumbing. These are the systems that sort through piles of data to flag "at-risk" students or optimize a schedule.
In your business, this is the software that filters resumes or manages your inventory. While these tools save time, they are often "black boxes."
A black box is a system where you can see what goes in and what comes out, but you have no idea how the computer made its decision. This makes it very easy for hidden biases to creep in without anyone noticing until it is too late.
If you do not know why the machine picked Candidate A over Candidate B, you are not managing a process. You are just watching a magic show and hoping for the best.
The Rise of the Hybrid Assistant

Moving up the ladder, we find hybrid AI systems. These are the chatbots and writing assistants that feel like a helpful intern who never sleeps.
Students use them to summarize dense books, and faculty use them to draft syllabuses. This is where the "cheating" debate lives, but the deeper issue is something called cognitive offloading.
Cognitive offloading is a fancy way of saying we are outsourcing the "struggle" of thinking to a computer. It feels productive because the work gets done faster, but the brain is a muscle that only grows when it meets resistance.
If a student uses AI to write a paper and a professor uses AI to grade it, we have created a closed loop where no human is actually thinking.
In a small business, if your marketing assistant uses AI to write every post and you use AI to approve them, your brand loses its soul. Even worse, your team loses the ability to spot a bad idea when it sees one.
The Disappearing On-Ramp for Expertise

The most advanced stage involves autonomous agents. These are systems that do not just assist but actually act as independent researchers or workers.
While this sounds like a productivity miracle, it creates a massive problem for the future of expertise. Universities and businesses are not just "information factories" that spit out degrees or products.
They are ecosystems of practice: places where people learn by doing the boring, repetitive, entry-level work.
If an AI handles all the "routine" tasks, we are effectively removing the "on-ramp" that allows a novice to become an expert.
If a junior designer never has to struggle with the basics because the AI does it in seconds, they will never develop the judgment needed to handle complex problems later.
We are trading our future leadership for a few extra minutes of free time today.
Why Marketing Shortcuts Are Actually Growth Killers

As a StoryBrand Certified Guide and Duct Tape Marketing Consultant, I see this play out in marketing every day. Small business owners often tell me they love AI because it makes marketing "easy."
But marketing should not always be easy. The "productive struggle" of defining your customer's problem and clarifying your message is exactly what gives you a competitive advantage.
If you let a machine do all the heavy lifting, you end up with "AI Slop"—content that looks right but says nothing. Your business grows when your people grow.
If you automate away the learning process, you are essentially trading your company’s long-term intelligence for a short-term productivity boost.
Real growth happens when you stay engaged with the message, even when it is difficult.
Protecting Your Business Ecosystem

So, what is the purpose of a university or a small business in an automated world? If we view our companies as just "output machines," then full automation makes sense.
But if we view our businesses as communities that build human capacity and judgment, we have to be much more careful.
We owe it to our teams to ensure they are still doing the "work of learning." This means intentionally leaving room for "productive struggle."
We need to make sure our human experts are still "in the loop." We must ensure we are not accidentally destroying the pipeline of talent that will lead our businesses ten years from now.
Clarity in your business comes from deep thinking, not just fast clicking. If you want to ensure your message is sharp and your team is aligned without relying on a bot to guess what you mean, I have a tool for you.
Related reading:

1. AI Research Isn’t Better Than Yours (It Might Be Worse)
Over-reliance on AI weakens critical thinking and makes people worse at spotting bad logic. Read this to understand how to keep your critical edge in an automated world.

2. AI in Marketing Needs Human Thinking
AI is fast but it is not strategic. Discover practical ways to keep human brains in the loop to protect your brand's unique strategy and judgment.

3. AI Automates 80 Percent of Marketing—What Now?
When the routine tasks disappear, what is left for us to do? Explore how to preserve craft and learning inside a heavily automated workflow.

4. AI Business Advice: Why It Helps Some Owners but Hurts Others
Outsourcing your judgment to AI can lead to expensive misdiagnoses. Learn why human expertise is still the only way to navigate complex business challenges.

5. AI, Cybersecurity & Social Media Now Drive Small Business Growth
This 2025 report reveals how AI can support lean teams without replacing the essential human capacity-building that drives real long-term growth.
Frequently Asked Questions

Can AI actually stop people from learning?
Yes, through a process called cognitive offloading. When we use AI to handle the "struggle" of thinking—like drafting an essay or solving a complex problem—we skip the mental resistance required to build durable skills. Just as a GPS can make us less aware of our physical surroundings, over-reliance on AI can make us less capable of independent critical thinking.

What is the biggest risk of using AI in a small business?
The biggest risk is the erosion of expertise and judgment. If entry-level employees use AI to perform all their "routine" tasks, they miss out on the "on-ramp" experiences that turn novices into experts. Over time, your business may find itself with a shortage of people who actually understand the core mechanics of your industry.

How do I prevent AI from making my team less skilled?
The key is to keep humans "in the loop" by leaving room for productive struggle. Encourage your team to use AI as a starting point or a research assistant rather than a final producer. Set clear boundaries on which tasks require human drafting and ensure that all AI-generated output is rigorously checked for logic and context by a person.

Is using AI for marketing a bad idea?
It is not a bad idea if used strategically, but it becomes a "growth killer" when it replaces the hard work of clarifying your message. AI often produces generic "AI Slop" that lacks the unique brand voice and empathy required to truly connect with customers. Use it to brainstorm, but do the heavy lifting of messaging yourself.

What is a "black box" in AI?
A black box refers to an AI system where the user can see the input and the result, but the actual decision-making process inside the machine is hidden or too complex to understand. This is dangerous for businesses because it can mask data biases, leading to unfair hiring practices or flawed strategic decisions without anyone knowing why.

How do I get clarity in my marketing message?
Clarity comes from a deep understanding of your customer's problems and how you solve them. While AI can help summarize these points, the most effective way is to follow a proven framework like StoryBrand. You can start by focusing on one sharp sentence that explains what you do and why it matters.
👉 To get started, you can download the 5-Minute Marketing Fix for a practical, human-centered approach to clarity.

Created with clarity (and coffee)