AI Burnout Is an Emerging Crisis



Riley Coleman

Jan 2026

AI Burnout

the unseen impact of overwhelm

In this issue

AI burnout is on the rise


Drivers of AI Burnout


Downloadable Intervention Kit

The Silent Erosion

It’s 7 PM on a Tuesday.

You’ve just delivered the seventh iteration of a design that was already good at version three. The stakeholder asked for “just a few more variations”... because the AI can generate them in seconds, so why not?

You close your laptop and feel… nothing. Just a bone-deep, hollow exhaustion that no amount of sleep can fix.

Something doesn’t feel right.

You can’t quite put your finger on it. The work is getting done faster. The tools are impressive. The stakeholders are happy.

But you? You feel hollowed out. Unmoored. Like you’re moving at a pace your nervous system wasn’t built for, making decisions without the space to think, producing work that technically succeeds but feels… empty.

You’ve been searching for the language to name it. To respond to it. To explain to your manager why “just use AI to speed things up” makes you want to quit.

If this feels familiar, you aren’t alone. We are witnessing a quiet crisis in our industry. It’s what I call The Silent Erosion.

We were told AI would liberate us from “drudge work,” but instead, it has accelerated the pace of production without acknowledging that the human brain operates on a human metabolism, not a machine one. The truth is, your burnout isn’t a failure to keep up. It’s a biological protection mechanism against an impossible cognitive load.

Here’s the language you’ve been searching for.

The Diagnosis: What Is Different About AI Burnout

Traditional burnout—the kind we’ve studied for decades—had recognisable drivers: too much work, lack of control, misalignment between values and tasks. We understood it. We could name it.

AI burnout is different. It operates on a different axis entirely.


The Great Acceleration: The Productivity Paradox

Traditional burnout: Workload exceeds capacity. You have too many tasks to complete in the time available.

AI burnout: The productivity paradox - more work, at inhuman speed, with no recovery time.

Here’s the disconnect: Corporate leadership frames AI as a tool that reduces burnout by eliminating repetitive tasks. But this view overlooks what actually happens.

When a task that previously took four hours is reduced to four minutes by an AI copilot, the organisational response is rarely to grant you three hours and fifty-six minutes of cognitive recovery or “incubation” time.

Instead, the vacuum is immediately filled with demands for more tasks, more iterations, and a higher velocity of output.

But that’s not even the full picture. Because your job isn’t just “design work done faster.”

Your job is now:

  • Regular design work (which hasn’t gone away)
  • + Learning AI tools (constantly, because they change every few months)
  • + Adapting workflows in real time (because best practices don’t exist yet)
  • + Managing and quality-checking AI outputs (because you’re responsible for what it produces)
  • + Supervising what used to be your craft (which adds cognitive load, not removes it)

The workload is more. The pace is faster. And the recovery time is gone.

This is The Great Acceleration. It transforms design from a strategic, human-centred discipline into a commodity production line.

But here’s what the research doesn’t capture: It doesn’t feel right.

Your brain knows it needs time between creative acts - time to incubate, to reflect, to let ideas settle. AI eliminates that time. And without the language to name it, you can’t defend it. You just feel… wrong.

This isn’t just about pace. This isn’t just about workload. It’s about both, amplified, with the recovery mechanisms systematically eliminated.

Toxic Ambiguity: The Endless Iteration Loop

Traditional burnout: Lack of control over outcomes.

AI burnout: Loss of authority to define “done” and the erosion of creative confidence through toxic ambiguity.

AI’s ability to generate countless design variations in seconds has given rise to the “endless iteration loop.”

Stakeholders, presented with infinite options, begin to treat design outputs as “moving targets,” abandoning thoughtful critique for a high-speed cycle of preference-based feedback.

“Can we see it in blue?

What about this layout?

What if we tried…”

There’s always one more variation to explore. One more option to generate. And because the AI makes it trivially easy, stakeholders see no reason not to ask.

This dynamic strips you of your professional authority to define what constitutes “good” or “finished” work.

The research identifies this as toxic ambiguity - you’re trapped between the pressure for high-volume, rapid execution and a paralysing lack of strategic clarity or a stable definition of success. You’re expected to produce more, faster, but the goalposts keep moving.

But here’s what you feel: A creeping sense that your judgment doesn’t matter anymore.

If the AI can generate 50 variations in 30 seconds, how do you justify saying “this one is right”? You lack the language to defend the pause, the discernment, the judgment that used to be your value. So you keep iterating. And iterating. And iterating.

Designers report “never feeling finished” because there’s always one more variation to explore. The pause - the moment where you step back and say “this is right” - has been eliminated.

This isn’t just about iteration. It’s about the erosion of your creative confidence and professional authority.


The Identity Crisis: Responsibility Without Control

Traditional burnout: Misalignment between values and work.

AI burnout: Existential dread compounded by moral stress and the responsibility-control gap.

For years, your expertise was your craft. You knew typography. You understood colour theory. You could execute a vision with precision. That craft was your identity.

Now, the AI handles the pixels. It generates the layouts. It suggests the variations. And you’re left supervising, approving, overriding.

You’re responsible for the output. You’re accountable for the quality. But you didn’t make it - the AI did.

This creates what the research calls the responsibility-control gap: You’re held accountable for the ethical, legal, and functional outcomes of products shaped by AI. But you lack full control or understanding of the “black-box” systems generating those outcomes.

When something goes wrong, you lack the language to explain why you approved it, why you trusted it, why you didn’t catch the error. You’re responsible, but not in control. And that gap is psychologically unbearable.

But it goes deeper than accountability.

According to Self-Determination Theory, human well-being at work hinges on three basic psychological needs:

autonomy, competence, and relatedness.

AI adoption, when done poorly, directly undermines the first two:

  • Autonomy is eroded when AI is positioned as an “authority” that dictates creative direction. Your professional judgment becomes secondary to the algorithm’s output.
  • Competence is undermined when AI handles the craft. The skills you spent years developing - the ones that made you valuable - are automated. What’s left?

This isn’t just about “learning new skills.” It’s about the erosion of your professional identity.

When your expertise was your craft, and the craft is automated, your brain screams: “What am I for?”

And then there’s moral stress.

Moral stress is the tension between your moral convictions and the actions you’re expected to take within an AI-augmented system. It’s triggered when:

  • You’re pressured to accept and implement AI-generated recommendations, even when those outputs conflict with your professional judgment, ethical standards, or human-centred design principles
  • You’re asked to approve work you didn’t create and can’t fully explain
  • You’re held responsible for outcomes shaped by opaque systems

Repeated exposure to this stress doesn’t just create burnout - it creates moral injury, a condition characterised by the erosion of your moral identity and deep emotional exhaustion.

This is the Identity Crisis. And it’s not just about skills. It’s about who you are as a designer.

The Contradiction

You are being asked to operate at machine speed. But your nervous system, your creativity, your judgment - these operate on a human metabolism. And that mismatch is unsustainable.

And until now, you lacked the language to name it.

So here it is: The Silent Erosion.

The elimination of the pause. The space where creativity, ethics, and trust are built. The thing that separates “AI-generated output” from “work that matters.”

Now you have the language.

The Permission: This Future Is Not Inevitable

Here’s what you need to understand:

We are being sold a vision of the future where humans operate at machine speed. But that future is not inevitable - it’s a design choice. And it’s a bad one.

We can choose differently.

The organisations pushing this vision want you to believe that if you don’t adopt AI at breakneck speed, you’ll be left behind. That your competitors are moving faster. That the market demands it.

But here’s the truth - the one you can use to push back:

When everyone has access to the same AI tools, and everyone is optimising for speed, and everyone is accepting AI outputs without judgment - everyone produces the same mediocre result.

And that right there - that is a business problem. One that has metrics, competitive advantage and customer retention tied to it.

The Evidence

BCG research from 2025 shows it in real terms:

  • 74% of AI initiatives failed to deliver ROI. Not because of the technology, but because organisations didn’t invest in the human side. (That’s a design problem we can help solve.)
  • The 26% that succeeded? They slowed down first. They invested in half as many AI initiatives, but they invested deeply - up-skilling their people, redesigning their workflows, and building effective systems that produced quality. They protected the pause.
  • Companies prioritising transparency and human control see 6x higher revenue growth.
  • Phased rollouts have 63% higher user satisfaction and 41% lower failure rates compared to “all-in” approaches.

In 2024, when IBM’s Carbon Design System broke under the “generative variability” of AI, they didn’t just push through. They slowed down. They established new design principles and adaptive practices, ran rapid experiments, and systemised what worked. They respected the human metabolism of change.

And they succeeded where 74% fail.

The Core Truth

The pause is where the value lives.

  • Creativity lives in the pause. Because incubation requires cognitive recovery.
  • Ethics emerge in the pause. Because moral reasoning requires reflection, not reflex.
  • Trust is built in the pause. Because humans need time to calibrate their confidence in AI outputs.

The pause isn’t empty space. It’s where the human work happens. It’s where judgment, discernment, and meaning-making occur.

“AI slop” was Merriam-Webster’s 2025 word of the year. The generic, valueless output flooding every industry exists because there was no pause. No human judgment applied. No expertise brought to bear. No lived experience shaping the outcome.

Build AI Skills that protect the pause

$950.00

Human-Centred Trustworthy AI by Design

This is a new era of design.
It's no longer USER experience design
But Human+AI experience design ... Read more

The Invitation: Now Use It

You now have the language.

The business case for protecting the pause. Not corporate jargon. Not frameworks.

The words to name what’s been eating at you.

When your manager asks why you’re pushing back on “just a few more variations,” you don’t need to say “The Endless Iteration Loop is eroding my professional authority.”

You say: “We need to define what ‘done’ looks like before we start. Otherwise we’re just iterating based on preference, not strategy.”

When stakeholders wonder why you can’t just “speed things up with AI,” you don’t lecture them about The Great Acceleration.

You say: “The tool is fast. But good work still needs time to think. If we fill every minute with output, we lose the space where good ideas come from.”

When you feel your identity slipping, when you’re approving work you didn’t make and can’t fully explain... you don’t cite the Responsibility-Control Gap.

You say: “I’m accountable for this, so I need to understand how it works. I can’t approve what I can’t explain.”


This isn’t about memorising terms. It’s about permission.

Permission to say: “This pace isn’t sustainable.”

Permission to say: “I need to think before I iterate.”

Permission to say: “My judgment matters here.”

You’ve been searching for the language to defend the pause; the space where your value lives. Not because you’re slow. Not because you’re resistant to change. But because creativity, ethics, and trust don’t happen at machine speed.

They happen in the pause.


The organisations pushing you to move faster want you to believe this is inevitable. That if you don’t adopt AI at breakneck speed, you’ll be left behind.

But here’s what they’re not telling you: When everyone has the same tools, optimising for the same thing, accepting outputs without judgment - everyone produces the same mediocre shit.

Speed without judgment is just a race to the bottom.

Your competitive advantage isn’t velocity. It’s judgment. The ability to say “this is right” when faced with 50 options. The discernment to know when to stop. The wisdom to understand what the data can’t tell you.


You’re not a machine operator.

You’re a designer. And that means you define meaning, not just generate pixels.

So use the language. Protect the pause. Choose deliberately.

Because the future where humans operate at machine speed? That’s not inevitable.

It’s a design choice. And it’s a bad one.

We can choose differently.

What will you say tomorrow?

Want practical tools for reclaiming the pause?

Download the AI Burnout Intervention Toolkit.


Riley Coleman is an AI enablement strategist and founder of AI Flywheel, helping organisations shift from “how can AI solve this” to “how do we build an organisation ready for AI.”


How can this newsletter be of most value to you?

I want to make sure this newsletter is really valuable for you every week. To do that, I’d love your input on what you’d like to see included.
