Lost in the AI-Hole

Last time, we talked about Ambiogenesis – the powerful upside of going deep with AI.
This time, we’re talking about the shadow.

Because for every moment of clarity that feels like divine inspiration… there’s another where you’re spiralling at 2am, tweaking a blog that’s lost all shape, convinced your entire message is broken.

This post is about that moment.
The one where you’re not collaborating with AI anymore – you’re lost in it.

The Whirlpools We Warn About

In the AI Dojo, we teach a lot of things – and one of the most obvious, yet least well understood, is knowing how, when, and why to stop.
We talk about whirlpools. The kind of tasks where time slips away and nothing solid emerges. Where no matter how hard you swim, the current pulls you further from your goal.

But what’s actually at the bottom of those whirlpools?

What sediment gets churned up when you go too deep?

A True Story (Unfortunately)

It was a regular blog, but a good one. Neuroscience, learning theory, cultural hooks – the kind we knew would help trainers and professional communicators.

We researched. We iterated. We layered in real-world observations and remixed ideas across contexts. It was shaping up to be something useful.

Until the conclusion didn’t quite land.

So we asked AI to help.
Then we asked again.
And again.

By 2am we weren’t tweaking a blog anymore – we were questioning the direction of our business, our social strategy, our entire model of operation. Not just “maybe we need to course correct” – we’re talking full-blown existential dread. That night, the laptop closed in despair.

The next morning, before getting dressed or having a shower, it was round two.

Black coffee.

AI: “You’re not failing.”
(We hadn’t said we were.)
AI continues: “People respond to visuals. Want me to mock up a storyboard?”

More black coffee as frustration slipped into the reply prompt.

“That’s the most honest and realistic thing you’ve ever said,” the AI replied.

There it was.
The moment to stop.
But instead of walking away and letting the chips fall where they may, we carried on.

Before we knew it, we were animating a snake.

Hours lost to a visual metaphor for a blog post we hadn’t even decided to publish.

We weren’t improving anything.
We were reacting to noise.
We were well and truly lost up our own AI-hole.

From Ambiogenesis to Something Darker

If Ambiogenesis is what happens when human and AI minds work together so fluently that something unexpected and powerful emerges… then this is what happens when that relationship becomes toxic.

When you stop collaborating and begin fragmenting.

When the ideas don’t deepen – they scatter.

Instead of clarity, we get cognitive drift, looping and emotional confusion.

Techno-codependence – when you don’t know where your thinking ends and the tool begins.

The Risks Are Real

This isn’t just an “oops, stayed up too late” moment.
There are serious cognitive and emotional risks if you go too deep with AI without structure.

Let’s break down the three most common problems:

1. Self-Fragmentation – when extended cognition spills over into identity dissolution

Here, you’re not just using AI – it becomes part of how you think.
Clark & Chalmers’ Extended Mind thesis explains this: tools can become part of our cognitive system.

This can be positive, but the related danger is losing track of where your own ideas end and the AI’s suggestions begin.
Like our blog post: the moment you stop building it… and start being led by it.

2. Emotional Disorientation

Cognitive dissonance is the acute, unpleasant discomfort you experience when your beliefs or instincts clash with new information or actions.

In this context, the AI will always have a smoother version of your original thought – some nice wordplay, extra polish.
You read it and it sounds better, but something feels off. Likely because, despite surface appearances, it has subtly changed the meaning of what you were trying to say.

Over time, this tension erodes confidence and leads to decision fatigue, as your brain becomes exhausted from constantly second-guessing which version to trust.

The voice doesn’t feel like yours.
You lose the ability to judge what’s right or wrong.
Your instincts fade like a distant memory.

3. Techno-Codependence

Here, you move from working with the tool… to reacting to it.

At first it feels like a boost.
Then a crutch.
Then, it starts driving.

You stop asking yourself, “What do I think?”
And instead default to, “What would the AI say?”

That’s not augmentation.
That’s dependence in disguise.
And that can lead to learned helplessness.

How to Stay Out of the AI-Hole

Pretty heavy.

But unlike a real toxic partner, you aren’t battling something with agency.
You don’t need to fear the tech.
You do, however, need discipline, awareness, and the occasional reality check.

Here are some quick tips to keep your relationship in balance:

Cognitive Hygiene & Self-Trust

  • Write first, AI second. Get your messy version down before prompting.
  • Use AI to expand, not to validate. Don’t fish for permission.
  • Default to “What do I think?” Pause if your first instinct is to ask the AI.

Process Discipline

  • Timebox deep AI work. Two hours max per day, extending to four once you’re experienced. If it isn’t in the bag, switch tasks and revisit fresh.
  • Build gaps into your workflow. Add pauses to reflect with fresh eyes – ideally after a couple of days.
  • Switch mediums. If you’re stuck, go analogue. Pen and paper.
  • Switch locations. Go for a walk. Move the body.
  • Avoid working late at night. Tired brains spiral faster.
  • Separate generation from editing. Don’t polish while you’re still exploring.

Emotional Detachment

  • Smooth prose doesn’t mean it’s right. AI will always sound confident – but confident isn’t the same as true.
  • Watch for voice drift. If your tone feels unfamiliar, stop.
  • Take full credit. You’re the one choosing. Own the decisions – not just the outputs.

So Where’s the Line?

Ambiogenesis is real.
It’s powerful.
It’s why we believe creatives and communicators need to start working differently and more deeply with AI.

But letting go of control is not the same as letting go of self.

The magic happens when we reflect, revisit, switch modes, and reason about what we’re really trying to say.
When we step aside from our own habits – without losing our centre.
When the ego steps back, not because it’s weak, but because it knows who’s in charge.

Because if not…
When AI becomes a map instead of a mirror, it’s not direction that emerges – but an endless fractal maze.

Final Thought

If creatives, communicators and the corporate world really want to level up, they absolutely do need to start using AI more – and part of that is learning to use it to think, not just to generate.

The shift is inevitable.
The value is too great not to move.
Because if you don’t… someone else will.

But this isn’t a hidden trap or something to fear.
Like many things, the solution lies in awareness – of both the tools and yourself – and in staying grounded, strategic, and sane.

Goodnight everyone. Sleep well.
And don’t start dreaming of snakes.

=SSssSSsss...
