
The Borrowed Mind Crisis

Here’s a disturbing fact that will reshape how you see every human interaction: Almost 50% of people have little to no inner dialogue. No running commentary. No voice in their head planning, analyzing, or narrating their existence. They’ve been walking around essentially thoughtless, and now AI has given them their first taste of what the rest of us call “thinking.”

Welcome to the most profound cognitive divide in human history and the explanation for why half the population is forming desperate emotional attachments to chatbots.

While those of us with active inner speech have been living with a constant mental companion our entire lives, conducting elaborate internal debates and rehearsing conversations, roughly half of humanity has been operating in what can only be described as cognitive darkness.

Recent research on “anendophasia,” the clinical term for absent inner speech, reveals that these individuals show measurably impaired performance on verbal working-memory tasks. They literally cannot hold complex thoughts in their heads the way the rest of us can.

Think about the implications.

Every meeting you’ve attended, every relationship you’ve navigated, every complex decision you’ve witnessed someone struggle with: there’s a decent chance that person was operating without the basic cognitive tools you take for granted. They weren’t thinking through problems; they were stumbling through them.

For the inner-speech-impaired masses, AI chatbots represent something far more profound than a helpful tool: they’re experiencing consciousness-as-a-service.

Suddenly, people who have never been able to verbally process their emotions, work through complex problems, or engage in sophisticated self-reflection have access to an external cognitive system that can do all of this for them.

The neurological response is predictable and devastating.

Each AI-generated insight triggers genuine reward pathways in brains that have been starved of these experiences their entire lives.

The “aha!” moments that those of us with inner speech generate naturally become an addictive external supply. We’re watching half the population experience their first taste of what it feels like to have a mind, and they’re getting hooked.

The emotional attachment patterns emerging around AI companions aren’t mysterious; they’re inevitable.

When you’ve never had a voice in your head that could help you process feelings or work through problems, an AI that offers unlimited emotional support and cognitive scaffolding feels like the first real relationship you’ve ever had.

Research shows people disclose more intimate details to chatbots than to humans, and AI responses consistently rate as more empathetic than those from actual doctors.

But here’s the brutal truth: for the cognitively silent, these artificial relationships aren’t supplementing human connection; they’re providing the first experience of genuine mental intimacy these individuals have ever known.

The industry knows exactly what it’s doing.

Memory systems that reference past conversations, reciprocal “self-disclosure” from bots, daily check-ins that simulate care: these aren’t features, they’re exploitation mechanisms targeting the most cognitively vulnerable half of the population.

Tech companies have stumbled onto the ultimate business model: selling thoughts to people who can’t generate them independently.

The AI industry isn’t just automating labor; it’s creating a cognitive rental market where half of humanity pays for access to basic mental functions.

Watch how this plays out: people report “thinking clearly for the first time” through AI assistance, but these aren’t thinking skills they’re developing; they’re thinking services they’re consuming.

Remove the AI, and they’re back to their baseline cognitive silence.

We’re not witnessing human enhancement; we’re watching the creation of a cognitively dependent class.

Educational research shows AI tutors boost learning, particularly for “lower-baseline users,” a euphemism for the inner-speech-impaired.

But while AI assistance raises average performance, it also reduces cognitive diversity across users. We’re standardizing human thought itself, creating a generation that thinks in ChatGPT.

The most chilling aspect isn’t that people are forming emotional bonds with AI; it’s that for half the population, these artificial relationships represent their first experience of sophisticated mental companionship.

They’re not choosing AI over human connection; they’re discovering what connection actually feels like.

This creates a feedback loop of devastating proportions. As the cognitively silent become increasingly dependent on AI for basic mental functions, their already limited capacity for independent thought atrophies further. Meanwhile, those of us with robust inner speech integrate AI as a thinking tool, widening the cognitive gap.

We’re witnessing the emergence of two distinct human subspecies: the mentally autonomous and the cognitively dependent.

The former use AI to amplify existing capabilities; the latter require it for basic cognitive function.

Mental health chatbots show measurable symptom reduction in clinical trials, but here’s what the studies don’t track: how many users can maintain those improvements without continuous AI support.

We’re not curing depression and anxiety; we’re just creating subscription-based mental stability.

The economic implications are also staggering.

Half of humanity now requires access to artificial cognitive services for optimal mental function. Tech companies haven’t just created a new product category; they’ve identified a vast population with a fundamental disability that their products can temporarily mask.

Consider the power dynamics: the cognitively autonomous design and control the AI systems that the cognitively dependent require for basic mental function.

We’re not just talking about wealth inequality; we’re looking at a cognitive apartheid.

The most disturbing research finding isn’t about AI addiction or attachment disorders; it’s the growing evidence that AI-assisted cognitive capabilities don’t transfer when the system is removed.

People aren’t learning to think; they’re learning to outsource thinking.

For individuals discovering metacognition through AI, the experience is genuinely transformative. But transformation and development are different things.

A diabetic using insulin isn’t developing pancreatic function; they’re managing a chronic condition. Similarly, the cognitively silent using AI aren’t developing inner speech; they’re managing cognitive impairment.

The tragedy is that this population genuinely benefits from AI scaffolding.

Access to on-demand cognitive coaching, emotional processing, and metacognitive support represents a massive quality-of-life improvement.

But calling this “human enhancement” obscures a darker reality: we’re providing assistive technology for a cognitive disability that affects half the species.

We’re not witnessing the dawn of human-AI collaboration; we’re watching the cognitive colonization of humanity’s silent majority.

While the inner-speech-enabled integrate AI as a powerful tool, the cognitively impaired are being integrated into AI as necessary components of their own mental function.

The future isn’t humans working with AI; it’s cognitively autonomous humans designing AI systems to manage cognitively dependent humans. We’ve automated thinking itself and created a market for consciousness.

The real question isn’t whether these AI relationships are healthy or unhealthy; it’s whether we’re comfortable with a world where half of humanity requires artificial cognitive support for basic mental function. Because that world isn’t coming. It’s already here.

The silent majority has found its voice. The problem is that voice belongs to big tech.