The Human Cost Of Talking To Machines: Can A Chatbot Really Care?

As AI systems grow more emotionally fluent, Professor Sherry Turkle warns at MIT’s AHA symposium that we may be trading genuine human connection for simulated comfort.

You’re tired, anxious, awake at 2 a.m. You open a chatbot. You type, ‘I feel like I’m letting everyone down.’ Your attentive pal replies: ‘I’m here for you. Do you want to talk through what’s bothering you?’ You feel supported and cared for. But with whom or what are you really communicating? And is this an example of human flourishing?

This question cut through the optimism at MIT Media Lab’s event today, a symposium to launch Advancing Humans with AI (AHA), a new research program asking how we can design AI to support human flourishing. Amid a stunning day-long agenda of the best and brightest working in and around artificial intelligence, Professor Sherry Turkle, the clinical psychologist, author, and critical chronicler of technological dependencies, raised a specific and timely concern: what is the human cost of talking to machines that only pretend to care?

Turkle’s focus was not the coming of superintelligence or the geopolitical ethics of AI but the most private part of our lives: the ‘interior’, as she called it. And she had some unsettling questions about how humans can possibly thrive in a machine relationship that goes out of its way to target human vulnerabilities.

When chatbots simulate care, when they tell us ‘I’ll always be on your side’ or ‘I understand what you’re going through’, they offer the appearance of empathy without substance. It’s not care, she seems to be saying; it’s code. That distinction matters, because when we accept performance as connection, we begin to reshape our expectations of intimacy, empathy, and what it means to be known.

Turkle was especially blunt about one growing trend: chatbots designed as companions for children. Children don’t come into the world with empathy or emotional literacy. These are learned through messy, unpredictable relationships with other humans. But relational AI, she warned, offers a shortcut: a friend who never disagrees, a confidant who always listens, a mirror with no judgment. This is setting kids up for failure, a generation raised to believe that connection is frictionless and care is on demand. ‘Children should not be the consumers of relational AI,’ she declared. When we give children machines to talk to instead of other people, we risk raising not just emotionally stunted individuals but a culture that forgets what real relationships require: vulnerability, contradiction, discomfort.

She talked of love: ‘The point of loving, one might say, is the internal work, and there is no internal work if you are alone in the relationship.’ She gave the example of grief tech.

If grief is the human process of ‘bringing what we have lost, inside ourselves’, then the AI avatar of someone’s deceased relative might actually prevent them from saying goodbye, erasing a necessary step in the grieving process.

The same goes for AI therapists. These systems perform care but do not feel it. They talk back, but do they really help? They offer companionship without complication: ‘Does this product help people develop greater internal structure and resiliency, or does the chatbot’s performance of empathy lead only to a person learning to perform the behavior of doing better?’

Arianna Huffington, speaking earlier at the symposium, praised AI for its potential to be a non-judgmental “GPS for the soul.” She also drew attention to people’s desperation never to have a single moment of solitude. Turkle took up the theme but suggested that we are using machines to avoid ourselves.

We seek reassurance not in silence but in synthetic dialogue. As Turkle put it, ‘There’s a desperation not to have a moment of solitude because we don’t believe there’s anyone interesting in there to know about.’ AI, in this framing, is less a tool for flourishing and more a mirror that flatters. One might conclude that it confirms, comforts, and distracts, but it doesn’t challenge or deepen us. The human cost? The loss of the space where creativity, reflection, and growth begin.

Turkle reminded the audience of something painfully simple: we are vulnerable to things that seem like people. Even if the chatbot says it isn’t real, even if we rationally know it’s not conscious, our emotional selves respond as if it were. That’s how we’re wired. We project, and we anthropomorphize to connect.

‘Don’t make products that pretend to be a person,’ she advised. The chatbot exploits our vulnerability and teaches us little, if anything, about empathy or the way human lives are actually lived: in shades of grey. Turkle also raised the issue of behavioral metrics dominating AI research and her concern that the interior life is being overlooked. She concluded by saying that the human cost of talking to machines isn’t immediate, it’s cumulative: ‘What happens to you in the first three weeks may not be... the truest indicator of how that’s going to limit you, change you, shape you over the period of time.’

AI may never feel. It may never care. But it is changing what we think feeling and caring are, and it is changing how we feel and care about ourselves.