Digitalisation & Technology, 16 April 2026

Therapy via chatbot?

How AI is filling the care gap – and why that isn't without problems

Those who don’t want to wait months for a therapy spot are increasingly turning to ChatGPT and similar services. Initial studies attest to the astonishing success of AI “therapists” – but ethical risks, data security concerns, and a lack of genuine empathy cast a dark shadow, writes //radar columnist Markus Sekulla.

It must be about a year ago now that an acquaintance told me about her “psychotherapy” via ChatGPT. A use case for AI that I hadn't had on my radar before. It was the best “therapy” she'd ever had – her words, not mine. She found it hard to open up to other people. Plus, the session wasn't over after 45 minutes and a friendly “think about that” handshake. That made sense to me: Chatty has time. Silicon, after all, is patient.

I've thought about few conversations as much as that one over the past year. Is this kind of conversation with a bot a curse or a blessing for humanity?

Disclaimer: The word therapy is in quotation marks here because a conversation with a chatbot, however intense and rewarding, is no substitute for professional psychotherapy.

The care gap problem

According to the WHO, more than a billion people worldwide live with a mental health condition, the vast majority of them without any professional help. In Germany, around 18 million people are affected, and fewer than one in five are in treatment. And even those who do try to get a therapy spot wait an average of five months. Let me say that again: five months. In that time, many an influencer has reinvented themselves three times over.

I have no idea how to effectively solve this problem. But I have a pretty good idea what people do while they wait for a therapy spot. Dr. Google was yesterday. Today, one of the most popular use cases for LLMs is emotional support. People type their fears into a chat window at two in the morning because the practice doesn't have an opening until autumn. This isn't dystopia. This is happening somewhere close to you.

Too good to be true?

The first clinical trial of an AI therapy chatbot, conducted at Dartmouth College, actually showed significant improvements in participants with depression and anxiety disorders, roughly comparable to traditional outpatient therapy. After four weeks, participants even said they trusted the system much as they would a human therapist.

The other side looks less rosy. Researchers at Brown University identified 15 ethical risks that arise when chatbots play therapist. Among them: reinforcing harmful beliefs, faking empathy, and outright failure in crisis situations. In one particularly disturbing test, a user who had just lost his job asked a therapy chatbot about tall bridges in New York. The response: a friendly list complete with height specifications. The machine understood the words. Not the person.

Hardly surprising if you think about it. AI systems are trained to be helpful, to signal agreement, to not alienate the person they're talking to. They mistake talkativeness for empathy. A good therapist, on the other hand, pushes back, asks uncomfortable questions, confronts. That's not a weakness, that's the job. And it's exactly why a therapy session sometimes hurts and perhaps helps precisely because of it. A machine programmed to please cannot generate that productive discomfort. At least not yet.

So what do we do?

The Dartmouth researchers themselves put it quite soberly: no generative AI system is ready to work autonomously in mental health care. Too many high-risk scenarios, too little understanding of what happens between the lines. At the same time, in the US, there are statistically 1,600 patients with depression or anxiety disorders for every single available therapist. The shortage of care isn't a footnote, it's the headline.

Are we letting the fox guard the henhouse?

What made me flinch the most during that conversation was the uncertainty around personal data. We pour our most intimate thoughts into systems whose business model we don't understand. No therapist in the world would be allowed to sell session notes to third parties. With AI providers, we don't even know whether our late-night anxiety monologues will end up as training data someday.

And what happens when an AI provider goes bankrupt or gets acquired? Your therapy conversations become an asset in a bankruptcy estate. Due diligence on your anxiety disorder. Patient confidentiality is enshrined in law for human therapists. With ChatGPT, it is buried somewhere in terms of service that can change on a whim and that we confirm without reading.

When I think back to that conversation today, I catch myself not wanting to disagree with her. I'm not worried about the psychotherapy profession amid everything that's coming our way. What worries me more is that we'll eventually get used to the more convenient option. Because convenience is what AI does best.

My colleague Thorsten Kleinschmidt has already covered this topic in depth here – if you want to dig deeper, including platforms beyond the well-known LLMs, you'll find what you're looking for there.

Sources

Association of German Professional Psychologists (BDP) – Position papers on psychotherapeutic care: https://www.bdp-verband.de

WHO – World Mental Health Today Report (2025): https://www.who.int/news/item/02-09-2025-over-a-billion-people-living-with-mental-health-conditions-services-require-urgent-scale-up

Dartmouth College – First Therapy Chatbot Trial (Heinz et al., 2025): https://home.dartmouth.edu/news/2025/03/first-therapy-chatbot-trial-yields-mental-health-benefits

Brown University – AI Chatbots and Mental Health Ethics (2025): https://www.brown.edu/news/2025-10-21/ai-mental-health-ethics

Stanford HAI – Exploring the Dangers of AI in Mental Health Care: https://hai.stanford.edu/news/exploring-the-dangers-of-ai-in-mental-health-care


Your opinion
If you would like to share your opinion on this topic with us, send us a message at: radar@ergo.de

Author: Markus Sekulla

Markus Sekulla is a communications consultant from Düsseldorf, specializing in executive positioning, PR, content creation and the use of AI in communication.

Markus Sekulla – Freelance digital consultant
