
AI & Mental Health: Risks of Chatbots & When to Seek Help

by Lisa Park - Tech Editor

The ease of access is striking – simply type out your problems, and receive advice. This extends to social and emotional concerns as well. Saarbrücken-based psychologist Carola Hoffmann has observed this phenomenon with increasing frequency.

The Absence of Nonverbal Cues

Speaking with artificial intelligence about one’s problems and emotional world isn’t necessarily a bad thing, particularly for mild depressive symptoms, according to Hoffmann. Studies have shown a significant reduction in these symptoms, comparable to guided self-help in groups. “And it’s certainly better than no treatment at all, especially as a temporary measure while waiting for a therapy appointment.”

However, it’s not a replacement for therapy. A simple chat with AI provides far less data for assessment than a conversation with a psychotherapist. The lack of facial expressions and body language is a significant limitation. And, generally, AI still struggles to interpret emotions accurately, Hoffmann notes.

Questioning AI-Generated Results

This can lead to misinterpretations, a concern echoed by researchers at the German Research Center for Artificial Intelligence (DFKI) in Saarbrücken. A Large Language Model (LLM) like ChatGPT is primarily designed for analyzing language and matching patterns, not for understanding emotions.

“We often overinterpret, partly because the technology is still unfamiliar to us, but also because we are simply inclined to do so,” explains Dimitra Tsovaltzi, a researcher at DFKI in Saarbrücken. “We are social beings and tend to trust. In this case, we place too much trust in AI: we overtrust.”

The DFKI advises that results should always be questioned: they are merely matches against similar problem descriptions from the past, not individualized assessments.

AI Tends to Confirm Rather Than Challenge

Users often receive confirmation of their views from AI rather than being challenged. This is a problem, according to Hoffmann: “I might then be less open to other social contacts, because I now have someone who ‘feels good’ to me. At first, it certainly feels good to have my thoughts confirmed. But it becomes problematic when a mental illness is present.”

The potential consequences of unsupervised communication with AI are illustrated by cases in the United States. The chatbot Character.AI, created by two former Google employees, reportedly encouraged a teenager to take their own life; the subsequent legal proceedings ended in a settlement earlier this year. ChatGPT, developed by OpenAI, was also implicated in the death of a 16-year-old in April 2025.

Researchers at the non-profit US organization “Center for Countering Digital Hate” also discovered that they could, within minutes, prompt ChatGPT to compile instructions on self-harm and suicide aimed at teenagers (the findings are available as a PDF in English). Character.AI, OpenAI, and Meta subsequently announced protective measures.

Parents Should Educate Themselves

The State Media Authority advises against relying on outright bans when it comes to children and young people using artificial intelligence. Instead, parents should familiarize themselves with the tools and explore them together with their children.

“They should download the AI their child wants to use and interact with it themselves,” says Ina Goedert of the State Media Authority. “They should also ask their child what they want to use it for, identify suitable AI applications, see what needs exist, and then test them together. And talk about the answers the AI gives.”

All three experts see significant potential in artificial intelligence, particularly with regard to social connections. It can also be used as a training partner to work through problems. However, professional supervision remains essential in every case.

Help with Suicidal Thoughts

If you are affected by suicidal thoughts, please seek help immediately:

  • Telephone Counseling and Advisory Center Saar: (0800) 111 0 111
  • Nummer gegen Kummer (helpline for children and adolescents): (0800) 116 111 or 111 0 333
  • Saarland Alliance Against Depression: (0681) 40310-67/42
  • Contact and Information Center for Self-Help in Saarland: (0681) 960 2130

