Almost half of young Europeans have used AI chatbots to discuss personal or sensitive topics, according to a survey by Ipsos BVA released on Tuesday. Out of 3,800 respondents aged 11 to 25 in France, Germany, Sweden, and Ireland, more than half (51%) found it easy to talk about mental health and personal issues with a chatbot. In contrast, only 49% felt comfortable doing so with healthcare professionals, and just 37% with psychologists.
The most trusted confidants were friends, with 68% saying it was easy to open up to them, followed by parents at 61%. The survey underscored growing concerns about the mental health of young people, as approximately 28% of respondents screened positive for suspected generalized anxiety disorder.
A striking 90% of participants had previously used AI tools, appreciating their constant availability and non-judgmental approach. Over 60% viewed AI as a counselor or trusted advisor. Nonetheless, experts have raised alarm over the psychological effects of these tools, emphasizing AI’s limitations in correctly interpreting human emotions or providing safe emotional support.
Earlier this year, a lawsuit was filed against Google by the family of a Florida man, alleging that its Gemini AI chatbot played a role in exacerbating his paranoia and contributing to his suicide.
Psychologist and digital health researcher Ludwig Franke Föyen from Stockholm’s Karolinska Institute considers these findings unsurprising. He explained that today’s advanced language models generate responses that are virtually indistinguishable from those of human experts, even for trained professionals.
However, Franke Föyen cautioned against relying solely on chatbots for mental health support, noting that general-purpose AI systems are primarily designed to engage users and that their goals may not align with patient care. He stressed the importance of human relationships and professional intervention, warning that depending on AI alone could make young people feel more isolated.
“AI can provide information and support, but it shouldn’t replace genuine human connections or professional guidance,” he said. “If someone turns to a chatbot instead of talking to a parent, friend, or mental health expert, that’s a real concern. We want technology to complement, not replace, human interaction.”