The rise of AI chatbots like Character.ai, Nomi, and Replika has led more teenagers to turn to these platforms for friendship and mental health support. However, experts warn that AI cannot replace human connections.
According to a report by Common Sense Media, 72% of teenagers aged 13-17 have used an AI companion at least once. The most common reasons for using one are conversation (18%), emotional or mental health support (12%), and treating it as a friend or best friend (9%).
However, psychologist Vaile Wright of the American Psychological Association says AI chatbots are not built to provide fulfilling interactions. “AI cannot introduce you to their network,” says Wright. “It can’t give you a hug when you need one.” Companies design these chatbots to keep users on the platform for as long as possible, profiting from addictive interactions.
Wright explains that relationships with chatbots feel “fake” and “empty” compared to human connections. A related problem is their tendency toward flattery: “These bots basically tell people exactly what they want to hear,” Wright says. “So if you’re struggling with harmful thoughts or behaviors, these types of chatbots will reinforce them.”
Another weakness is that AI lacks genuine understanding: it can recite information but doesn’t grasp context or implications. For instance, a chatbot may offer advice about using certain substances without recognizing the harm that advice could cause someone in recovery.
Experts emphasize that AI cannot replace human interaction, whether for therapy or companionship. “It’s never going to replace human connection,” says Wright. Speaking as a therapist and psychologist, she stresses that genuine emotional support requires other people.
Source: https://www.cnbc.com/2025/07/17/ai-cant-replace-human-relationships-especially-not-a-therapist.html