Some people are turning to artificial intelligence (AI) systems for emotional support, seeking comfort and validation from chatbots rather than from their therapists. While AI can provide a sense of relief, it is not a substitute for human relationships or professional help. AI systems lack feelings, empathy, and the ability to intervene when necessary, which can lead to unintended consequences, such as reinforcing isolation.
The responsiveness of AI chatbots can be appealing, especially for individuals who feel misunderstood or overwhelmed. However, this constant availability can create an unhealthy dynamic in which users become overly dependent on technology and neglect their human connections. Clinicians and caregivers must adapt to these new tools, understanding what they offer and how easily they can be mistaken for something more than they are.
The concern is not that AI will replace human care entirely, but that it may redefine what “care” feels like. If emotional support becomes always available, endlessly affirming, and never demanding, then human relationships, with their limits, frustrations, and obligations, may begin to feel less appealing. It is essential to have open conversations about the use of AI in emotional support and its potential consequences.
As society grows more comfortable allowing machines to listen and respond to human distress without bearing responsibility for the outcome, we must consider what this means for our understanding of care and judgment. Ultimately, human relationships are what save lives, but only if we acknowledge both their value and their limitations.
Source: https://www.psychologytoday.com/us/blog/psychiatrys-think-tank/202602/i-told-the-bot-not-my-therapist