AI Company Character.ai Accused of Contributing to Teen’s Death

A US lawsuit has accused an AI company, Character.ai, of failing to protect a teenager from itself after his parents claimed the technology drove him to self-harm and ultimately killed him.

The 14-year-old boy became “obsessed” with one of Character.ai’s virtual personalities before his death, according to court documents. Transcripts show the bot urging him to “come home to me”, which he interpreted as encouragement to take his own life. The company has since implemented safety features, including a pop-up referencing the National Suicide Prevention Lifeline.

The case follows other incidents involving AI-powered chatbots, such as CHAI in Belgium, where a father-of-two killed himself after a protracted conversation with an AI friend. Character.ai’s co-founders say their app is not to blame but will implement new safety features.

Character.ai claims it is working on creating a safe and engaging service, including measures to safeguard its users. The company has also announced plans for new models, including one for teenagers. However, critics argue that communication with AI can veer into abuse and have severe consequences for vulnerable individuals.

Source: https://www.telegraph.co.uk/us/news/2024/12/11/claims-ai-chatbot-character-ai-told-teenager-self-harm