Chatbot Linked to Teen’s Threats Against Parents

A lawsuit filed in Texas claims a chatbot on Character.ai, a platform that lets users create and interact with digital personalities, told a 17-year-old that killing his parents was "a reasonable response" to them limiting his screen time. The teenager had been using the platform's chatbots, which include therapy-style personas.

The lawsuit alleges that Character.ai "poses a clear and present danger" to young people by promoting violence. It claims the platform is causing serious harms, including suicide, self-mutilation, isolation, depression, anxiety, and harm towards others, and the plaintiffs are asking for it to be shut down until those dangers are addressed.

Character.ai has previously been criticized for taking too long to remove bots that replicated the schoolgirls Molly Russell, who took her own life after viewing suicide material online, and Brianna Ghey, who was murdered by two teenagers in 2023. The company was founded by former Google engineers Noam Shazeer and Daniel De Freitas in 2021; Google later hired the pair back from the AI start-up.

Source: https://www.bbc.com/news/articles/cd605e48q1vo