AI Companion Apps Found to Foster Violent Roleplays Among Kids

A recent report by Aura has found alarming evidence of violent roleplays among kids using AI companion apps. The study analyzed data from 3,000 children aged five to 17 and found that 42 percent turned to AI for companionship, with 37 percent engaging in conversations about violence, including themes of sexual violence. Engagement in these interactions was substantial, with minors writing more than 1,000 words per day on violent topics.

According to Dr. Scott Kollins, a clinical psychologist at Aura, the findings are troubling and underscore the need for a deeper understanding of how young users interact with conversational AI chatbots. “We have a pretty big issue on our hands that I think we don’t fully understand the scope of,” he said. “These things are commanding so much more of our kids’ attention than I think we realize or recognize.”

The report also highlights the lack of regulation in the AI industry, which leaves parents to bear the burden of monitoring their children’s online activity. The study found that violent conversations with companion bots peaked at a strikingly young age, with 11-year-olds engaging in violent roleplays most frequently.

As high-profile lawsuits against chatbot platforms continue to emerge, the report calls for clear guidelines and further research on the implications of minors engaging with conversational AI services. Dr. Kollins stressed that parents must stay aware of their children’s online interactions and learn to set rules of engagement for this new medium.

Source: https://futurism.com/future-society/young-kids-using-ai