New Jersey AG Sues Discord Over Alleged Misleading Child Safety Features

Discord, a popular gaming and messaging app, is facing a lawsuit from the New Jersey attorney general over allegations that the company misled consumers about child safety features on its platform. The lawsuit claims that Discord employed ambiguous safety settings to create a false sense of security for parents and children.

The complaint alleges that Discord’s age-verification process is flawed, allowing children under 13 to lie about their age and bypass the app’s minimum age requirement. It also claims that the company misrepresented its “Safe Direct Messaging” feature, which was supposed to automatically scan and delete explicit content but did not.

Instead, the complaint alleges, direct messages between friends were not scanned at all, even when the safety filter was enabled, leaving children exposed to harmful content such as child sexual abuse material and videos depicting violence or terror. As part of the lawsuit, the New Jersey attorney general is seeking unspecified civil penalties against Discord.

The complaint joins a series of lawsuits brought by state attorneys general across the country against social media companies over concerns about child safety and exploitation on their platforms.

Source: https://www.cnbc.com/2025/04/17/discord-sued-by-new-jersey-over-child-safety-features.html