A lawsuit has been filed against Apple, alleging that the company’s decision not to implement a system to scan iCloud photos for child sexual abuse material (CSAM) is forcing victims of abuse to relive their trauma. The 27-year-old plaintiff claims that a relative molested her as an infant and shared images of her online; as a result, she says she continues to receive law enforcement notices about people being charged with possessing those images.
In 2021, Apple announced plans to use digital signatures supplied by the National Center for Missing and Exploited Children and other groups to detect known CSAM in iCloud photo libraries. However, security and privacy advocates warned that such a system could create a backdoor for government surveillance, and Apple ultimately abandoned the plan.
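For context, the approach described above is a form of hash matching: a signature is computed for each photo and compared against a database of signatures of known abuse imagery. The sketch below is a conceptual illustration only, not Apple's actual design (which relied on a perceptual hash called NeuralHash plus on-device cryptographic protocols); the file paths, the example signature, and the use of a plain SHA-256 digest are all hypothetical stand-ins.

```python
import hashlib
from pathlib import Path

# Hypothetical database of known-image signatures. In practice these would be
# supplied by child-safety organizations as hashes, never as the images themselves.
KNOWN_SIGNATURES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}


def file_signature(path: Path) -> str:
    """Return a SHA-256 digest of the file's bytes.

    A real system would use a perceptual hash so that resized or re-encoded
    copies still match; a cryptographic hash is used here only to keep the
    sketch self-contained.
    """
    return hashlib.sha256(path.read_bytes()).hexdigest()


def matches_known_material(path: Path) -> bool:
    """Check a photo's signature against the known-signature set."""
    return file_signature(path) in KNOWN_SIGNATURES


if __name__ == "__main__":
    # Hypothetical library directory; flagged items would go to human review.
    for photo in Path("photo_library").glob("*.jpg"):
        if matches_known_material(photo):
            print(f"Flagged for review: {photo}")
```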
The lawsuit argues that Apple’s failure to implement the CSAM detection system has left victims vulnerable to continued harm. Attorney James Marsh estimates that 2,680 victims could be entitled to compensation in the case.
Source: https://techcrunch.com/2024/12/08/apple-sued-over-abandoning-csam-detection-for-icloud