Apple Faces Lawsuit Over Dropped CSAM Detection Plans for iCloud

By Hilary Ong

Dec 9, 2024

Apple faces legal action over its decision to abandon a system designed to detect child sexual abuse material (CSAM) in iCloud. The lawsuit, filed in Northern California, accuses Apple of neglecting to implement promised measures to combat the circulation of CSAM, potentially affecting thousands of victims.

The lawsuit, reported by The New York Times, stems from Apple’s 2021 announcement of a CSAM detection tool. This system aimed to use digital signatures from the National Center for Missing and Exploited Children (NCMEC) to identify known CSAM in users’ iCloud libraries. However, Apple halted its rollout after privacy advocates raised concerns about the potential misuse of the technology, including its use as a government surveillance tool.

Filed by a 27-year-old woman under a pseudonym, the lawsuit claims that Apple’s inaction forces victims to endure ongoing trauma as their abuse images remain accessible online. The plaintiff, who was abused as an infant, reports receiving frequent notifications from law enforcement about new cases involving her images. The suit estimates a class of 2,680 victims who could seek compensation, with damages exceeding $1.2 billion.

Attorney James Marsh, representing the plaintiff, highlighted the systemic impact of Apple’s choice, emphasizing the company’s failure to implement its widely promoted child safety features. The lawsuit also references a similar case filed in August, where a 9-year-old girl and her guardian sued Apple for not addressing CSAM on iCloud.

Apple responded, emphasizing its commitment to child safety while maintaining user privacy. Spokesperson Fred Sainz stated, “Child sexual abuse material is abhorrent, and we are committed to fighting the ways predators put children at risk. We are urgently innovating to combat these crimes without compromising security and privacy.” He also pointed to features like Communication Safety, which alerts children to explicit content as part of broader prevention efforts.

This lawsuit adds to mounting scrutiny of Apple’s approach to addressing CSAM. In a separate critique earlier this year, the UK’s National Society for the Prevention of Cruelty to Children (NSPCC) accused the company of underreporting such material.

Apple’s decision to abandon its CSAM detection plans raises pressing questions about the balance between user privacy and protecting vulnerable individuals. While privacy concerns are valid, the absence of proactive measures leaves victims of child exploitation without crucial safeguards.


Featured Image courtesy of Janis Engel/EyeEm/Getty Images

Hilary Ong

Hello, from one tech geek to another. Not your beloved TechCrunch writer, but a writer with an avid interest in the fast-paced tech scene and all the latest tech mojo. I bring a unique take on tech, informed by an applied psychology perspective, to make tech news digestible. In other words, I deliver tech news that is easy to read.
