https://techcrunch.com/2024/12/08/apple-sued-over-abandoning-csam-detection-for-icloud/
Apple sued over abandoning CSAM detection for iCloud
Anthony Ha
10:26 AM PST · December 8, 2024
*** begin quote ***
Apple is being sued over its decision not to implement a system that would have scanned iCloud photos for child sexual abuse material (CSAM).
The lawsuit argues that by not doing more to prevent the spread of this material, it’s forcing victims to relive their trauma, according to The New York Times. The suit describes Apple as announcing “a widely touted improved design aimed at protecting children,” then failing to “implement those designs or take any measures to detect and limit” this material.
Apple first announced the system in 2021, explaining that it would use digital signatures from the National Center for Missing and Exploited Children and other groups to detect known CSAM content in users’ iCloud libraries. However, it appeared to abandon those plans after security and privacy advocates suggested they could create a backdoor for government surveillance.
*** end quote ***
In this case, I think the risk of a “backdoor for government surveillance” is outweighed by the prevention of child abuse. IMHO.
I’d be pushing the tech companies hard in the war against child exploitation.
Maybe even their devices (e.g., iPhones, Android phones, tablets) should use digital signatures from the National Center for Missing and Exploited Children to flag known material on-device.
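To make that concrete, here is a minimal Swift sketch of matching an image against a list of known digests. The digest value, file path, and the exact-hash (SHA-256) approach are illustrative assumptions on my part; Apple’s 2021 proposal described a perceptual hash (NeuralHash) with private set intersection, which this simple comparison does not reproduce.

```swift
import CryptoKit
import Foundation

// Hypothetical digest list standing in for signatures supplied by NCMEC
// and similar groups; real systems use perceptual hashes, not plain SHA-256.
let knownDigests: Set<String> = [
    "5e884898da28047151d0e56f8dc6292773603d0d6aabbdd62a11ef721d1542d8"
]

/// Returns true if the image bytes hash to a digest on the known list.
func matchesKnownDigest(_ imageData: Data) -> Bool {
    let digest = SHA256.hash(data: imageData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownDigests.contains(hex)
}

// Example: check an image file before upload (path is illustrative).
if let data = FileManager.default.contents(atPath: "photo.jpg") {
    print(matchesKnownDigest(data) ? "match" : "no match")
}
```

The point of the sketch is simply that matching against a fixed list of known signatures is a narrow check, which is different in kind from general scanning of a user’s content.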
—30—