
Apple says it won’t expand controversial CSAM technology

Apple has tried to deflect criticism of its controversial CSAM protection system, but in doing so has illustrated just what’s at stake.

The big conversation

Apple last week announced it would introduce a collection of child protection measures inside iOS 15, iPadOS 15 and macOS Monterey when the operating systems ship this fall.

Among other protections, the on-device system scans your iCloud Photos library for evidence of illegal collections of Child Sexual Abuse Material (CSAM). It is, of course, completely appropriate to protect children, but privacy advocates remain concerned about the potential for Apple’s system to become full-fledged surveillance.

In an attempt to mitigate criticism, Apple has published fresh information in which it tries to explain a little more about how the tech works. As explained in this Apple white paper, the tech turns images on your device into a numeric hash that can be compared to a database of known CSAM images as you upload them to iCloud Photos.
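In rough terms, the matching step works something like the sketch below. This is an illustrative sketch only, not Apple's code: the type names are placeholders, and the hash function is a stand-in for the perceptual hash the white paper describes, which maps visually identical images to the same value.

import Foundation

// Illustrative sketch only; names and logic are placeholders, not Apple's API.
typealias ImageHash = Data

// Stand-in for the on-device perceptual hash. A real perceptual hash is
// derived from image features, not from raw bytes as done here.
func perceptualHash(of imageData: Data) -> ImageHash {
    return Data(imageData.prefix(32))
}

// The on-device check at upload time: an image is only flagged if its hash
// matches an entry in the database of known CSAM hashes.
func matchesKnownCSAM(_ imageData: Data, knownHashes: Set<ImageHash>) -> Bool {
    return knownHashes.contains(perceptualHash(of: imageData))
}

The key point is that the comparison happens against hashes of already known images; the system is not making a judgment about what a new photo depicts.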

Making a hash of it

While the analysis of each image takes place on the device using Apple's hash technology, not every image is flagged – only those whose hashes match known CSAM. Apple argues this is actually an improvement, in that the company at no point scans a user's entire library in the cloud.

“Existing techniques as implemented by other companies scan all user photos stored in the cloud. This creates privacy risk for all users,” the company’s new FAQ says. “CSAM detection in iCloud Photos provides significant privacy benefits over those techniques by preventing Apple from learning about photos unless they both match to known CSAM images and are included in an iCloud Photos account that includes a collection of known CSAM.”
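That "collection" condition amounts to a threshold: an account is only surfaced once it accumulates a certain number of matches. A hedged sketch of the idea follows, with illustrative names and a caller-supplied threshold, since the FAQ does not specify one, and with the safety-voucher cryptography Apple describes simplified away entirely.

// Illustrative sketch of the threshold idea from Apple's FAQ, not Apple's code.
struct AccountMatchState {
    let threshold: Int                  // not specified in the FAQ; supplied by the caller
    private(set) var matchCount = 0

    // Called for each iCloud Photos upload; only matches are counted.
    mutating func recordUpload(matchedKnownCSAM: Bool) {
        if matchedKnownCSAM { matchCount += 1 }
    }

    // Per the FAQ, Apple learns nothing about the photos unless the account
    // holds a collection of matches, i.e. the count crosses the threshold.
    var flaggedForReview: Bool { matchCount >= threshold }
}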
