Apple continues to offer clarity around the CSAM (child sexual abuse material) detection feature it announced last week. In addition to publishing a detailed frequently asked questions document earlier today, Apple has now confirmed that CSAM detection applies only to photos stored in iCloud Photos, not videos.
The company also continues to defend its implementation of CSAM detection as more privacy-friendly and privacy-preserving than the approaches used by other companies.
Apple confirmed today that, at launch, CSAM detection will apply only to photos stored in iCloud Photos, not videos. Given the proliferation of CSAM in video content, however, the company acknowledged that there is more it could do in the future and that its plans can expand and evolve over time.
This makes sense when you step back and look at how Apple’s CSAM detection works. All of the matching is done on device, with Apple transforming a database of photos from the National Center for Missing and Exploited Children into an “unreadable set of hashes that is securely stored on users’ devices.” The on-device database is then checked against a user’s photos, and if there is an on-device match, the device creates a cryptographic safety voucher that encodes the match result.
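To make that flow more concrete, here is a minimal, hypothetical Swift sketch of the on-device matching step. The names (`imageHash`, `voucherIfMatched`, `knownHashes`, `SafetyVoucher`) are illustrative stand-ins, not Apple APIs, and Apple’s actual system relies on a perceptual hash and additional cryptography (private set intersection with a match threshold) rather than the plain hash lookup shown here.

```swift
import Foundation
import CryptoKit

// Conceptual sketch only. In Apple's design, the match payload is
// cryptographically blinded and only becomes readable server-side after
// a threshold number of matches; nothing like that is modeled here.
struct SafetyVoucher {
    let photoIdentifier: String
    let matchPayload: Data
}

// Placeholder for a perceptual image hash. A real perceptual hash is robust
// to resizing and re-encoding; plain SHA-256 is used here only to keep the
// sketch self-contained and runnable.
func imageHash(of photoData: Data) -> Data {
    Data(SHA256.hash(data: photoData))
}

// Runs only for photos that are about to be uploaded to iCloud Photos.
// `knownHashes` stands in for the unreadable on-device copy of the NCMEC
// hash database.
func voucherIfMatched(photoData: Data,
                      identifier: String,
                      knownHashes: Set<Data>) -> SafetyVoucher? {
    let hash = imageHash(of: photoData)
    guard knownHashes.contains(hash) else { return nil }
    // Encode the match result for upload alongside the photo.
    return SafetyVoucher(photoIdentifier: identifier, matchPayload: hash)
}
```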
Apple is also reinforcing that if a user does not use iCloud Photos, then no part of the CSAM detection process runs. This means that users who want to opt out of CSAM detection can do so by disabling iCloud Photos.
Finally, Apple is doubling down on why it believes its on-device implementation of CSAM detection is far better than the server-side implementations used by other companies. Those implementations, Apple explains, require that a company scan every single photo a user stores on its servers, the vast majority of which are not CSAM.
Apple’s implementation of CSAM detection does not require its servers to scan every photo. By moving the process on device, Apple’s method is more secure and is designed to check image hashes only against the NCMEC database of known CSAM, rather than scanning all of a user’s images server-side.