Back in 2021, Apple announced a number of new child safety features, including Child Sexual Abuse Material (CSAM) detection for iCloud Photos. However, the move was widely criticized over privacy concerns. After putting it on hold indefinitely, Apple has now confirmed that it has abandoned its plans to roll out the CSAM detection system.

Apple will no longer scan for CSAM in iCloud Photos

On the same day that the company announced Advanced Data Protection, which extends end-to-end encryption to most iCloud data categories including iCloud Photos, it also put an end to the never-released CSAM scan. The news was confirmed by Apple’s senior vice president of software engineering, Craig Federighi, in an interview with WSJ’s Joanna Stern.

When the CSAM scan was announced, Apple said that iCloud Photos would be able to detect known CSAM in users’ photos by matching them against a database of CSAM image hashes. That way, the company could flag such photos using on-device processing without ever having to see users’ photos.

Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the unreadable set of known CSAM hashes. This matching process is powered by a cryptographic technology called private set intersection, which determines if there is a match without revealing the result. Private set intersection (PSI) allows Apple to learn if an image hash matches the known CSAM image hashes, without learning anything about image hashes that do not match. PSI also prevents the user from learning whether there was a match.
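
To give a rough sense of the idea, here is a heavily simplified Swift sketch of on-device matching against a set of known hashes. It is not Apple’s implementation: the real system used NeuralHash (a perceptual image hash) and cryptographic private set intersection with blinded hashes, so the device could not learn whether any photo matched. The function names and the use of SHA-256 below are illustrative assumptions only.

```swift
import Foundation
import CryptoKit

// Illustrative only: Apple's proposal used NeuralHash and private set
// intersection, so neither the device nor Apple learned about non-matches.
// This sketch uses a plain SHA-256 digest and an ordinary Set lookup instead.

/// Hypothetical stand-in for the on-device image hash.
func imageHash(of imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

/// Hypothetical on-device check against a database of known hashes.
/// In Apple's actual design, the database entries were blinded and the match
/// result was only recoverable by the server after a threshold of matches.
func matchesKnownHash(_ imageData: Data, knownHashes: Set<String>) -> Bool {
    knownHashes.contains(imageHash(of: imageData))
}

// Example usage with made-up data.
let known: Set<String> = [imageHash(of: Data("example".utf8))]
print(matchesKnownHash(Data("example".utf8), knownHashes: known)) // true
print(matchesKnownHash(Data("other".utf8), knownHashes: known))   // false
```

The key difference from this sketch is the cryptographic blinding: in Apple’s proposal, the device produced a safety voucher for every photo without knowing whether it matched, and only the server could decode matches, and only once a threshold number of them had accumulated.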

Ultimately, Apple has now decided that it would be better to put an end to CSAM scanning in iCloud Photos altogether. It’s worth noting, however, that other child safety features, such as Communication Safety in Messages, are still available in iOS.

What are your thoughts on this decision? Let us know in the comments below.