Apple CSAM Scanning Plans Abandoned, but Regulator Speaks Out
Apple CSAM scanning plans may have been abandoned, but that hasn’t ended the controversy. An Australian regulator has accused the Cupertino company of turning a blind eye to the sexual exploitation of children. She said that both Apple and Microsoft fail to take steps to protect “the most vulnerable from the most predatory” …

Background

The usual way to detect Child Sexual Abuse Material (CSAM) is for cloud services like Google Photos to scan uploaded photos and compare them against a database of known CSAM images…
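The comparison against a database of known images works on hashes rather than the images themselves. A minimal illustrative sketch is below; note that it uses exact SHA-256 digests for simplicity, whereas real systems such as Microsoft's PhotoDNA or Apple's proposed NeuralHash use perceptual hashes that still match after resizing or re-encoding. The function names and structure here are hypothetical, not any vendor's actual API.

```python
import hashlib
from pathlib import Path


def file_hash(path: str) -> str:
    """Return the SHA-256 hex digest of a file's bytes.

    Real CSAM scanners use perceptual hashes (e.g. PhotoDNA),
    which tolerate re-encoding; SHA-256 only matches exact bytes.
    """
    return hashlib.sha256(Path(path).read_bytes()).hexdigest()


def scan_uploads(upload_paths: list[str], known_hashes: set[str]) -> list[str]:
    """Flag any uploaded file whose hash appears in the known-image hash set."""
    return [p for p in upload_paths if file_hash(p) in known_hashes]
```

The key property is that the service holds only hashes of known illegal images, not the images themselves, and each upload is reduced to a hash before comparison.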