Apple Confirms CSAM Detection Only Applies to Photos, Defends Its Method Against Other Solutions
Apple continues to offer clarity around the CSAM (child sexual abuse material) detection feature it announced last week. In addition to publishing a detailed frequently asked questions document earlier today, Apple has now confirmed that, at launch, CSAM detection will apply only to photos stored in iCloud Photos, not videos. The company also continues to defend its implementation of CSAM detection as more privacy-preserving than the approaches used by other companies.