Apple's controversial project to detect child pornography photos on users' iPhones is back in the news – an initiative the company finally abandoned at the end of last year.
Last December, Apple quietly ended its project to detect child pornography photos in users' iCloud libraries. At the time, the manufacturer simply shared a statement with the US press, in which the company assured that children could be protected without companies combing through personal data.
Read: Apple ends its controversial plan to fight child pornography
The subject comes up again through an exchange of letters between Apple and the organization Heat Initiative, which wants to push the manufacturer to "detect, report and remove" child pornography content (CSAM, for Child Sexual Abuse Material) from iCloud. Erik Neuenschwander, director of user privacy and child safety at Apple, explains why the company ultimately dropped the scanning of iCloud photo libraries.
"Monitoring iCloud data stored privately by each user would create new attack vectors that hackers could find and exploit," he explains, before describing the possibility of "unintended consequences": "Searching for one type of content could open the door to mass surveillance and create an incentive to scour other messaging systems for different types of content."
CSAM photo detection, as Apple imagined it, relied on scanning the photo library locally, directly on the iPhone. Images were to be matched against a database of known child pornography photos using a perceptual hashing algorithm (NeuralHash). In the event of a match, a second check was to be carried out on Apple's servers, before a manual, human review to rule out false positives.
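To give a rough idea of the general shape of such a pipeline – not Apple's actual NeuralHash system, whose hashing, cryptographic threshold scheme and server protocol were far more involved – here is a minimal, hypothetical Swift sketch. The `perceptualHash(of:)` function and the `knownCSAMHashes` set are placeholders; the threshold of 30 matches reflects the figure Apple announced before escalating to server-side review:

```swift
import Foundation

// Hypothetical perceptual hash. In Apple's design this role was played by
// NeuralHash, a neural-network-based hash designed to survive resizing and
// recompression; a real perceptual hash is NOT a cryptographic digest.
func perceptualHash(of imageData: Data) -> UInt64 {
    var hash: UInt64 = 5381
    for byte in imageData { hash = (hash &* 33) &+ UInt64(byte) }
    return hash
}

// Placeholder database of hashes of known abuse imagery.
let knownCSAMHashes: Set<UInt64> = []

// Apple's system only escalated once a threshold of matches (announced as 30)
// was crossed, to limit false positives before any human review.
let matchThreshold = 30

// Returns true if the library crosses the threshold and would have been
// re-checked on Apple's servers, then reviewed by a human.
func libraryExceedsThreshold(_ images: [Data]) -> Bool {
    let matches = images.filter { knownCSAMHashes.contains(perceptualHash(of: $0)) }
    return matches.count >= matchThreshold
}
```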
Apple had announced this novelty in August 2021, apparently without having considered for a second the consequences such technology implied for the general surveillance of users. Very quickly, specialists and experts pointed out the dangers of this system, and it was only after long months of reflection that the company abandoned the idea.
Apple now highlights the set of features grouped under the name Communication Safety, which detects nudity – likewise locally, on the device – in FaceTime, Messages, AirDrop and the photo picker. An API allows developers to integrate this technology (Discord is in the process of implementing it).
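The developer API in question is Apple's SensitiveContentAnalysis framework, introduced with iOS 17. As a rough sketch of how a third-party app might check a received image before displaying it (the helper function is illustrative, not Discord's actual integration; the framework requires the com.apple.developer.sensitivecontentanalysis.client entitlement):

```swift
import SensitiveContentAnalysis

// Analysis runs entirely on-device; nothing is sent to a server.
@available(iOS 17.0, macOS 14.0, *)
func isImageSensitive(at url: URL) async -> Bool {
    let analyzer = SCSensitivityAnalyzer()
    // If the user has not enabled Sensitive Content Warnings or
    // Communication Safety, the policy is .disabled and nothing is analyzed.
    guard analyzer.analysisPolicy != .disabled else { return false }
    do {
        let analysis = try await analyzer.analyzeImage(at: url)
        return analysis.isSensitive
    } catch {
        // On analysis failure, fall back to showing the image unblurred.
        return false
    }
}
```

An app would typically call this before rendering an incoming attachment and, if it returns true, blur the image behind a warning the user must tap through.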
Read: Messages: Apple will deploy its nude photo detection tool in France to protect minors
Neuenschwander concludes by confirming that this hybrid approach to CSAM photo detection could not be put into practice without jeopardizing users' security and privacy.
Source: Wired