Apple remains silent on plans to detect known CSAM stored in iCloud Photos


It’s been over a year now since Apple announced plans for three new child safety features, including a system to detect known child sexual abuse material (CSAM) stored in iCloud Photos, an option to blur sexually explicit photos in the Messages app, and child exploitation resources for Siri. The latter two features are now available, but Apple remains silent on its CSAM detection plans.

Apple originally said CSAM detection would be implemented in an update to iOS 15 and iPadOS 15 by the end of 2021, but the company ultimately postponed the feature based on feedback from “customers, advocacy groups, researchers, and others.”

In September 2021, Apple posted the following update to its Child Safety page:

Previously, we announced plans for features to help protect children from predators who use communication tools to recruit and exploit them and to help limit the spread of child sexual abuse material. Based on feedback from customers, advocacy groups, researchers, and others, we’ve decided to take more time over the next few months to gather feedback and make improvements before releasing these critically important child safety features.

In December 2021, Apple removed the above update and all references to its CSAM detection plans from its Child Safety page, but an Apple spokesperson told The Verge that the company’s plans for the feature had not changed. To our knowledge, however, Apple has not publicly commented on the plans since then.

We contacted Apple to ask if the feature is still planned. Apple did not immediately respond to a request for comment.

Apple made progress in implementing its child safety features for the Messages app and Siri with the release of iOS 15.2 and other software updates in December 2021, and it expanded the Messages app feature to Australia, Canada, New Zealand, and the UK with iOS 15.5 and other software releases in May 2022.

Apple said its CSAM detection system was “designed with user privacy in mind.” The system would perform “on-device matching using a database of known CSAM image hashes” from child safety organizations, which Apple would transform into an “unreadable set of hashes that is securely stored on users’ devices.”
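As a rough illustration only, the Swift sketch below shows the general shape of matching a photo’s hash against a local database of known hashes. It is not Apple’s implementation: the announced system was described as using a perceptual “NeuralHash” and cryptographic private set intersection, whereas the ImageHash type, OnDeviceMatcher struct, and plain set lookup here are simplified stand-ins.

```swift
/// Hypothetical stand-in for a perceptual image hash.
typealias ImageHash = String

/// Toy sketch only: this plain set lookup simply illustrates checking a photo
/// against a local, unreadable database of known-CSAM hashes before upload.
struct OnDeviceMatcher {
    /// The hash database shipped to the device (assumed already usable here).
    let knownHashes: Set<ImageHash>

    /// Returns whether the photo's hash appears in the known database.
    /// In the described design the device itself would not learn the result;
    /// it would attach an encrypted "safety voucher" to the iCloud upload instead.
    func matches(photoHash: ImageHash) -> Bool {
        knownHashes.contains(photoHash)
    }
}

// Example usage with made-up hash strings.
let matcher = OnDeviceMatcher(knownHashes: ["a1b2c3", "d4e5f6"])
print(matcher.matches(photoHash: "a1b2c3")) // true
```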

Apple planned to report iCloud accounts with known CSAM image hashes to the National Center for Missing and Exploited Children (NCMEC), a nonprofit that works in conjunction with US law enforcement. Apple said there would be a “threshold” that would ensure “less than a one in a trillion chance per year” of an account being incorrectly flagged by the system, as well as human review of flagged accounts.
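Again purely as a sketch, the threshold idea described above might look something like the following. The AccountReviewState type and the reportingThreshold value are placeholders rather than Apple’s figures, and in the described design the match count would only become visible server-side once enough encrypted vouchers had accumulated.

```swift
/// Toy sketch of the reporting threshold: an account would only be surfaced
/// for human review after enough matches accumulated.
struct AccountReviewState {
    private(set) var matchedVoucherCount = 0

    /// Hypothetical placeholder threshold, not Apple's published figure.
    let reportingThreshold = 30

    mutating func recordMatch() {
        matchedVoucherCount += 1
    }

    /// True only once the account has crossed the threshold.
    var eligibleForHumanReview: Bool {
        matchedVoucherCount >= reportingThreshold
    }
}

// Example usage.
var state = AccountReviewState()
for _ in 1...3 { state.recordMatch() }
print(state.eligibleForHumanReview) // false with only 3 matches recorded
```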

Apple’s plans were criticized by a wide range of individuals and organizations, including security researchers, the Electronic Frontier Foundation (EFF), politicians, policy groups, academic researchers, and even some Apple employees.

Some critics argued that Apple’s child safety features could create a “backdoor” into devices, which governments or law enforcement could use to monitor users. Another concern was false positives, including the possibility that someone could intentionally add CSAM imagery to another person’s iCloud account to get it flagged.

Note: Due to the political or social nature of the discussion regarding this topic, the discussion thread is located in our Political News forum. All forum members and site visitors are welcome to read and follow the thread, but posting is limited to forum members with at least 100 posts.
