Apple: CSAM Image-Detection Backdoor ‘Narrow’ in Scope - silversurfer - 18 August 21
Quote: Apple provided additional design and security details this week about the planned rollout of a feature aimed at detecting child sexual abuse material (CSAM) images stored in iCloud Photos.
Privacy groups like the Electronic Frontier Foundation warned that the process of flagging CSAM images effectively narrows the definition of end-to-end encryption to allow client-side access, which the EFF said means Apple is building a backdoor into its data storage.
“Apple can explain at length how its technical implementation will preserve privacy and security in its proposed backdoor, but at the end of the day, even a thoroughly documented, carefully thought-out, and narrowly scoped backdoor is still a backdoor,” the EFF said in reaction to the Apple announcement.
Apple’s new document explained that the tool is available only for child accounts set up in Family Sharing, and the parent or guardian must opt in. An on-device machine-learning classifier in the Messages app then triggers a warning if it detects explicit images being sent to or from the account. If the account belongs to a child under 13 years old, the parent or guardian will also receive a notification, according to Apple. Only the notification is shared with the parent, not the image itself, Apple added.
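To make the reported flow easier to follow, here is a minimal sketch of that decision logic in Swift. Everything in it is hypothetical: the type names, the `classifierFlagsExplicit` flag, and the structure are illustrative assumptions, not Apple's actual API or implementation; the sketch only mirrors the behavior described above (warn the child, notify a parent only for under-13 accounts, never share the image).

```swift
// Hypothetical sketch of the on-device decision flow described above.
// None of these types or checks come from Apple's implementation; they only
// illustrate the reported behavior.

struct ChildAccount {
    let age: Int
    let parentOptedIn: Bool   // guardian must enable the feature via Family Sharing
}

enum ScreeningOutcome {
    case noAction
    case warnChild                 // show an on-device warning to the child
    case warnChildAndNotifyParent  // additionally notify the parent (notification only, not the image)
}

/// `classifierFlagsExplicit` stands in for the verdict of the on-device ML classifier.
func screenIncomingImage(account: ChildAccount,
                         classifierFlagsExplicit: Bool) -> ScreeningOutcome {
    // Feature applies only to opted-in child accounts, and only when the classifier fires.
    guard account.parentOptedIn, classifierFlagsExplicit else {
        return .noAction
    }
    // Per Apple's description, parents are notified only for children under 13.
    return account.age < 13 ? .warnChildAndNotifyParent : .warnChild
}
```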
Read more: Apple: Image-Detection Backdoor 'Narrow' in Scope | Threatpost