New feature suspected of violating privacy; Apple responds: Don't misunderstand!
Author | Gao Xiusong
Recently, Apple published an FAQ titled "Expanded Protections for Children," aimed at allaying users' privacy concerns about the new CSAM (child sexual abuse material) detection feature in iCloud Photos and the communication safety feature in Messages that the company announced last week.
“Since we announced these features, many stakeholders, including privacy and child safety groups, have expressed their support for this new solution, and some have raised questions,” the FAQ reads. “This document is intended to address those questions and provide more clarity and transparency in the process.”
"Some discussions have blurred the distinction between the two features," Apple said in the document, saying that communication security in Messages "applies only to images sent or received in the Messages app on a child account set up in Family Sharing," while CSAM detection in iCloud Photos "only affects users who have chosen to use iCloud Photos to store their photos...and does not affect data on any other devices."
Apple's decision to deploy the technology in iOS 15 and iPadOS 15, expected to be released in September, has been heavily criticized by privacy advocates, security researchers, cryptography experts, academics and others.
Will Cathcart, chief executive of Facebook-owned WhatsApp, called it "the wrong approach and a setback for the privacy of people around the world".
Epic Games CEO Tim Sweeney also slammed the decision, saying he had "struggled" to see the move from Apple's perspective but concluded that "inevitably this was government spyware installed by Apple based on the presumption of guilt."
“Whatever their good intentions, Apple is rolling out mass surveillance to the world,” said renowned whistleblower Edward Snowden. “If they can scan for child pornography today, they can scan for anything tomorrow.”
The nonprofit Electronic Frontier Foundation also criticized Apple's plan, arguing that even "a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor."
In addition, an open letter criticizing Apple's plan to scan iPhones for CSAM in iCloud Photos and for explicit images in children's Messages has received more than 5,500 signatures.