Apple recently announced that, in changes rolling out later this year, Apple products used by U.S. customers will be scanned for sexually explicit images of children. The announcement has triggered reactions ranging from praise to outrage to concern about the future of privacy on Apple products.
The first announced change relates to searches on Apple products. If a person uses an Apple product to search for topics related to sexually explicit photos of children, Apple will direct them to resources for reporting or dealing with child sexual abuse.
The second change involves flagging sexually explicit pictures in Messages for users under 18. If a child age 13–17 receives or sends a photo or video that includes a sexually explicit image, the Messages app will blur the image and present a series of warnings and options before the child opens or sends it. In addition, if the child is 12 or under, the child's phone is part of a Family Sharing plan, and a parent opts in, the phone will alert the parent when the child views or sends a sexually explicit picture in Messages (there is no option for an alert to be sent to the parent of a child between 13 and 17).
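To make the age-based behavior concrete, here is a minimal sketch of that decision flow in Swift. It follows only what the announcement describes; the type and function names are hypothetical, not Apple's actual API.

```swift
// Hypothetical sketch of the Messages decision flow described above;
// these types and names are illustrative, not Apple's actual API.

enum MessagesAction {
    case deliverNormally
    case blurAndWarn(alertParent: Bool)
}

struct ChildAccount {
    let age: Int
    let onFamilySharingPlan: Bool
    let parentOptedIn: Bool
}

func action(forExplicitImage account: ChildAccount) -> MessagesAction {
    switch account.age {
    case ..<13:
        // 12 or under: blur and warn; alert the parent only if the phone
        // is on a Family Sharing plan and the parent has opted in.
        return .blurAndWarn(alertParent: account.onFamilySharingPlan && account.parentOptedIn)
    case 13...17:
        // 13-17: blur and warn, but no parental alert is available.
        return .blurAndWarn(alertParent: false)
    default:
        // 18 and over: the feature does not apply.
        return .deliverNormally
    }
}
```

Under this reading, for example, a 12-year-old on a Family Sharing plan with an opted-in parent gets both the blur-and-warn screen and the parental alert, while a 15-year-old gets only the blur-and-warn screen.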
The third feature will scan photos uploaded to iCloud for child sexual abuse material. The scan checks images against a database of known child sexual abuse material; if a match is found, it is flagged for an Apple reviewer, who decides whether to pass it on to the National Center for Missing and Exploited Children (NCMEC), which works with law enforcement.
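As a rough illustration of that flow, the sketch below matches uploaded image data against a set of known-image hashes and flags matches for human review. It is a deliberate simplification: Apple's published technical summary describes a perceptual hash (NeuralHash) combined with cryptographic matching protocols, not the plain cryptographic hash used here, and all names in the sketch are hypothetical.

```swift
import CryptoKit
import Foundation

// Simplified, hypothetical sketch of matching uploads against a database
// of known images. Apple's actual system uses a perceptual hash
// (NeuralHash) and cryptographic matching protocols, not a plain
// SHA-256 lookup; this shows only the basic match-then-review flow.

struct UploadScanner {
    // Hex-encoded hashes of known abusive images (supplied by a
    // clearinghouse such as NCMEC in the real system).
    let knownHashes: Set<String>

    // Hash the raw image bytes for lookup against the database.
    func hash(of imageData: Data) -> String {
        SHA256.hash(data: imageData)
            .map { String(format: "%02x", $0) }
            .joined()
    }

    // A match does not trigger an automatic report; it only queues the
    // image for review by a human moderator, as described above.
    func shouldQueueForReview(_ imageData: Data) -> Bool {
        knownHashes.contains(hash(of: imageData))
    }
}
```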
After the announcement, some praised Apple for the changes, asserting that they will increase detection of this material and the probability that an individual who owns or traffics in sexually explicit pictures of children will be caught. And, they argue, anyone who does not view or traffic in child pornography will experience little or no loss of privacy from these changes. In addition, they point out that, as Apple explained, the scanning feature does not scan photos stored on the device itself; it scans only photos uploaded to iCloud, so it is less intrusive than some critics suggest.
On the other hand, others, including some security experts and advocacy groups, argue that while the announced changes are well-intentioned, they open a backdoor that threatens to undermine privacy protections for all users of Apple products. The iCloud scan, they contend, creates a surveillance system that could provide a means of breaking end-to-end encryption (which makes data unreadable to anyone except the sender and receiver) or, worse, open the door to more intrusive invasions of privacy down the road.
Apple has responded that the criticisms are largely based on misunderstandings of what it is doing and how the new systems will work. Apple points out that its new iCloud scanning process is less intrusive than the scanning already deployed on other storage platforms. In Apple's view, the new features are appropriately limited to protect users' privacy, and the company argues that it waited to implement such changes until it had figured out how to make them work while also protecting that privacy. To further clarify, Apple has published additional information about the new features, including technical summaries, proofs, and independent assessments by cryptography and machine-learning experts.