Illustration of a padlock over a glowing digital data panel. (credit: Getty Images | Yuichiro Chino)

Apple today said it will refuse any government demands to expand its new photo-scanning technology beyond the current plan of using it only to detect CSAM (child sexual abuse material).

Apple has faced days of criticism from security experts, privacy advocates, and privacy-minded users over the plan it announced Thursday, in which iPhones and other Apple devices will scan photos before they are uploaded to iCloud. Many critics pointed out that once the technology is on consumer devices, it won’t be difficult for Apple to expand it beyond the detection of CSAM in response to government demands for broader surveillance. We covered how the program will work in detail in an article Thursday night.
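At the core of the plan is matching photos on the device against a list of hashes of known CSAM before upload. The Swift sketch below is a deliberately simplified illustration of that general shape only; it uses an ordinary SHA-256 digest and a hard-coded hash set rather than Apple's perceptual NeuralHash or its private set intersection protocol, and the function name and sample data are hypothetical.

```swift
import Foundation
import CryptoKit

// Hypothetical illustration only -- not Apple's NeuralHash or its matching protocol.
// A plain cryptographic hash stands in for the real perceptual hash to show the
// general idea: a photo is checked against a fixed list of known hashes on-device.

/// Hashes of known flagged images, as a list of this kind might be shipped to the device.
let knownHashes: Set<String> = [
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
]

/// Returns true if the photo's hash appears in the on-device list.
func matchesKnownList(photoData: Data) -> Bool {
    let digest = SHA256.hash(data: photoData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownHashes.contains(hex)
}

let photo = Data("example photo bytes".utf8)
print(matchesKnownList(photoData: photo))  // false for this placeholder data
```

The point of the sketch is only that the matching logic is driven entirely by whatever hashes are in the list, which is why critics argue that expanding the list would expand the scope of the scanning.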

Governments have been pressuring Apple for years to install backdoors into its end-to-end encryption, and Apple acknowledged that governments are likely to make exactly the demands that security experts and privacy advocates have been warning about. In a FAQ released today under the title "Expanded Protections for Children," Apple addresses the question, "Could governments force Apple to add non-CSAM images to the hash list?"
