The tech giant’s new system for scanning iPhones in the US could enable the massive expansion of state surveillance
Last week, Apple announced that, in the US, it would build two backdoors into the encryption that protects its devices. One will monitor iMessage: if any photos sent by or to under-13s appear to contain nudity, the user may be challenged and their parents may be informed. The second will see Apple scan all the images in a phone’s camera roll and, if any resemble known sex-abuse images, flag them as suspect. If enough suspect images are backed up to an iCloud account, they will be decrypted and inspected; if Apple judges them to be illegal, the user will be reported to the relevant authorities.
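To see what that second mechanism amounts to, a rough sketch may help. It is emphatically not Apple’s actual design, which relies on a proprietary perceptual hash (NeuralHash), encrypted “safety vouchers” and threshold cryptography; the code below is a simplified illustration of the general idea of threshold-based matching against a blocklist, and every name and number in it (including the `REPORT_THRESHOLD` value) is hypothetical.

```python
# Illustrative sketch only: threshold-based matching of image
# fingerprints against a set of known-image hashes. Apple's real
# system uses a perceptual hash and cryptographic protocols that are
# not reproduced here; all names and values below are hypothetical.

import hashlib
from typing import Iterable, Set

REPORT_THRESHOLD = 30  # hypothetical number of matches before human review

def fingerprint(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash. A real system must match
    # near-duplicates, which an exact cryptographic hash cannot do.
    return hashlib.sha256(image_bytes).hexdigest()

def count_matches(images: Iterable[bytes], known_hashes: Set[str]) -> int:
    # Count how many of the user's images match the blocklist.
    return sum(1 for img in images if fingerprint(img) in known_hashes)

def should_escalate(images: Iterable[bytes], known_hashes: Set[str]) -> bool:
    # Only once the match count crosses the threshold would the flagged
    # material be decrypted and passed to a human reviewer.
    return count_matches(images, known_hashes) >= REPORT_THRESHOLD
```

The point of the sketch is the architecture, not the details: once a device is built to compare its owner’s files against an externally supplied blocklist and report over a threshold, what goes on that blocklist becomes a policy decision rather than a technical one.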
Action on the circulation of child sexual abuse imagery is long overdue. Effective mechanisms to prevent the sharing of images and the robust prosecution of perpetrators should both receive the political priority they deserve. But Apple’s proposed measures fail to tackle the problem, and they provide the architecture for a massive expansion of state surveillance.