Apple executive Craig Federighi speaks during the 2018 Apple Worldwide Developer Conference (WWDC) in San Jose, California. (credit: Getty Images | Justin Sullivan)

Apple’s decision to have iPhones and other Apple devices scan photos for child sexual abuse material (CSAM) has sparked criticism from security experts and privacy advocates—and from some Apple employees. But Apple believes its new system is an advancement in privacy that will “enabl[e] a more private world,” according to Craig Federighi, the company’s senior VP of software engineering.

Federighi defended the new system in an interview with The Wall Street Journal, saying that Apple aims to detect child sexual abuse photos in a way that protects user privacy better than other, more invasive scanning systems do. The Journal wrote today:

While Apple’s new efforts have drawn praise from some, the company has also received criticism. An executive at Facebook Inc.’s WhatsApp messaging service and others, including Edward Snowden, have called Apple’s approach bad for privacy. The overarching concern is whether Apple can use software that identifies illegal material without the system being taken advantage of by others, such as governments, pushing for more private information—a suggestion Apple strongly denies and Mr. Federighi said will be protected against by “multiple levels of auditability.”

“We, who consider ourselves absolutely leading on privacy, see what we are doing here as an advancement of the state of the art in privacy, as enabling a more private world,” Mr. Federighi said.

In a video of the interview, Federighi said, “[W]hat we’re doing is we’re finding illegal images of child pornography stored in iCloud. If you look at any other cloud service, they currently are scanning photos by looking at every single photo in the cloud and analyzing it. We wanted to be able to spot such photos in the cloud without looking at people’s photos and came up with an architecture to do this.” The Apple system is not a “backdoor” that breaks encryption and is “much more private than anything that’s been done in this area before,” he said. Apple developed the architecture for identifying photos “in the most privacy-protecting way we can imagine and in the most auditable and verifiable way possible,” he said.
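For readers wondering how a system could "spot such photos" without inspecting what they depict, the core idea is matching fingerprints of files against a database of fingerprints of already-known illegal images, rather than analyzing image content. The sketch below is a deliberately simplified illustration of that idea and is not Apple's actual design: Apple's published approach reportedly relies on a perceptual "NeuralHash," cryptographic private set intersection, and threshold secret sharing, none of which appear here. The `photos` directory, the hash value, and the threshold are all hypothetical placeholders.

```python
# Conceptual sketch of hash-based matching, NOT Apple's CSAM-detection system.
# A real deployment would use a perceptual hash (robust to resizing and
# re-encoding) and cryptographic protocols so the device learns nothing about
# the database and the server learns nothing below a match threshold.
import hashlib
from pathlib import Path

# Hypothetical database of fingerprints of known illegal images
# (the value below is a placeholder, not a real entry).
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

# Hypothetical threshold: act only once several matches accumulate,
# so a single false positive reveals nothing.
MATCH_THRESHOLD = 3

def file_hash(path: Path) -> str:
    """Fingerprint a file's bytes. An exact cryptographic hash is used here
    only for simplicity; it would miss trivially altered copies."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def count_matches(photo_paths: list[Path]) -> int:
    """Count photos whose fingerprints appear in the known-hash database.
    Photo contents are never interpreted; only hashes are compared."""
    return sum(1 for p in photo_paths if file_hash(p) in KNOWN_HASHES)

if __name__ == "__main__":
    photos = list(Path("photos").glob("*.jpg"))  # placeholder directory
    if count_matches(photos) >= MATCH_THRESHOLD:
        print("Threshold exceeded: flag account for human review.")
    else:
        print("Below threshold: nothing learned about any individual photo.")
```

The design point this illustrates is the one Federighi emphasizes: matching against a fixed list of known material is narrower than the content analysis other cloud services run on every uploaded photo, which is the basis for Apple's claim that the approach is more private and more auditable.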
