
According to 9to5Mac, Apple has confirmed that it has already been scanning iCloud Mail for Child Sexual Abuse Material (CSAM), and has been doing so since 2019. It has not, however, been scanning iCloud Photos or iCloud backups, despite the internet frenzy that followed its announcement of its intent to begin doing so. From the report:

The clarification followed me querying a rather odd statement by the company’s anti-fraud chief [Eric Friedman]: that Apple was “the greatest platform for distributing child porn.” That immediately raised the question: If the company wasn’t scanning iCloud photos, how could it know this? […] Apple confirmed to me that it has been scanning outgoing and incoming iCloud Mail for CSAM attachments since 2019. Email is not [end-to-end] encrypted, so scanning attachments as mail passes through Apple servers would be a trivial task. Apple also indicated that it was doing some limited scanning of other data, but would not tell me what that was, except to suggest that it was on a tiny scale. It did tell me that the “other data” does not include iCloud backups.
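To make the report's point concrete, here is a minimal sketch in Python of what server-side attachment scanning could look like. Everything in it is an assumption for illustration: the hash set, the function names, and the use of a plain SHA-256 digest. Production systems typically match perceptual hashes (such as Microsoft's PhotoDNA) that survive resizing and re-encoding, and Apple has not publicly detailed its own implementation.

```python
import hashlib

# Illustrative only: a set of digests of known abusive material, as
# distributed by a clearinghouse such as NCMEC. Real systems use
# perceptual hashes (e.g. PhotoDNA) rather than cryptographic ones,
# since the latter break on any re-encoding of the image.
KNOWN_BAD_HASHES: set[str] = set()

def attachment_matches(data: bytes) -> bool:
    """Hash one attachment and check it against the known-hash set."""
    return hashlib.sha256(data).hexdigest() in KNOWN_BAD_HASHES

def scan_message(attachments: list[bytes]) -> bool:
    """Scan every attachment on a message transiting the mail server."""
    return any(attachment_matches(a) for a in attachments)

if __name__ == "__main__":
    # A matching attachment would flag the message for human review.
    print(scan_message([b"example attachment bytes"]))  # -> False
```

Because the server already sees the plaintext of every message, a check like this adds negligible cost, which is what makes the scanning "trivial" in the report's phrasing.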

Although Friedman’s statement sounds definitive, as if it were based on hard data, it now looks likely that it wasn’t. It’s our understanding that the total number of CSAM reports Apple files each year is measured in the hundreds, meaning that email scanning alone would not provide evidence of a large-scale problem on Apple’s servers. The explanation probably lies in the fact that other cloud services were scanning photos for CSAM, and Apple wasn’t. If other services were disabling accounts for uploading CSAM, and iCloud Photos wasn’t (because the company wasn’t scanning there), then the logical inference would be that more CSAM exists on Apple’s platform than anywhere else. Friedman was probably doing nothing more than reaching that conclusion.
