An explanation for Apple’s controversial decision to begin scanning iPhones for CSAM has been found in a 2020 statement by Apple’s anti-fraud chief, Eric Friedman, who said flatly that “we are the greatest platform for distributing child porn.” The revelation does, however, raise the question: How could Apple have known this if it wasn’t scanning iCloud accounts…? 9to5Mac reports:

The iMessage thread was spotted by The Verge as it worked its way through the internal emails, messages, and other materials handed over by Apple during discovery in the Epic Games lawsuit. Ironically, Friedman suggests that Facebook does a better job of detecting CSAM than Apple does: “The spotlight at Facebook etc. is all on trust and safety (fake accounts, etc). In privacy, they suck. Our priorities are the inverse. Which is why we are the greatest platform for distributing child porn, etc.”
A fellow exec queries this, asking whether it can really be true: “Really? I mean, is there a lot of this in our ecosystem? I thought there were even more opportunities for bad actors on other file sharing systems.” Friedman responds with the single word, “Yes.” The document is unsurprisingly labeled “Highly confidential — attorneys’ eyes only.”
The stunning revelation may well be explained by the fact that iCloud photo storage is on by default, even if it’s just the paltry 5GB the company gives everyone as standard. This means iCloud may be the most-used cloud service for photos, in contrast to competing services where users have to opt in. Apple has said that it has been looking at the CSAM problem for some time, and was trying to figure out a privacy-protecting way to detect it. It may well be this specific conversation that led the company to prioritize those efforts.
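For readers wondering what “a privacy-protecting way to detect it” looks like in practice: the approach Apple announced matches perceptual hashes (NeuralHash) of photos against a database of hashes of known CSAM, rather than inspecting image content directly. The Python sketch below illustrates only the bare hash-list-matching idea; the hash function, database contents, and function names are illustrative stand-ins, not Apple’s implementation.

```python
import hashlib

# Illustrative only: a set of hashes of known images. Apple's announced
# design uses NeuralHash, a perceptual hash, so re-encoded or resized
# copies of an image still match; SHA-256 is a stand-in here just to
# keep the sketch self-contained and runnable.
known_hashes = {
    hashlib.sha256(b"example known image bytes").hexdigest(),
}

def flag_if_known(image_bytes: bytes) -> bool:
    """Return True if this image's hash appears in the known-hash set.

    Apple's announced system additionally blinds the comparison with
    private set intersection, and matches are only revealed to Apple
    after a threshold count is crossed; that layer is omitted here.
    """
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in known_hashes

# Usage: a byte-identical copy matches; under this cryptographic
# stand-in any modified copy would not (unlike a perceptual hash).
print(flag_if_known(b"example known image bytes"))   # True
print(flag_if_known(b"slightly different bytes"))    # False
```

The key design point is that the device never evaluates what a photo depicts; it only checks membership in a fixed list of known material, which is what allows the comparison to be done on-device without Apple seeing the photos.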