
A Forbes associate editor covering privacy, cybercrime, and security/surveillance reports on a recently revealed search warrant.

Instead of targeting a photo, the warrant asked Google to provide information on a suspect who allegedly owned graphic illegal cartoons involving children:

That kind of content is potentially illegal to own under U.S. law and can be detected by Google’s anti-child sexual abuse material (CSAM) systems, a fact not previously discussed in the public domain, the warrant reveals…. Google also uses machine learning tools to look at files and analyze them for any sign they’re of abused children….
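The passage describes two layers: matching files against digests of known material, plus a machine-learning pass for previously unseen material. The sketch below illustrates the hash-matching layer in general terms only; it is not Google's actual system. The blocklist contents and function name are invented for the demo, and real deployments use perceptual hashes (such as PhotoDNA-style fingerprints, so that resized or re-encoded copies still match) rather than the plain SHA-256 used here to keep the example self-contained.

```python
import hashlib

# Hypothetical blocklist of known-bad file digests. The single entry is the
# SHA-256 of empty input, included only so the demo below has a match.
KNOWN_BAD_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def file_matches_blocklist(data: bytes) -> bool:
    """Return True if the file's SHA-256 digest appears on the blocklist."""
    digest = hashlib.sha256(data).hexdigest()
    return digest in KNOWN_BAD_HASHES

# A machine-learning classifier, as the article describes, would sit
# alongside a check like this to flag material not yet on any hash list.
if __name__ == "__main__":
    print(file_matches_blocklist(b""))  # True: empty input's digest is on the demo list
```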

As per its legal requirements, Google handed over information on what it found, as well as the IP addresses used to access the images, to the National Center for Missing and Exploited Children (NCMEC), which then passed the findings on to the DHS Homeland Security Investigations unit. Investigators used the IP addresses provided by Google to identify the suspect as the alleged owner of the cartoons, and searched his Google account, receiving information on emails to and from the defendant. It appears the suspect may actually be a known artist. As no charges have been filed, Forbes isn’t publishing his name, but the man identified in the warrant had won several small Midwest art competitions, and one artwork from the 1990s had been mentioned in a major West Coast newspaper…
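For readers unfamiliar with the reporting chain, the flow is: provider detection, a report to NCMEC, then referral to law enforcement (here, DHS Homeland Security Investigations). A hedged sketch of the kind of record such a report might carry follows; every field name is hypothetical, chosen only to mirror the details the warrant mentions (flagged files and the IP addresses used to access them), and is not drawn from any real CyberTipline schema.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class ProviderReport:
    """Hypothetical shape of a provider-to-NCMEC report (illustrative only)."""
    provider: str                    # reporting company, e.g. "Google"
    file_hashes: list[str]           # digests of the flagged files
    access_ip_addresses: list[str]   # IPs used to access the material
    detected_at: datetime            # when the provider's systems flagged it

# NCMEC would then route a report like this to law enforcement,
# in this case the DHS Homeland Security Investigations unit.
report = ProviderReport(
    provider="Google",
    file_hashes=["<digest of flagged file>"],
    access_ip_addresses=["203.0.113.7"],  # RFC 5737 documentation address
    detected_at=datetime(2021, 1, 1, tzinfo=timezone.utc),
)
```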

Google, meanwhile, has in recent years released transparency reports showing how many times it reports issues to NCMEC. The figures reveal a disturbing trend. In the first six months of 2021, it found more than 3.4 million pieces of potentially illegal content in 410,000 separate reports. That was up from 2.9 million pieces in 365,000 reports in the last six months of 2020, and more than double the figure from January to June 2020, when 1.5 million pieces of CSAM were discovered and reported to NCMEC in 180,000 reports…
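A quick computation makes the trend in those transparency figures concrete, using only the numbers quoted above:

```python
# (pieces of flagged content, reports filed) per half-year, as quoted above
periods = {
    "H1 2020": (1_500_000, 180_000),
    "H2 2020": (2_900_000, 365_000),
    "H1 2021": (3_400_000, 410_000),
}

for label, (pieces, reports) in periods.items():
    print(f"{label}: {pieces / reports:.1f} pieces per report")

# Year-over-year comparison: H1 2021 vs. H1 2020
print(f"growth: {3_400_000 / 1_500_000:.2f}x")  # ~2.27x, i.e. more than double
```

The pieces-per-report ratio stays roughly flat (around 8), so the growth is in the volume of reports themselves, not in how much each report contains.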

As Google doesn’t end-to-end encrypt its communications tools like Gmail or its file storage tech like Drive, it’s still possible for the company to scan for illegal content. And since it has no plans to introduce those features, law enforcement can continue to rely on Google to warn NCMEC when abuse happens on its servers. Whether most users would rather have Google scan accounts so it can help find child abusers, or have the stronger privacy of end-to-end encryption instead, is a balance the Mountain View, California-based business will have to struggle with in perpetuity. The same goes for any of its rivals.
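Why the absence of end-to-end encryption matters here: server-side scanning only works if the server can read the bytes. Below is a minimal sketch of that tradeoff using the widely available `cryptography` package, with key handling deliberately simplified; once content is encrypted client-side, a server-side hash check sees only ciphertext and can no longer match known material.

```python
import hashlib
from cryptography.fernet import Fernet  # pip install cryptography

plaintext = b"some user file"
known_hash = hashlib.sha256(plaintext).hexdigest()  # what a scanner would look for

# Without E2EE, the server stores plaintext and the hash check can match.
assert hashlib.sha256(plaintext).hexdigest() == known_hash

# With client-side (end-to-end) encryption, the server only ever sees
# ciphertext, so the same check cannot match.
key = Fernet.generate_key()             # in real E2EE, only the user holds this
ciphertext = Fernet(key).encrypt(plaintext)
assert hashlib.sha256(ciphertext).hexdigest() != known_hash
```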

Read more of this story at Slashdot.