The attacker seems to have relied on social engineering to hoodwink his victims.
Apple photo-scanning plan faces global backlash from 90 rights groups
Groups say iPhone-scanning could harm some children and be used for surveillance.
Apple’s CSAM detection tech is under fire — again
Apple has encountered monumental backlash to a new child sexual abuse material (CSAM) detection technology it announced earlier this month. The system, which Apple calls NeuralHash, has yet to be activated for its billion-plus users, but the technology is already facing heat from security researchers who say the algorithm is producing flawed results. NeuralHash is […]
Updated app from Apple brings iCloud Passwords to Windows
It’s certainly not on par with 1Password, but it’s a welcome addition anyway.
iCloud Photo uploads approved: no material Thinks Different
This week, Apple announced and then explained (pdf) the measures they’re taking to protect children (so: trigger warning) from grooming and to stop abusers from storing images of child sexual abuse in iCloud Photo Storage. Daring Fireball’s John Gruber …
Apple says it will begin scanning iCloud Photos for child abuse images
Later this year, Apple will roll out a technology that will allow the company to detect and report known child sexual abuse material to law enforcement in a way it says will preserve user privacy. Apple told TechCrunch that the detection of child sexual abuse material (CSAM) is one of several new features aimed at […]