Learn More

Karen Hao, reporting for MIT Technology Review: The website is eye-catching for its simplicity. Against a white backdrop, a giant blue button invites visitors to upload a picture of a face. Below the button, four AI-generated faces allow you to test the service. Above it, the tag line boldly proclaims the purpose: turn anyone into a porn star by using deepfake technology to swap the person’s face into an adult video. All it requires is the picture and the push of a button. MIT Technology Review has chosen not to name the service, which we will call Y, or use any direct quotes and screenshots of its contents, to avoid driving traffic to the site. It was discovered and brought to our attention by deepfake researcher Henry Ajder, who has been tracking the evolution and rise of synthetic media online.

For now, Y exists in relative obscurity, with a small user base actively giving the creator development feedback in online forums. But researchers have feared that an app like this would emerge, breaching an ethical line no other service has crossed before. From the beginning, deepfakes, or AI-generated synthetic media, have primarily been used to create pornographic representations of women, who often find this psychologically devastating. The original Reddit creator who popularized the technology face-swapped female celebrities’ faces into porn videos. To this day, the research company Sensity AI estimates, between 90% and 95% of all online deepfake videos are nonconsensual porn, and around 90% of those feature women.

Read more of this story at Slashdot.