Tuesday, May 6

One of the largest websites dedicated to deepfake pornography announced that it has shut down after a critical service provider withdrew its support, effectively halting the site’s operations. 

Mr. Deepfakes, created in 2018, has been described by researchers as “the most prominent and mainstream marketplace” for deepfake porn of celebrities, as well as individuals with no public presence. On Sunday, the website’s landing page featured a “Shutdown Notice,” saying it would not be relaunching.

The site allowed users to upload and view deepfake porn videos created using artificial intelligence. Its forums let users buy and sell custom nonconsensual deepfake content and discuss techniques for making deepfakes. The owner of the site remains unknown.

The shutdown comes just days after Congress passed the “Take It Down Act,” which makes it a federal crime to post nonconsensual sexual imagery, including explicit deepfakes. The legislation, backed by first lady Melania Trump, requires social media platforms and other websites to remove images and videos within 48 hours after a victim’s request. 

While it’s not clear whether the website’s shutdown was related to the Take It Down Act, it is the latest development in a broader crackdown on nonconsensual sexual imagery.

Henry Ajder, an expert on AI and deepfakes, told CBS News that “this is a moment to celebrate,” describing the website as the “central node” of deepfake abuse. 

Ajder said the issue of nonconsensual deepfake imagery will not go away, but that disbanding the largest archive of deepfake porn is “a step in the right direction.” He said it makes the content less accessible and scatters the community of users, likely pushing them toward less mainstream platforms such as Telegram.

“I’m sure those communities will find a home somewhere else but it won’t be this home and I don’t think it’ll be as big and as prominent. And I think that’s critical,” Ajder said.

Ajder said he wants to see more legislation introduced globally and an increase in public awareness to help tackle the issue of nonconsensual sexual deepfake imagery. 

“We’re starting to see people taking it more seriously and we’re starting to see the kind of societal infrastructure needed to react better than we have, but we can never be complacent with how much resource and how much vigilance we need to give,” Ajder said. 
