“Deepfakes” are often pretty funny memes in which one person’s face is seamlessly superimposed onto a video of another’s. Examples include Donald Trump as Dr. Evil and Nic Cage as Yoda, among many others.

Here, for example, are some Nic Cage deepfakes (a particular favorite of the deepfaking community):


Recently, however, these AI-generated videos have gained notoriety for a more sinister purpose: revenge porn. Per a recent report from Forbes, “People on Reddit are already asking about using photos of ex-girlfriends to map their faces to porn clips.” Of course you are, internet.

Daisy Ridley’s face on adult film star’s body.

In response, Pornhub (that gleaming symbol of impeccable ethics and moral righteousness) has declared the unauthorized use of someone else’s face in a “deepfake” to be “nonconsensual content.” As such, it “directly violates” their terms of service, meaning all deepfake content will be removed “as soon as [they] are made aware of it.” Users can alert the site by “flagging” the offending content.

Reddit, which hosted a “deepfakes” subreddit with over 15,000 members, has followed suit, along with Twitter, deeming the works nonconsensual and prohibiting their posting. Attempting to visit the /r/deepfakes page now brings one to a notice that the community has been banned.

The actual process of making a deepfake is rather easy, but time-consuming. It relies on AI and neural networks, but the uninitiated can get started just by downloading FakeApp, a “user-friendly” application built by Reddit user “deepfakeapp.” In an interview with Vice’s Motherboard, he told Samantha Cole that his goal was to “improve it to the point where prospective users can simply select a video on their computer, download a neural network correlated to a certain face from a publicly available library, and swap the video with a different face with the press of one button.” While it requires only a couple of clicks from the user, the process can take around 8 to 12 hours to complete.

What is most notable about Pornhub’s response to deepfakes, besides the ban itself, is the decision to couch it in terms of consent. Some commenters have taken issue with this framing; one on Gizmodo wrote, in part: “I can see how it’s creepy and could potentially be or lead to harassment, but I am not seeing how ‘non-consensual’ is being easily thrown around here.” For others, though, it is a refreshing acknowledgment of the reality of online abuse. In the words of one respondent: “You can’t hide sex crimes behind ‘becuz intrnetz’ anymore dude.”

“Yeah dude.” [Image courtesy Carlos Quintero]

As Ian Morris at Forbes points out in discussing the rise of deepfakes, this isn’t the first time we’ve found the laws surrounding digital invasions of privacy woefully inadequate. When Jennifer Lawrence, for example, had nude images stolen from her computer, she objected to the media’s popular characterization of what happened to her as a “scandal.” She said:

“It is not a scandal. It is a sex crime, it is a sexual violation. It’s disgusting. The law needs to be changed, and we need to change.”

[Image Courtesy Gage Skidmore]

While I wouldn’t hold my breath on our collectively “changing” anytime soon, it’s nice to see companies finally addressing the real harms that can occur in the digital sphere, along with their own complicity in them.

 

Feature Image Courtesy Kat Love @ Unsplash