Comforting claims have circulated in recent days that there is nothing to fear from deepfakes. We profoundly disagree.
Danielle Citron is a Professor of Law at Boston University School of Law and a 2019 MacArthur Fellow. She is the author of Hate Crimes in Cyberspace (Harvard University Press 2014).
In the summer of 2016, a meme began to circulate on the fringes of the right-wing internet: the notion that presidential candidate Hillary Clinton was seriously ill. Clinton suffered from Parkinson’s disease, a brain tumor and seizures, among other ailments, Infowars contributor Paul Joseph Watson argued in a YouTube video. The meme, and the allegations behind it, were entirely unfounded.
Back in February, we joined forces in this post to draw attention to the wide array of dangers to individuals and to society posed by advances in “deepfake” technology (that is, the capacity to alter audio or video to make it appear, falsely, that a real person said or did something). The post generated a considerable amount of discussion, which was great, but we understood we had barely scratched the surface of the issue.
“We are truly fucked.” That was Motherboard’s spot-on reaction to deepfake sex videos (realistic-looking videos that swap a person’s face into sex scenes actually involving other people). And that sleazy application is just the tip of the iceberg. As Julian Sanchez tweeted, “The prospect of any Internet rando being able to swap anyone’s face into porn is incredibly creepy.”