Amid the hubbub of L’Affaire Ukrainienne, you could be forgiven for overlooking another story that has emerged out of Congress over the past week. It’s a grubby, unpleasant story—so much so that it feels ugly to draw attention to it. But the times are ugly, after all, and the story is a concerning harbinger of what might be to come in the lead-up to 2020.
Livestream: HPSCI Hearing on the National Security Challenges of Artificial Intelligence, Manipulated Media and Deepfakes
The House Permanent Select Committee on Intelligence will host a hearing entitled "The National Security Challenge of Artificial Intelligence, Manipulated Media, and ‘Deepfakes’" at 9:00 a.m. on Thursday.
In the summer of 2016, a meme began to circulate on the fringes of the right-wing internet: the notion that presidential candidate Hillary Clinton was seriously ill. Clinton was suffering from Parkinson’s disease, a brain tumor and seizures, among other ailments, argued Infowars contributor Paul Joseph Watson in a YouTube video. The meme, and the allegations behind it, were entirely unfounded.
Back in February, we joined forces in this post to draw attention to the wide array of dangers to individuals and to society posed by advances in “deep fake” technology (that is, the capacity to alter audio or video to make it appear, falsely, that a real person said or did something). The post generated a considerable amount of discussion, which was great, but we understood we had barely scratched the surface of the issue.
Fake news is bad enough already, but something much nastier is just around the corner: As Evelyn Douek explained, the “next frontier” of fake news will feature machine-learning software that can cheaply produce convincing audio or video of almost anyone saying or doing just about anything.
Bobby Chesney and Danielle Citron have painted a truly depressing picture of a future in which faked video and audio cannot be distinguished from the real thing. And I think they are right to be depressed about it, though I want to discuss a possible technological solution that they did not address.
“We are truly fucked.” That was Motherboard’s spot-on reaction to deep fake sex videos (realistic-looking videos that swap a person’s face into sex scenes actually involving other people). And that sleazy application is just the tip of the iceberg. As Julian Sanchez tweeted, “The prospect of any Internet rando being able to swap anyone’s face into porn is incredibly creepy.”