To cleanse the palate, does this seem familiar? It should.
It’s similar to this technology, which I wrote about two years ago. Researchers in that case were able to record facial expressions and mouth movements from a random person and sync them up via software to known footage of a famous person, like Barack Obama. Result: Obama would appear to make the same expressions and mouth movements as the random person, to a degree of verisimilitude almost indistinguishable from real video footage of Obama himself. If you could do the same thing with audio, replicating a known person’s voice, you could conceivably create *very* convincing A/V footage of anyone saying or doing anything you want. And once you can do that, the era of “fake news” will finally arrive in full flower.
If we’ve known about this for two years, though, what’s so interesting about the Alec Baldwin/Trump clip above? Ah, well, the Obama clip in my 2016 post was generated by a research team that included techies at Stanford, the Max Planck Institute, and the University of Erlangen-Nuremberg. The Baldwin/Trump clip was generated by … a guy on reddit using his home computer.
And so the era of “fake news” has arrived at last:
In December, Motherboard discovered a redditor named ‘deepfakes’ quietly enjoying his hobby: Face-swapping celebrity faces onto porn performers’ bodies. He made several convincing porn videos of celebrities—including Gal Gadot, Maisie Williams, and Taylor Swift—using a machine learning algorithm, his home computer, publicly available videos, and some spare time…
Another user, called ‘deepfakeapp,’ created FakeApp, a user-friendly application that allows anyone to recreate these videos with their own datasets. The app is based on deepfakes’ algorithm, but deepfakeapp created FakeApp without the help of the original deepfakes. I messaged deepfakes, but he didn’t respond to a request for comment on the newfound popularity of his creation…
“I think the current version of the app is a good start, but I hope to streamline it even more in the coming days and weeks,” he said. “Eventually, I want to improve it to the point where prospective users can simply select a video on their computer, download a neural network correlated to a certain face from a publicly available library, and swap the video with a different face with the press of one button.”
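The reporting above only gestures at how the face swap actually works. The widely described approach behind the deepfakes videos is a pair of autoencoders that share a single encoder but keep separate decoders, one per identity: encode a frame of person A, then decode it with person B's decoder, and you get A's pose and expression rendered with B's face. The snippet below is a toy NumPy sketch of that swap mechanic with untrained random weights, purely to illustrate the data flow; all the names and sizes here are illustrative assumptions, not from the article or the actual app.

```python
import numpy as np

rng = np.random.default_rng(0)

D, H = 64 * 64, 128  # flattened grayscale face size, latent code size

# One shared encoder; one decoder per identity (trained separately in the real system).
W_enc = rng.normal(scale=0.01, size=(D, H))
W_dec_a = rng.normal(scale=0.01, size=(H, D))  # would reconstruct person A's face
W_dec_b = rng.normal(scale=0.01, size=(H, D))  # would reconstruct person B's face

def encode(face):
    # Shared encoder: compress a face into a pose/expression code.
    return np.tanh(face @ W_enc)

def decode(latent, W_dec):
    # Identity-specific decoder: render the code as that person's face.
    return latent @ W_dec

def swap_a_to_b(face_a):
    """Encode a frame of A, decode with B's decoder: A's expression on B's face."""
    return decode(encode(face_a), W_dec_b)

face_a = rng.normal(size=(D,))   # stand-in for one flattened video frame of A
swapped = swap_a_to_b(face_a)    # same-shaped frame, now "wearing" B's face
print(swapped.shape)
```

With trained weights, running every frame of a clip through `swap_a_to_b` is essentially the one-button swap the FakeApp author describes aspiring to; the "neural network correlated to a certain face" in his quote corresponds to a decoder like `W_dec_b`.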
Follow the link and have a look at the short (safe for work) GIF of Daisy Ridley’s face on a porn star’s body. This technology is still in its infancy and the verisimilitude is already *that* good. Here’s a side-by-side comparison of what Disney was able to do in re-creating a young Carrie Fisher for “Rogue One” and what a skilled hobbyist was able to do with the deepfakes technology. Disney is on top, the fake is on the bottom:
At least they haven’t perfected replicas of human voices to pair with the fake video, right? Well, hold on — as Julian Sanchez recently noted, they’re working on that too. For the wealthy and powerful, it’s a double-edged sword. They’ll overwhelmingly be the prime targets of hobbyist fake-video projects, but as the quality of each fake improves, the credibility of each real scandalous image and soundbite will decline. Imagine if the “Access Hollywood” tape appeared in, say, 2022, with Trump in his second term and this technology more ubiquitous. Who would believe it? To some extent, the powerful will be scandal-proof. And meanwhile governments, with their resources, will be able to produce the highest-quality fakes of whatever they want to promote. God help us.
One last clip for your amusement. A technology that places Nicolas Cage in increasingly ridiculous situations can’t be all bad.