Russia and the "deep fake" videos

Foreign election meddling is back in the news again as we grind our way into the thick of the 2020 election cycle. As you might expect, the fingers are pointing at Russia in general and Vladimir Putin in particular. But this time around they allegedly won’t just be setting up bogus Twitter and Facebook accounts to rile people up with red herring stories. The Russians are supposedly working on “deep fake” videos which can make pretty much any politician say and do whatever you like. (Washington Times)

U.S. leaders say Vladimir Putin used a familiar cyber playbook to “muck around” in the midterm elections last month, but intelligence officials and key lawmakers believe a much more sinister, potentially devastating threat lies just down the road — one that represents an attack on reality itself.

Policy insiders and senators of both parties believe the Russian president or other actors hostile to the U.S. will rely on “deep fakes” to throw the 2020 presidential election cycle into chaos, taking their campaign to influence American voters and destabilize society to a new level.

The eerie process, which relies on cutting-edge, deep-learning algorithm technology, produces high-quality audio and video of individuals saying things they never said or doing things they never did.

My first question was whether anyone with even a room-temperature IQ would actually be fooled by a fake video of a presidential candidate. Turns out I'm well behind the times on this one. The software to create deep fakes is everywhere, and people can master the required skills fairly easily. You can probably guess where this software first sprang up, right? It came out of online pornography, where people were pasting the faces of famous people onto the bodies of porn actors.

So how good are these videos? Gizmodo did a deep dive on deep fakes last summer. This short video shows you how even free software packages can make the transformation happen.

That’s kind of scary, but this should still be fairly easy to shoot down, right? All of this technology relies on having real source video to work from. If a video pops up of one of the candidates saying something completely shocking or out of character, your first instinct should be to question its authenticity. And because the perpetrators have to work from a real, original video to produce the fake, finding that original exposes the hoax.

Can this type of scheme actually be used to tamper with our elections? I suppose there are deep corners of social media where some people will believe anything they see if it supports their preconceived notions, and such things might be passed around in email chains. But I would certainly hope that they’re not going to make it into mainstream reporting.
