Video Is No Longer Proof

Hollywood movies dazzle us with special effects, creating imaginary and impossible worlds, events, characters, and viewing perspectives. It’s a good thing we adults can walk away from horror movies like “Alien: Covenant” secure in the knowledge that the hyper-realistic monster we just saw is a fiction drawn up by a team of digital animators who spent hundreds of hours on each scene. Because we know we’re watching a movie, we know not to believe what we’re seeing. It’s a lesson we have to teach children.

What happens when technology advances to the point that producing convincing digital video no longer takes hundreds of hours or years of expertise? What happens when software can create a video of any person saying anything?

The Tech

Programmers at the University of Washington have developed just such a program. It can transform existing video of a person speaking into a new video of that person saying something completely different, using artificial intelligence to make the mouth movements exactly match spliced-in audio.
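To get an intuition for what "matching mouth movements to audio" means, here is a deliberately simplified toy sketch. The real University of Washington system trains a neural network on many hours of footage to predict full lip shapes and textures; this sketch only uses the loudness of the audio as a crude stand-in for mouth openness, one value per video frame. Every function and number here is illustrative, not part of the actual system.

```python
# Toy sketch of an audio-driven lip-sync pipeline (illustrative only;
# the real system uses a trained neural network, not this heuristic).
import math

def audio_to_mouth_openness(samples, sample_rate, fps=30):
    """Map an audio waveform to one mouth-openness value per video frame.

    Real systems predict full mouth shapes (lip landmarks and texture);
    here we just use the loudness envelope: louder audio -> wider mouth.
    """
    samples_per_frame = sample_rate // fps
    openness = []
    for start in range(0, len(samples) - samples_per_frame + 1, samples_per_frame):
        frame = samples[start:start + samples_per_frame]
        rms = math.sqrt(sum(s * s for s in frame) / len(frame))
        openness.append(min(1.0, rms * 4))  # clamp to [0, 1]
    return openness

# Fabricate one second of "speech": a half-second tone, then silence.
rate = 8000
audio = [0.5 * math.sin(2 * math.pi * 220 * t / rate) for t in range(rate // 2)]
audio += [0.0] * (rate // 2)

frames = audio_to_mouth_openness(audio, rate)
print(len(frames))             # 30 frames for one second at 30 fps
print(frames[0] > frames[-1])  # mouth open during sound, closed in silence
```

A production system would replace the loudness heuristic with a model trained to map audio features to lip landmarks, then render and composite a photorealistic mouth region into each frame of the target video.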

This announcement came shortly after a Canadian startup developed AI that can recreate any person’s voice from sample clips, producing a convincing recording of that person saying anything.

Both teams promoted their inventions with custom-made audio and video of political leaders, as if to highlight – or perhaps sell – the political implications of this technology. The Canadian startup Lyrebird created speeches by Donald Trump, Barack Obama, and Hillary Clinton. The University of Washington team created a series of videos of Barack Obama staring into the camera, saying words put in his mouth by the programmers.

Watch or listen to the clips produced by the software. Could you tell they’re fabrications?

Together, these programs could fabricate an entire presidential speech. Why might someone do that?

Implications for Elections

In the past, the difficulty of fabricating audio or video was reason enough to treat recordings as authentic.

The Watergate tapes, for example, were considered smoking-gun evidence of President Richard Nixon’s involvement in the Watergate scandal. No one at the time could seriously imagine them being modified or falsified, although a minor scandal did break out when a secretary purportedly erased eighteen and a half minutes of conversation about the break-in by accident. The assumed authenticity of tape recordings brought down a president.

Similarly, when video surfaced of Donald Trump bragging about his ability to sexually assault women because of his fame, he did not bother to deny the video’s authenticity, and instead apologized. He had no choice. It was on video.

Will that be the case when a similarly incriminating video emerges in the 2020 election?

Soon, any political leader recorded saying or doing something offensive may point to the availability of commercial software that could be used by his or her political opponents to create a fake recording. This sort of excuse will be plausible by then, no matter how far-fetched it seems now.

Imagine it’s 2020 and news breaks that a live microphone recorded a candidate muttering a racial slur. The candidate claims the audio was faked by supporters of their opponent. In a world of off-the-shelf audio fabrication software, is there any reason to believe one side or the other?

Perhaps the falsification will be more subtle. Perhaps a partisan media outlet or a political party’s own editors will modify one or two words in a speech to make it mean something different. Perhaps arguments will break out between rival news outlets or social media companies over which video was the original.

In any case, the result will be a lot of confused voters. It’s the voters who don’t know they are confused who are most concerning.

Don’t Be Misled

Video and audio fabrication software will be commercially available to political organizations and special interest groups within months.

Eventually, you’ll have a cell phone app that can fabricate videos of your friends doing or saying sketchy things, just like we use Snapchat today to put dog noses on our selfie videos.

The way we think about video or audio will never be the same.

We can expect this technology to be deployed in our next election. The question is, will we believe our own eyes and ears out of habit? How do we shift toward understanding that everything we see or hear might have been fabricated? Research already indicates people are horrible at recognizing altered photos.

How do we avoid being duped in this future of faked scandals and plausible deniability?

We must make the same sort of mental shift we make while watching a horror movie. When we know we’re watching fiction, we think differently about what we’re seeing.

Imagine bringing a person from 100 years ago into our time, and having them watch a modern horror film. Without cultural inoculation or awareness of technological developments, they would be traumatized. The electorate is probably in a similar position. Most of us still trust audio and video, but technology has changed faster than our mindsets.

Social media will soon be bombarded with shocking viral videos that are fabricated.

The next time a video emerges of a political candidate stuffing cash into a suitcase, having a sexual affair, or smothering a kitten, we should understand that we are probably watching a partisan’s video creation, and somehow not let that image affect our perceptions of the candidate.

If there is a bright side to all this, it may be that increasing numbers of people will realize all they have to do is read the candidates’ positions on the issues and vote accordingly. You can ignore all the rest!

Eventually, the Internet might become completely discredited as a source of information. Something – perhaps a resurgent and transformed mainstream media – will have to fill the credibility void without itself being vulnerable to fabrication.

We had better graduate to this level of democratic maturity before our elections become a horror movie.