
Dealing with Video Deepfakes in the Era of Artificial Intelligence

Artificial Intelligence (AI), like most new technologies, cuts multiple ways. Its benefits are enormously helpful in video and audio editing applications, as well as in still photography, search engines and even tedious tasks like audio transcription.

But, like most new technology, AI has a negative side. The most apparent today is the deepfake: altered video and audio that can be used maliciously.

AI-generated fake videos are becoming more common because they are now easier to create. If the technology continues to improve rapidly, there is no doubt that more scammers and propagandists will spread fake imagery the way they spread spam email.

Deepfakes allow a subject's face to be swapped onto another body. These very realistic fake characters can be made to say anything, sounding exactly like the genuine voice of the real person. Not all such videos are made with malicious intent. Some are just memes, designed to be witty.

But AI technology is improving fast; it seems to advance almost daily. As the technology gets good enough that fact and fiction become indistinguishable, the implications should scare all of us.

Think of the damage a fake video of the President of the United States declaring an imaginary war could do. It could easily happen, and millions of people would no doubt believe it. AI could make the famous War of the Worlds radio broadcast, which scared the nation more than 80 years ago, look harmless in retrospect.

In fact, anyone can be made into a believable character saying anything desired. The potential for societal damage with such a powerful technology is downright frightening.

Creating truly deceptive deepfakes is still a painstaking task done well only by experts. But with AI's fast-moving development, members of the general public will soon be able to make professional-quality fakes with ease.

Mark Zuckerberg – AI Mapping

The creation of a deepfake requires machine learning. The machine has to process hours of real-life video footage of a person from different angles and in different lighting conditions. Once the machine has a complete model of how the person looks, the deepfake creator uses computer graphics to superimpose that face onto a different body. The image is then tweaked to make it look convincingly realistic.
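As a rough illustration of that final superimposition step, here is a minimal Python sketch that alpha-blends an aligned face crop into a target frame using a soft mask. Everything here, from the function name to the array shapes, is an assumption for illustration; real deepfake pipelines replace this naive blend with the output of learned autoencoder networks.

```python
import numpy as np

def superimpose_face(target_frame, source_face, mask, top_left):
    """Alpha-blend an aligned source face crop into a target video frame.

    target_frame: (H, W, 3) uint8 image, the body the face is pasted onto
    source_face:  (h, w, 3) uint8 crop of the generated face
    mask:         (h, w) float alpha mask in [0, 1], soft at the edges
    top_left:     (row, col) where the crop lands in the target frame
    """
    out = target_frame.astype(np.float64)
    r, c = top_left
    h, w = mask.shape
    alpha = mask[..., None]  # add a channel axis so it broadcasts over RGB
    region = out[r:r + h, c:c + w]
    # keep the target where alpha is 0, use the new face where alpha is 1
    out[r:r + h, c:c + w] = alpha * source_face + (1.0 - alpha) * region
    return out.astype(np.uint8)

# toy usage: paste a 64x64 face crop into an empty 480x640 frame
frame = np.zeros((480, 640, 3), dtype=np.uint8)
face = np.full((64, 64, 3), 200, dtype=np.uint8)
soft_mask = np.clip(1.0 - np.hypot(*np.mgrid[-1:1:64j, -1:1:64j]), 0.0, 1.0)
result = superimpose_face(frame, face, soft_mask, (200, 288))
```

The soft mask is what makes or breaks the illusion: a hard-edged paste produces exactly the seams and blending artifacts described later in this article.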

Harrison Ford, the 80-year-old actor, recently had his face digitally de-aged by a quarter of a century to play Indiana Jones in the upcoming film, Dial of Destiny. In the film he looks as he did in his mid-50s.

“They have this artificial intelligence program that can go through every foot of film (of my face) that Lucasfilm owns…including film that wasn’t printed,” Ford said in a television interview with Stephen Colbert. “Then they put little dots on my face and I say the words and they make it [happen]. It’s fantastic.”

How can you spot most deepfakes today? With most amateur jobs, the first signs are facial discoloration or strange-looking artifacts where the face meets the subject's neck and hair. Sometimes it is very hard to tell, but by looking closely one can see whether the skin precisely matches the lighting.

In video, eye blinking can be a giveaway. People normally blink once every two to ten seconds, and deepfakes often don't blink normally. Count the blinks and see whether the rate falls within the human range.
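That blink check is simple enough to express in a few lines. Here is a hedged Python sketch; it assumes you already have blink timestamps, counted by hand or produced by some eye-tracking step that is not shown, and simply tests whether the rate falls in the roughly one-blink-every-2-to-10-seconds human range.

```python
def blink_rate_plausible(blink_times_s, clip_duration_s):
    """Return True if the observed blink rate falls within the typical
    human range of roughly one blink every 2 to 10 seconds."""
    if clip_duration_s <= 0:
        raise ValueError("clip duration must be positive")
    blinks_per_second = len(blink_times_s) / clip_duration_s
    return (1 / 10) <= blinks_per_second <= (1 / 2)

# 3 blinks over a 30-second clip is 0.1 blinks/s, within the human range
print(blink_rate_plausible([4.2, 13.8, 27.5], 30.0))  # True
# no blinks at all over 30 seconds is suspicious
print(blink_rate_plausible([], 30.0))                 # False
```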

Audio can also signal a deepfake video. Sometimes the sound and picture are not in sync. Try to find the same video online, since the deepfake was probably created from footage already on the internet.

Today, the best ways to spot a deepfake are to look for facial discoloration, unrealistic eyes and teeth, an unnatural rate of eye blinking, strange lighting, audio that doesn't sync with the video, blending of the face that looks off and fuzziness where the face meets the hairline.
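Those signs can be treated as a simple checklist. The sketch below, whose sign names and equal weighting are purely illustrative assumptions, turns the list into a rough suspicion score; a real detector would weight the cues and learn them from data.

```python
# the telltale signs listed above, given short names for illustration
SIGNS = {
    "facial_discoloration",
    "unrealistic_eyes_or_teeth",
    "abnormal_blink_rate",
    "strange_lighting",
    "audio_video_out_of_sync",
    "off_face_blending",
    "fuzzy_hairline",
}

def suspicion_score(observed):
    """Fraction of checklist signs observed in a clip, from 0.0 to 1.0."""
    return len(set(observed) & SIGNS) / len(SIGNS)

# a clip showing two of the seven signs scores about 0.29
print(suspicion_score({"abnormal_blink_rate", "fuzzy_hairline"}))
```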

But these are today's amateur defects. As the technology improves, such flaws will disappear and the fakes will become far more precise. When the human eye can no longer tell the difference, the threat becomes far more ominous.

The controversy now surrounding deepfakes is a good thing. We should be worried. While this is a great technology that can ease and improve many media-making tasks, its downside could be disastrous for society as a whole.

We have always trusted what we see; even when a figure on screen is lying, most of us sense it subliminally. But we will have to adjust to a coming flood of faked imagery, especially on social media. How will we know what to believe?

There needs to be a better way to easily detect deepfakes and to label them for what they are. The technology that created this problem must also find a way to combat its misuse.

At the same time, schools need to teach media literacy skills that help students distinguish between what is real and what is not. And we need laws to help prevent the misuse of deepfakes.

We are entering an era in which what we see and hear may not be real. False information about any subject may be amplified. What will happen in the long run, nobody knows.

Writer at Broadcast Beat
Frank Beacham is a New York-based writer, director and producer who works in print, radio, television, film and theatre.

Beacham has served as a staff reporter and editor for United Press International, the Miami Herald, Gannett Newspapers and Post-Newsweek. His articles have appeared in the Los Angeles Times, the Washington Post, the Village Voice and The Oxford American.

Beacham's books, Whitewash: A Southern Journey through Music, Mayhem & Murder and The Whole World Was Watching: My Life Under the Media Microscope, are currently in print. Two of his stories are being developed for television.

In 1985, Beacham teamed with Orson Welles over a six-month period to develop a one-man television special, Orson Welles Solo. The project was canceled after Welles died on the day principal photography was to begin.

In 1999, Beacham was executive producer of Tim Robbins' Touchstone feature film, Cradle Will Rock. His play Maverick, about his video work with Orson Welles, was staged off-Broadway in New York City in 2019.