Facebook Cracks Down on Deepfake Videos Ahead of 2020 US Election

Social network will remove videos that include edits “that aren’t apparent to an average person and would likely mislead someone”

Facebook is cracking down on deepfakes and will more aggressively remove manipulated videos, the company said Monday night.

“While these videos are still rare on the internet, they present a significant challenge for our industry and society as their use increases,” Facebook VP Monika Bickert wrote in a blog post.

Deepfakes are videos that use faked or out-of-context images to depict a fictional event as real. They often superimpose one person’s face onto another person’s body; other times, someone’s lips and voice are manipulated to make it look like they’re saying something they never actually said.

Facebook will implement a new policy for judging edited content, determining whether a video is removed based on two factors:

— If a video has been edited “beyond adjustments for clarity or quality – in ways that aren’t apparent to an average person and would likely mislead someone into thinking that a subject of the video said words that they did not actually say,” according to Bickert.

— If the video is the product of artificial intelligence or machine learning that makes it “appear to be authentic,” Bickert added.

The new rules do not completely ban deepfakes from Facebook, however. The company will allow content that is “parody or satire, or video that has been edited solely to omit or change the order of words” to remain on its platform, according to Bickert.
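Read as Bickert describes it, removal requires both factors to hold, and the parody, satire and word-omission carve-outs take precedence. Here is a minimal sketch of that decision rule in Python; the `Video` fields are invented for illustration and are not Facebook’s actual moderation model:

```python
# Hypothetical sketch of the removal rule as described in Bickert's
# post; the data model is invented purely for illustration.
from dataclasses import dataclass


@dataclass
class Video:
    misleadingly_edited: bool          # edited beyond clarity/quality tweaks,
                                       # in ways an average person wouldn't spot
    ai_made_authentic: bool            # AI/ML output made to appear authentic
    is_parody_or_satire: bool
    only_omits_or_reorders_words: bool


def should_remove(v: Video) -> bool:
    # Exemptions override: parody, satire and simple word cuts stay up.
    if v.is_parody_or_satire or v.only_omits_or_reorders_words:
        return False
    # Otherwise, removal requires both factors to be met.
    return v.misleadingly_edited and v.ai_made_authentic
```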

Facebook’s decision comes ahead of the 2020 U.S. election and days before Bickert is set to appear before Congress. Last May, Facebook’s critics called for the company to remove a doctored video of House Speaker Nancy Pelosi slurring her words; the company “downranked” the clip, curtailing how far it spread on users’ news feeds, but did not remove the video.

So far, political deepfakes haven’t been a major issue. But that could change in the years ahead, as it becomes more difficult to determine whether a video has been edited. Right now, most deepfakes still have something a bit off about them: the person’s cadence might be a bit clunky, or their movements a little robotic.

Siwei Lyu, head of the computer vision and machine learning lab at the University at Albany, warned in an interview with TheWrap last year that there are a few “visual artifacts” viewers can use to spot deepfakes. The first hint? The video subjects “do not blink very much,” Lyu said. If the subject consistently goes 10 seconds or more without blinking, it’s likely a deepfake. Another hint is that the subjects in the video are rarely able to turn side to side. (Both are technological hangups stemming from the process of copying one person’s face onto someone else’s.)
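That blink heuristic is simple enough to sketch in code. The following is a rough illustration, not Lyu’s actual detector: it assumes OpenCV and MediaPipe’s face mesh as the toolchain, uses the standard eye-aspect-ratio blink measure, and picks illustrative thresholds (an EAR below 0.21 counts as closed eyes; any no-blink stretch over 10 seconds is flagged, per Lyu’s rule of thumb).

```python
# Hypothetical sketch: flag videos whose subject goes unusually long
# without blinking. Landmark indices and thresholds are illustrative.
import cv2
import mediapipe as mp

# Six landmarks around each eye, ordered p1..p6 for the
# eye-aspect-ratio (EAR) formula.
LEFT_EYE = [362, 385, 387, 263, 373, 380]
RIGHT_EYE = [33, 160, 158, 133, 153, 144]
EAR_CLOSED = 0.21       # assumed threshold: below this, eyes are "closed"
MAX_GAP_SECONDS = 10.0  # Lyu: 10+ seconds without a blink is suspicious


def ear(pts):
    """Eye aspect ratio: drops sharply when the eye closes."""
    def dist(a, b):
        return ((a.x - b.x) ** 2 + (a.y - b.y) ** 2) ** 0.5
    p1, p2, p3, p4, p5, p6 = pts
    return (dist(p2, p6) + dist(p3, p5)) / (2.0 * dist(p1, p4))


def longest_blink_gap(video_path):
    """Return the longest stretch (in seconds) with no detected blink."""
    cap = cv2.VideoCapture(video_path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 30.0
    face_mesh = mp.solutions.face_mesh.FaceMesh(max_num_faces=1)
    frames_since_blink, worst = 0, 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        result = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if result.multi_face_landmarks:
            lm = result.multi_face_landmarks[0].landmark
            avg = (ear([lm[i] for i in LEFT_EYE]) +
                   ear([lm[i] for i in RIGHT_EYE])) / 2.0
            if avg < EAR_CLOSED:   # eyes closed: count as a blink
                frames_since_blink = 0
            else:
                frames_since_blink += 1
        worst = max(worst, frames_since_blink)
    cap.release()
    return worst / fps


if __name__ == "__main__":
    gap = longest_blink_gap("clip.mp4")
    verdict = "suspicious" if gap > MAX_GAP_SECONDS else "normal"
    print(f"Longest no-blink stretch: {gap:.1f}s ({verdict})")
```

A real forensic tool would combine many such cues; as the next paragraph notes, any single artifact like blink rate tends to disappear as the generation technology improves.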

Soon, though, those blemishes will be wiped away as editing technology continues to improve. Once deepfakes make it through the “uncanny valley,” the point where viewers can no longer detect their inauthenticity just by watching, a new, more powerful brand of fake news could become widespread.

“You now have an emerging technology that has a much greater capacity for persuasion,” Robert Chesney, a law professor at the University of Texas, told TheWrap.

Hollywood has already seen the disturbing ways deepfakes can be used. In recent years, a cottage industry has emerged in which the faces of stars like Gal Gadot and Daisy Ridley are swapped into pornographic videos. Some of these videos have racked up millions of views.

One bogus video falsely described as “leaked” footage of Scarlett Johansson has been viewed more than 1.5 million times on a popular porn site.

Johansson bleakly called it a futile battle. “Nothing can stop someone from cutting and pasting my image or anyone else’s onto a different body and making it look as eerily realistic as desired,” Johansson told The Washington Post in late 2018. “The fact is that trying to protect yourself from the Internet and its depravity is basically a lost cause.”