Fakeout

September 20, 2019
HIGH-TECH To make a deepfake video, people use software to map a subject’s face. Then facial features can be moved around, like those of a puppet. This spring, artists circulated a deepfake of Facebook’s Mark Zuckerberg to see if people could tell it was false.
DAVID PAUL MORRIS—BLOOMBERG/GETTY IMAGES. PHOTO ILLUSTRATION BY STEPHEN BLUE FOR TIME FOR KIDS

Deepfakes are videos with visual or audio content that has been manipulated. They can make it seem as if the video's subject is saying words he or she hasn't actually spoken. Videos like these used to be made only by trained special-effects artists. Today, anyone with the right tools can make a convincing deepfake.

Deepfakes may be dangerous. People are worried they will be used to trick the public. Imagine if someone made a deepfake of a world leader or other powerful person. How would viewers know not to believe what they were seeing and hearing?

Wael Abd-Almageed wants to answer that question. He leads a team of five other researchers at the University of Southern California. Abd-Almageed and his team designed computer software that can determine whether a video is a deepfake. The software uses artificial intelligence to search the video for clues. The clues tell the team if the subject's face has been manipulated. "If there is inconsistency in the video, such as how the eyes and mouth move, we can spot it," Abd-Almageed told TIME for Kids.

Don’t Be Fooled

How can you avoid being fooled by a deepfake? Abd-Almageed has some advice. He says not to immediately trust a video that you see online. Research it first. He advises asking yourself, “Would this person actually say something like this?” Look into the video’s source, too. Who made it? Who posted it?

Abd-Almageed also says you should watch videos at a slower speed to spot inconsistencies. This is possible using the settings on most video platforms, such as YouTube. But he believes deepfakes will continue to become more advanced. “Every day, someone will create a better deepfake,” Abd-Almageed says. “We have to try to detect it.”