The Dangers and Benefits of Deepfake Advancements

by Vendela Krenkel ’20

For years, doctored images and videos have been a concern for personal and national safety, but with the development and popularization of deepfake programs, the potential for catastrophic consequences has increased dramatically. Today, anyone can download the code required to create altered videos of whomever they want, whenever they want.

Deepfake technology can be used in the name of fun. Free apps like Sway and Everybody Dance Now! can take a video of a user standing still and create a clip of the person dancing. Some social media influencers have taken the edge off their audiences’ apprehension toward this technology by using the most popular programs to doctor videos of their peers or favorite celebrities in silly, unrealistic contexts. YouTuber Dr Fakenstein posted a video of an Oprah interview with Mike Tyson’s face swapped in.

However, concerns arise about manipulated media “being used to disrupt democracies, sow civil unrest, revenge porn, [disinformation], and so on and so forth,” as digital authentication expert Hany Farid bluntly explains.

Clips of Donald Trump giving his son-in-law a money-laundering tutorial, of Trump announcing that “AIDS is over,” and of Barack Obama cursing in a public service announcement about deepfakes have all gone viral, yet every one of them was doctored. Ragavan Thurairatnam, chief of machine learning at the tech company Dessa, said he is confident deepfakes will have an impact on the 2020 election.

All a person needs to create a deepfake is the code, footage of the target’s face, a base video of someone in the intended compromising position or location, a voiceover that sounds similar to the target, and time. A basic deepfake takes the program about 24 hours to produce, but a more convincing product may take over two weeks. Although the technology is not there yet, Farid worries about the potential for a simple program to be used for career and social sabotage. Some content creators post deepfakes of a celebrity’s face on an adult star’s body to pornographic websites. The traction and revenue these videos generate are indicative of a shift in the industry’s ethics toward moral ambiguity.
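For readers curious what “the code” mentioned above actually involves, the following is a minimal, hypothetical sketch in Python (using PyTorch) of the shared-encoder, two-decoder idea behind classic face-swap deepfakes: a single encoder learns features common to both faces, each person gets a dedicated decoder, and a swap is produced by encoding person A’s face and decoding it with person B’s decoder. Everything here is an illustrative assumption: the tiny network, the random stand-in data, and the names are not drawn from any program named in this article, and a convincing result would require far larger models, aligned face crops, and the hours or weeks of training described above.

import torch
import torch.nn as nn

class FaceSwapAutoencoder(nn.Module):
    """Toy version of the shared-encoder / per-identity-decoder idea."""

    def __init__(self, latent_dim=256):
        super().__init__()
        # Shared encoder: compresses a 64x64 RGB face crop into a latent vector.
        self.encoder = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 64 * 3, 1024), nn.ReLU(),
            nn.Linear(1024, latent_dim), nn.ReLU(),
        )
        # One decoder per identity; both reconstruct faces from the same
        # latent space, which is what makes the swap possible.
        self.decoder_a = self._make_decoder(latent_dim)
        self.decoder_b = self._make_decoder(latent_dim)

    @staticmethod
    def _make_decoder(latent_dim):
        return nn.Sequential(
            nn.Linear(latent_dim, 1024), nn.ReLU(),
            nn.Linear(1024, 64 * 64 * 3), nn.Sigmoid(),
        )

    def forward(self, faces, identity):
        latent = self.encoder(faces)
        decoder = self.decoder_a if identity == "a" else self.decoder_b
        return decoder(latent).view(-1, 3, 64, 64)


model = FaceSwapAutoencoder()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.MSELoss()

# Random tensors stand in for aligned face crops of persons A and B.
faces_a = torch.rand(8, 3, 64, 64)
faces_b = torch.rand(8, 3, 64, 64)

# Training alternates between identities so the encoder stays shared.
for step in range(100):
    for faces, identity in ((faces_a, "a"), (faces_b, "b")):
        reconstruction = model(faces, identity)
        loss = loss_fn(reconstruction, faces)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

# The "swap": encode person A's face, then decode it with person B's decoder.
with torch.no_grad():
    swapped = model(faces_a, "b")
print(swapped.shape)  # torch.Size([8, 3, 64, 64])

In this toy setup the swap only works because both decoders share one latent space; real tools scale the same trick up with convolutional networks and thousands of frames of training footage.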

Few guidelines are in place to keep misleading content off the sites that host adult films, and because deepfakes can be construed as parody, it can be difficult to pursue legal action against their creators. Currently, only four states regulate deepfakes through legislation, with California taking the lead in providing a route of action for victims of malicious deepfake audio or video. Even without such laws, victims can seek recourse through a civil suit if they can prove that the deepfake violated copyright or that its creator extorted or harassed them.