Ramifications of Deepfakes go beyond porn

Fake news has been the buzzword of the past few years, but a recent technical advancement has the power to elevate the issue to a whole new level. Unfortunately, the only term to describe this new process at the moment is Deepfakes. A fair warning: searching for this word will turn up a slew of articles on, and potentially examples of, its main use so far: pornography.

Deepfakes, simply put, work by mapping the face of one person onto the body of another. The technology to accomplish this has long been available to those with money and resources, but recently a program was released that lets anyone with a moderately powerful computer create these videos. The program analyzes images of the two faces, eventually learning how to swap them with decent accuracy, placing a face into another scene with attention to lighting, color and position.
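To make the idea concrete, here is a toy sketch of the paste-and-color-match step described above. This is purely illustrative: the real program learns the swap with a neural network trained on thousands of images, and the function and bounding-box format here are invented for the example.

```python
import numpy as np

def swap_face(source, src_box, target, tgt_box):
    """Paste the face region of `source` into `target`, crudely matching
    its average color to the destination (a stand-in for the lighting and
    color matching the real Deepfakes software learns automatically).
    Boxes are (row, col, height, width); images are uint8 RGB arrays."""
    sy, sx, sh, sw = src_box
    ty, tx, th, tw = tgt_box
    face = source[sy:sy + sh, sx:sx + sw].astype(float)
    # Nearest-neighbor resize of the face to fit the target box.
    rows = np.arange(th) * sh // th
    cols = np.arange(tw) * sw // tw
    face = face[rows][:, cols]
    # Shift the face's mean color toward the destination region's.
    dest = target[ty:ty + th, tx:tx + tw].astype(float)
    face += dest.mean(axis=(0, 1)) - face.mean(axis=(0, 1))
    out = target.copy()
    out[ty:ty + th, tx:tx + tw] = np.clip(face, 0, 255).astype(np.uint8)
    return out
```

Even this crude version shows why the results can be persuasive: once the pasted face takes on the scene's lighting and color, the seams are what give it away, and that is exactly what the learned version smooths over.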

First gaining traction on Reddit before being banned there, the Deepfakes “community” has primarily worked on putting the faces of celebrities onto the bodies of porn actresses. Many speculate that the process can be extended to average people as well. The fear ringing through many commentators and outlets concerns consent and revenge: a bitter ex-partner or creep could easily gather footage and images of a target to create realistic revenge porn, humiliating or incriminating their victim.

As real as these fears are in the short term, I fear the broader implications of this technology for information and news. As some early jokes show - Donald Trump's face placed onto Hillary Clinton, for example - the Deepfakes process works just as well on politicians as on celebrities. While most creators are preoccupied with porn for now, what happens once they turn their eyes to more serious endeavors?

For now, Deepfakes are impressive but not quite convincing. In the Nicolas Cage fakes that have also attracted much interest, it is obvious that Nicolas Cage is not actually acting in the movies his face has been transplanted into. But given a strong enough political or financial motive - or just a person with too much time on their hands - it would not be difficult to pair a realistic voice, script and set with a fake of someone saying something they never said. With time, the program itself - already in its second iteration - is sure to be refined into producing more stable and convincing masks.

With enough time, source material and processing power, it will soon be possible to make Donald Trump or anyone else say practically anything. An individual looking to cause panic could have Kim Jong Un claim to be bombing the United States. An organization looking to actually incite something could make local politicians announce fake policies. The goal for both would be to get the media to pick up the stories, spreading fake news far and wide.

There is no real way to combat this inevitability. The technology is coming quickly, and there will be people looking to abuse it. The only hope is that Deepfakes will eventually become common enough to promote better habits around fake news. If it were well known that anyone can create false interviews or announcements, people would hopefully turn only to reliable, reputable sources for news. That is an optimistic and distant hope, though. For the foreseeable future, Deepfakes represent a legitimate threat to truth, one that can only be dealt with using vigilance, perception and research.


Peter Fenteany is a weekly columnist for The Daily Campus. He can be reached via email at peter.fenteany@uconn.edu.