

Reece Bithrey

The Trouble With Deepfakes - Should We Be Worried For The Future?


(Picture Credit - The Washington Post)

There's a chance you've seen the video released a couple of years ago in which Barack Obama appeared to call Donald Trump a "dipshit". Or maybe you've seen the one from before the last UK General Election, in which Boris Johnson endorsed Jeremy Corbyn and Corbyn endorsed Johnson. Frightening, isn't it? Well, this might just be the future of both video production and artificial intelligence - they're known as deepfakes.


Fundamentally, a deepfake is a video that has been altered to show someone saying something they never said, or something happening that never occurred. Most early deepfakes were pornographic - AI firm Deeptrace found around 15,000 deepfakes online in September 2019, and a ludicrous 96% of them were pornographic; of those, 99% mapped the faces of female celebrities onto the bodies of adult performers. The ones you've most likely seen, though, are satirical videos such as the 2018 Obama clip, or Mark Zuckerberg appearing to say that Facebook's true goal is to exploit its user base.

The way these videos work is the frightening part. A voice actor (say, an impressionist taking off Donald Trump) records the audio, and the target's face is then synthesised over the performer's, with thousands of pictures of Trump used to plant the US President's face atop that of the impressionist, mimicking his movements. As time has passed, the spoofs have become harder and harder to spot, but the dead giveaway is usually the static head movement of the famous person in question. The hair might not look quite right either, and the voice might not sound exactly perfect - voice actors can only do so much, after all.
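The face-mapping step is commonly built from a pair of autoencoders that share one encoder: the encoder learns "expression" common to both people, while each person gets their own decoder that paints their appearance back on. Swapping faces then just means decoding one person's expression code with the other person's decoder. As a loose, linear toy of that idea (synthetic "faces" as pixel vectors, not a real deepfake pipeline), here's a sketch in plain NumPy:

```python
import numpy as np

rng = np.random.default_rng(0)
latent_dim, pix, n = 4, 64, 200

# Shared "expression" directions: how a smile, a blink or a head turn
# moves the pixels. Both people share these.
mix = rng.normal(size=(latent_dim, pix))

# Each person's fixed appearance, kept orthogonal to the expression
# directions so identity and expression are separate factors.
raw = rng.normal(size=pix)
raw -= mix.T @ np.linalg.lstsq(mix.T, raw, rcond=None)[0]  # strip expression part
offset_a, offset_b = 5 * raw, -5 * raw

def film(offset):
    """Generate n 'frames' of a person: random expression + fixed identity."""
    z = rng.normal(size=(n, latent_dim))
    return z @ mix + offset

faces_a, faces_b = film(offset_a), film(offset_b)

# Shared encoder: recovers the expression code from any frame.
encode = lambda x: x @ np.linalg.pinv(mix)

def fit_decoder(faces):
    """Per-person decoder: least-squares map (with bias) from code to pixels."""
    z = encode(faces)
    design = np.hstack([z, np.ones((len(z), 1))])
    dec, *_ = np.linalg.lstsq(design, faces, rcond=None)
    return dec

dec_b = fit_decoder(faces_b)

# The swap: encode a frame of person A, decode it with B's decoder.
# The result keeps A's expression but wears B's appearance.
frame_a = faces_a[0]
fake = np.append(encode(frame_a), 1.0) @ dec_b
```

Real systems do the same trick with deep convolutional autoencoders trained on thousands of photos of each face, which is why the results are so much more convincing than this linear cartoon.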


Before long, there may be a deepfake of a world summit at which various leaders appear to make knee-jerk, controversial decisions - one so accurate that most people won't notice it's a fake. Technology like this can be used for both good and bad, and if the wrong people get hold of it, chances are it will be used for bad. As it spreads in the years to come and the processing power needed to create an ultra-convincing fake decreases, deepfakes will only become more common as they become easier to manufacture. It doesn't take much imagination to think of a situation that could tarnish a famous person's reputation: a future President might 'declare' war on a nation, or a television presenter might be 'filmed' making a racist outburst - the opportunities are endless. Sooner or later, an entire population will see a fake video and believe it's real. In a world where the authenticity of any clip can be doubted, the potential fallout will be unlike anything we've seen. The scale of influence such technology carries is scary.


The idea of superimposing one person's face onto another's is nothing new. Face Swap Live, an app launched in 2015 that swapped two people's faces in real time, gave the general public its first taste of a primitive form of the technology, and in the five years since, the fakes have only become more convincing. Layering one face over another in Photoshop is a five-minute job if you know what you're doing, and it's only a matter of time before deepfaking becomes the norm. Soon enough, giveaways like unnatural eye or head movement will be a thing of the past, and the fakes will be far harder to spot. People are trying to combat this already - they too are using AI, training it to spot fake videos of famous people and flag them as deepfakes.
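At its core, that detection effort is a classification problem: collect labelled real and fake frames, extract tell-tale cues, and train a model to separate them. The sketch below illustrates the idea with made-up stand-in features (the feature names in the comments are assumptions for illustration, not a real detector's inputs) and a plain logistic-regression classifier:

```python
import numpy as np

rng = np.random.default_rng(1)
n, d = 400, 16

# Synthetic per-frame features (think blink rate, head-pose jitter,
# blending artefacts around the face boundary). Real detectors learn
# such cues from labelled video; here we just simulate them.
real = rng.normal(size=(n, d))
fake = rng.normal(size=(n, d))
fake[:, :3] += 1.5  # fakes leave a slight statistical fingerprint

X = np.vstack([real, fake])
y = np.concatenate([np.zeros(n), np.ones(n)])  # 0 = real, 1 = fake

# A logistic-regression detector trained by plain gradient descent.
w, b = np.zeros(d), 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # predicted P(fake)
    w -= 0.5 * X.T @ (p - y) / len(y)
    b -= 0.5 * np.mean(p - y)

accuracy = np.mean(((1.0 / (1.0 + np.exp(-(X @ w + b)))) > 0.5) == y)
```

The cat-and-mouse problem is visible even in this toy: shrink that artificial fingerprint and the detector's accuracy falls towards a coin flip, which is exactly what happens as deepfakes improve.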


It's important to stress that whilst these videos can clearly be used in malicious ways, they can also be a force for good. Audio deepfakes that clone voices can restore a person's voice when it's lost to illness, improve dubbing on foreign-language films, or even bring back dead actors for starring roles - the late James Dean is due to 'star' in a new Vietnam War drama, Finding Jack.


It looks like deepfake technology is here to stay for the time being, and to be honest, who knows what's going to happen next? Maybe we'll get the world of malpractice that's been touted for the last few months, or maybe deepfakes will end up being a force for good and people will stop taking the mick. At this moment, your guess is as good as mine.
