Deepfakes: Will we ever re-establish our trust in videos?

Paris Theodorou is a solicitor at Saunders Law. Abigail Bright is a barrister at Doughty Street Chambers.


As lawyers we must make sure we are well informed and ready to tackle one of the greatest threats to our democracy – and profession. Videos and audio recordings, unless shown to be doctored, were once valued as independent and objective evidence. This is no longer the case. Deepfake technology can be used to make it appear as though a person is saying things in a video which they have never said. It's when a picture is not worth a thousand words.

What is a deepfake?

Deepfakes (from 'deep learning' and 'fake') are convincing fake videos or audio recordings that look and sound just like the real thing. A person in an existing image or video is replaced with someone else's likeness using artificial neural networks. Or, artificial intelligence (AI) facial and sound recognition is used to generate synthetic video to make it appear as though a real person is saying things which they have never said.

Deepfake technology can recognise and reproduce the natural cadences in human speech. AI algorithms are used to predict facial movements and are able to depict the sound of the human voice.
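To make the mechanism concrete: many face-swap deepfakes rest on a simple architectural trick, a single shared encoder that compresses any face into a compact code, paired with a separate decoder per person. Swapping is then just routing one person's code through the other person's decoder. The sketch below is a deliberately toy illustration of that data flow in plain Python with untrained random weights; it is not a working deepfake system, and all names and sizes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "faces": flattened 8x8 grayscale images (64 values each).
DIM, LATENT = 64, 16

# One shared encoder compresses any face into a small latent code...
W_enc = rng.normal(scale=0.1, size=(LATENT, DIM))

# ...but each person gets their own decoder which, in a real system,
# is trained to reconstruct only that person's face from the code.
W_dec_A = rng.normal(scale=0.1, size=(DIM, LATENT))
W_dec_B = rng.normal(scale=0.1, size=(DIM, LATENT))

def encode(face):
    # Shared latent representation of pose and expression.
    return np.tanh(W_enc @ face)

def decode(code, W_dec):
    # Person-specific reconstruction from the shared code.
    return W_dec @ code

# The "swap": encode person A's face, then decode it with person B's
# decoder, yielding B's appearance driven by A's expression and pose.
face_A = rng.normal(size=DIM)
swapped = decode(encode(face_A), W_dec_B)

print(swapped.shape)  # (64,) — same shape as the input face
```

In a trained system the two decoders have learned each person's appearance from thousands of frames, which is why the output looks like person B while moving like person A.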

How are deepfakes used?

A single deepfake has real potential to cause wide-reaching harm. Deepfakes are now prevalent and have been used in celebrity pornographic videos, revenge porn, fake news, hoaxes, and financial fraud.

Deepfakes present unprecedented cyber and reputation risks both to businesses and private individuals by using new information technologies for sophisticated fraud, identity theft, and public shaming.

Deepfakes and the 2019 UK general election

Recently, a deepfake video falsely portrayed the Prime Minister, Boris Johnson, endorsing the leader of the opposition to be Prime Minister. Deepfakes can help swing an election, dupe you into transferring money to a fraudster, and even convince you that a lewd act was committed, or a racist epithet spoken, by a person who has never done any such thing.

Picture this. On 12 November, BBC News broadcast a video clip showing Boris Johnson speaking and endorsing the candidacy of Jeremy Corbyn, a rival for the post of Prime Minister. Uncanny.

The video was posted on social media by a think tank called @FutureAdvocacy. What's convincing is that there are no speed bumps or false starts in the video. You can't even tell the difference between the real Boris Johnson and this deceptive mimic. And it's only a snippet: it's a video you just scroll past on your feed. The Boris Johnson seen in this deepfake video is animated by voice impressionist Darren Altman. Mr Johnson says - well, he seems to say: "Hi folks, I am here with a very special message. Since that momentous day in 2016, division has coursed through our country as we argue with fantastic passion, vim and vigour about Brexit. My friends, I wish to rise above this divide, and endorse my worthy opponent, the Right Honourable Jeremy Corbyn, to be Prime Minister of our United Kingdom."

Future Advocacy next posted a video which showed - seemingly - Jeremy Corbyn hailing Boris Johnson for leadership. Again: the video is convincing. A simulated Mr Corbyn implores the nation to vote for Mr Johnson: "Once upon a time, I called for a kinder, gentler politics. However, we, the political class here in Westminster, have failed, and the consequences have been disastrous for our society. That's why I'm taking on the toxic culture in Parliament. I'm urging all Labour members and supporters to consider people before privilege and back Boris Johnson to continue as our Prime Minister."

BBC News asked: "Would you be fooled by this video? It shows Boris Johnson and Jeremy Corbyn endorsing each other to become the next UK PM. But it's not real. It's a so-called 'deepfake'."

Outlandish it may be, but as deepfake editor and expert Chris Ume demonstrated with his 'Trevor Bercow' deepfake (created by merging Trevor Noah and John Bercow), creating a deepfake really isn't as difficult as some people may think. The technology is off-the-shelf.

Is the person on the other end of the line real – or a convincing deepfake?

Law firms are as exposed as corporations to being targeted by deepfake technology, both machine learning and predictive facial recognition. Deepfake technology has the potential to clone telephone calls in which authority is given to act in a certain way, and it defies the ordinary controls in place to screen and intercept spam emails.

The number of deepfakes circulating on the internet has almost doubled in the last twelve months. The BBC reported in November 2019 that research conducted by cyber-security company Deeptrace found that (at least) 14,698 deepfake videos are now online, compared with 7,964 in December 2018.

The research by Deeptrace found that 96% of deepfakes were pornographic in nature, often with a computer-generated face of a celebrity replacing that of the original adult actor in a scene of sexual activity. Scarlett Johansson has been a victim of the phenomenon and has said that there was nothing she and her legal team could do about the fake videos. Ms Johansson was quoted as saying this about why her legal team had abandoned efforts to shut down deepfakes touching upon her image rights: "I think it's a useless pursuit, legally, mostly because the internet is a vast wormhole of darkness that eats itself."

The prevalence of deepfakes is likely attributable to the relative ease with which one can be made. Chris Ume said: "I think deepfakes are here to stay. Changing appearance, aging or de-aging, reincarnation, mouth manipulation, these are just a few examples of many. Just like pictures years ago, people will now realise you shouldn't believe anything you see or hear in a video."

With a general election imminent and the increase in deepfakes writ large, will we ever re-establish our trust in visual media?


Deepfakes are a global phenomenon, providing cybercriminals, activists and others with tools to commit fraud and influence the general public. Although the vast majority of these videos are pornographic, the technology is increasingly being used to target individuals for a range of sinister objectives.

Awareness of deepfakes alone is not enough. Lawyers must now scrutinise video content more objectively and be in a position to advise on its credibility, admissibility and perceptibility.

With a constantly evolving landscape and the continued development of AI, it is of paramount importance that we are informed and prepared to tackle future challenges so as to safeguard our profession. It is for us to uphold the integrity of our system.


Views expressed in our blogs are those of the authors and do not necessarily reflect those of the Law Society.
