
Deepfakes: Will we ever re-establish our trust in videos?

03 December 2019

Paris Theodorou is a solicitor at Saunders Law. Abigail Bright is a barrister at Doughty Street Chambers.

 

As lawyers we must make sure we are well informed and ready to tackle one of the greatest threats to our democracy – and profession. Videos and audio recordings, unless shown to be doctored, were once valued as independent and objective evidence. This is no longer the case. Deepfake technology can be used to make it appear as though a person is saying things in a video which they have never said. It's when a picture is not worth a thousand words.

What is a deepfake?

Deepfakes (from 'deep learning' and 'fake') are convincing fake videos or audio recordings that look and sound just like the real thing. A person in an existing image or video is replaced with someone else's likeness using artificial neural networks. Alternatively, artificial intelligence (AI) facial and voice recognition is used to generate synthetic video to make it appear as though a real person is saying things which they have never said.

Deepfake technology can recognise and reproduce the natural cadences of human speech. AI algorithms are used to predict facial movements and to synthesise the sound of a human voice.
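For readers who want a sense of the mechanics, the short Python sketch below (using the PyTorch library) illustrates the shared-encoder, two-decoder autoencoder design that underlies many face-swap deepfakes. It is a minimal, hypothetical illustration: the class names, layer sizes and 64x64 frame size are assumptions made for this example, not the code of any particular tool. One encoder learns a common facial representation, each decoder learns to reconstruct one person's face, and swapping decoders at playback renders one person's expressions on the other's face.

# A minimal, hypothetical sketch of the shared-encoder, two-decoder
# autoencoder idea behind many face-swap deepfakes (illustrative only).
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self, latent_dim=128):
        super().__init__()
        # Compress a 64x64 RGB face into a compact code capturing pose and expression
        self.net = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 64 * 3, 512), nn.ReLU(),
            nn.Linear(512, latent_dim),
        )
    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    def __init__(self, latent_dim=128):
        super().__init__()
        # Rebuild a 64x64 RGB face from the shared code
        self.net = nn.Sequential(
            nn.Linear(latent_dim, 512), nn.ReLU(),
            nn.Linear(512, 64 * 64 * 3), nn.Sigmoid(),
        )
    def forward(self, z):
        return self.net(z).view(-1, 3, 64, 64)

encoder = Encoder()       # trained on faces of both people
decoder_a = Decoder()     # trained to reconstruct person A only
decoder_b = Decoder()     # trained to reconstruct person B only

# The swap: encode a frame of person A, decode with person B's decoder,
# producing person B's face wearing person A's pose and expression.
frame_of_a = torch.rand(1, 3, 64, 64)   # stand-in for a real video frame
swapped_frame = decoder_b(encoder(frame_of_a))
print(swapped_frame.shape)              # torch.Size([1, 3, 64, 64])

The point is the design rather than the code: because the encoder is shared, whatever it captures about a pose or mouth movement can be replayed through either decoder, which is why convincing lip-synced impersonations of public figures are within reach of off-the-shelf tools.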

How are deepfakes used?

The threat posed by a single deepfake has real potential to be wide-reaching. Deepfakes are now prevalent and have been used in celebrity pornographic videos, revenge porn, fake news, hoaxes, and financial fraud.

Deepfakes present unprecedented cyber and reputation risks both to businesses and private individuals by using new information technologies for sophisticated fraud, identity theft, and public shaming.

Deepfakes and the 2019 UK general election

Recently, a deepfake video falsely portrayed the Prime Minister, Boris Johnson, endorsing the leader of the opposition to be Prime Minister. Deepfakes can help swing an election, dupe you into transferring money to a fraudster, and even convince you that a lewd act was done, or a racist epithet was spoken, by a person who has never done any such thing.

Picture this. On 12 November, BBC News broadcast a video clip showing Boris Johnson speaking and endorsing the candidacy of Jeremy Corbyn, a rival for the post of Prime Minister. Uncanny.

The video was posted on social media by a think tank called @FutureAdvocacy. What's convincing is that there are no speed bumps or false starts in the video. You can't even tell the difference between the real Boris Johnson and this deceptive mimic. And it's only a snippet: it's a video you just scroll past on your feed. The Boris Johnson seen in this deepfake video is animated by voice impressionist Darren Altman. Mr Johnson says - well, he seems to say: "Hi folks, I am here with a very special message. Since that momentous day in 2016, division has coursed through our country as we argue with fantastic passion, vim and vigour about Brexit. My friends, I wish to rise above this divide, and endorse my worthy opponent, the Right Honourable Jeremy Corbyn, to be Prime Minister of our United Kingdom."

Future Advocacy next posted a video which showed - seemingly - Jeremy Corbyn hailing Boris Johnson for leadership. Again: the video is convincing. A simulated Mr Corbyn implores the nation to vote for Mr Johnson: "Once upon a time, I called for a kinder, gentler politics. However, we, the political class here in Westminster, have failed, and the consequences have been disastrous for our society. That's why I'm taking on the toxic culture in Parliament. I'm urging all Labour members and supporters to consider people before privilege and back Boris Johnson to continue as our Prime Minister."

BBC News asked: "Would you be fooled by this video? It shows Boris Johnson and Jeremy Corbyn endorsing each other to become the next UK PM. But it's not real. It's a so-called 'deepfake'."

Outlandish it may be, but as deepfake editor and expert Chris Ume demonstrated with his 'Trevor Bercow' deepfake (created by merging Trevor Noah and John Bercow), creating a deepfake really isn't as difficult as some people may think. The technology is off-the-shelf.

Is the person on the other end of the line real – or a convincing deepfake?

Law firms are as exposed as corporations to being targeted by deepfake technology, whether machine learning or predictive facial recognition. Deepfake technology has the potential to clone telephone calls in which authority is given to act in a certain way, and it defies the ordinary controls in place to screen and intercept spam emails.

The number of deepfakes circulating on the internet has almost doubled in the last twelve months. The BBC reported in November 2019 that research conducted by cyber-security company Deeptrace found that (at least) 14,698 deepfake videos are now online, compared with 7,964 in December 2018.

The research by Deeptrace found that 96% of deepfakes were pornographic in nature, often with a computer-generated face of a celebrity replacing that of the original adult actor in a scene of sexual activity. Scarlett Johansson has been a victim of the phenomenon and proclaimed that there was nothing she and her legal team could do about the fake videos. Ms Johansson was quoted as having said this about why her legal team had abandoned efforts to shut down deepfakes touching upon her image rights: "I think it's a useless pursuit, legally, mostly because the internet is a vast wormhole of darkness that eats itself."

The prevalence of deepfakes is likely attributable to the relative ease with which one can be made. Chris Ume said this: "I think deepfakes are here to stay. Changing appearance, aging or de-aging, reincarnation, mouth manipulation, these are just a few examples of many. Just like pictures years ago people will now realise you shouldn't believe anything you see or hear in a video."

With a general election imminent and the increase in deepfakes writ large, will we ever re-establish our trust in visual media?

Advice

Deepfakes are a global phenomenon, providing cybercriminals, activists and others with tools to commit fraud and influence the general public. Although the vast majority of these videos are pornographic, the technology is increasingly being used to target individuals for a number of sinister objectives.

Awareness of deepfakes alone is not enough. Lawyers now need to be more critical when analysing video content and to be in a position to advise on credibility, admissibility and perceptibility.

With a constantly evolving landscape and the development of AI, it is of paramount importance that we are informed and prepared to tackle any future challenges in order to safeguard our profession. It is for us to uphold the integrity of our system.

 

Views expressed in our blogs are those of the authors and do not necessarily reflect those of the Law Society.

Download our introductory guide to #LawTech, with advice in particular for smaller firms and sole practitioners, who often lack the resources of larger legal businesses when considering adopting new technology

Download our new research report Technology, Access to Justice and the Rule of Law that contains findings and recommendations about using innovation and technology to facilitate access to justice

Podcast: We discuss our new research report and what barriers could impact using technology to improve access to justice

Explore our campaign work on improving access to justice including early advice, criminal justice, criminal duty solicitors, and legal aid deserts.

 

Tags: cyber security | artificial intelligence

About the author

Abigail Bright practises at Doughty Street Chambers. She specialises in serious fraud and business crime, extradition and public law.

About the author

Paris Theodorou is a solicitor at Saunders Law who specialises in criminal and civil matters. He has years of experience at top London firms and has developed a strong practice in contentious matters, business crime and regulation. Paris has particular expertise in technology and litigation.

