Smile, you’re on camera – the pros and cons of facial recognition technology

17 June 2019

William McSweeney is the Law Society’s technology and law policy adviser, working on the impact of key emerging technologies on the law.

 

Facial recognition technology is gaining a foothold in many industries, and over the years to come, could affect us all. But where will we see the benefits, and what are the issues to look out for from a legal and ethical perspective?

The positives

AI and facial recognition mean that a photo can be quickly scanned against thousands of others for a possible match, drawn from sources ranging from border checkpoints to human trafficking investigations. The technology can also predict how a person's appearance will change over time. This has already been used to staggering effect in India, where 3,000 missing children were located in just four days after photos provided by parents were compared with those taken in Child Help Centres around the country.
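
At its core, this kind of one-to-many search compares a numerical 'embedding' of the probe photo against embeddings of every stored image and returns the closest candidates. The sketch below is a minimal illustration of that idea only; the made-up embeddings, the 0.6 threshold and the search_gallery helper are assumptions for illustration, not any real police or vendor system.

    # Illustrative sketch only: the embeddings, threshold and helper names here are
    # hypothetical stand-ins, not a real facial recognition system.
    import numpy as np

    def cosine_similarity(a, b):
        # Similarity between two face 'embeddings' (numeric summaries of a face image)
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    def search_gallery(probe, gallery, threshold=0.6):
        # Compare one probe embedding against every stored embedding and return
        # possible matches above the threshold, strongest first
        scores = ((name, cosine_similarity(probe, emb)) for name, emb in gallery.items())
        matches = [(name, score) for name, score in scores if score >= threshold]
        return sorted(matches, key=lambda pair: pair[1], reverse=True)

    # Made-up 128-dimensional embeddings standing in for thousands of stored photos
    rng = np.random.default_rng(0)
    gallery = {f"record_{i}": rng.normal(size=128) for i in range(5000)}
    probe = gallery["record_42"] + rng.normal(scale=0.1, size=128)  # a noisy new photo of the same person
    print(search_gallery(probe, gallery)[:3])

Even in this toy form, the threshold chosen determines the trade-off between missed matches and false alerts – the same trade-off behind the false positive figures discussed later in this post.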

Many genetic disorders come with subtle facial traits, and algorithms are beginning to identify these from a simple facial scan. More sophisticated trials have even managed to pinpoint the specific genetic mutation causing a particular syndrome. The hope is that this technology will eventually rival traditional genetic testing in accuracy, while besting it on both speed and cost.

The 'internet of things'

The Internet of Things (IoT) – an increasingly common catch-all phrase for the extension of internet connectivity into everyday objects – is becoming ever more present in our homes, from doorbells and security cameras to fridges and thermostats. Essentially, commonplace domestic devices can now be 'smart'.

The Internet of Things can make our lives easier and more convenient in some ways, but with it comes an increased ability to create, store and share new data, leaving that data vulnerable to hacking. With stories ranging from security systems compromised via smart lightbulbs to coffee machines used to infect computers with ransomware, the possibilities may seem absurd, but should we be taking them seriously? Some IoT devices, such as smart doorbells, harness facial recognition technology, and there are many concerns surrounding its use and application.

Facial recognition technology – the concerns

The use of facial recognition technology raises questions around efficiency, bias, impact on human rights, and legislative basis. The technology has the potential to militarise policing in public spaces, and in other countries it has been used to target vulnerable communities and to curtail legal and legitimate protest.

Campaigners, including Liberty UK and Big Brother Watch, have stressed that mass surveillance of innocent people in public violates three articles of the European Convention on Human Rights: article 10 (the right to freedom of expression), article 11 (the right to freedom of assembly and association) and article 8 (the right to a private life). There is a real worry that the indiscriminate use of facial recognition technology in the public realm stifles non-conformist modes of appearance and expression. Facial recognition technologies and their use have normalised pervasive surveillance practices in public spaces and, in doing so, have undermined several inalienable rights.

Facial recognition software is beginning to be used by police forces around the UK, yet the number of false positives remains high in many use cases. When the technology was used at the 2017 UEFA Champions League Final to identify people who had previously caused trouble, 92% of matches were incorrect, and only 173 people were correctly identified. In addition, human checks and balances tend to fall away when computer-generated decisions are accepted as accurate, leaving limited or no human oversight.
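
Taking the figures above at face value gives a sense of scale. The short calculation below is illustrative only; treating the 92% figure as a share of all alerts the system generated is my assumption, not a claim about the police force's own reporting.

    # Rough arithmetic based only on the figures quoted above (92% of matches
    # incorrect, 173 correct identifications); treating 92% as a share of all
    # alerts generated is an assumption made purely for illustration.
    correct = 173
    incorrect_rate = 0.92
    total_alerts = correct / (1 - incorrect_rate)   # roughly 2,160 alerts in total
    false_alerts = total_alerts - correct           # roughly 1,990 people wrongly flagged
    print(round(total_alerts), round(false_alerts))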

Facial recognition software learns from the data it is trained on, and because of the profiles of the people developing this technology, training data is biased towards Caucasian male subjects. This leads to low identification rates for women and for anyone with darker skin: one trial misidentified darker-skinned women as men 31% of the time, and in March 2017 the US Government Accountability Office found that these technologies were 15% less accurate for women and ethnic minorities. When the technology is used in high-stakes scenarios, such as identifying criminal suspects, it is not difficult to see how this could perpetuate an existing racial bias within the law.

While biometric identification requires you to consciously submit something to match, such as an iris or fingerprint scan, facial recognition software merely needs to capture an image of your face – which can be done as you walk about in public spaces. This can result in a form of 'perpetual line-up' in which our images are constantly being matched against those of potential criminals.

In February 2017, the government gave unconvicted individuals the right to ask police forces to delete their images from the custody image database. A year later, only 67 applications for deletion had been made, and just 34 of those were successful (Press Association investigation, 2018; figures from 37 of the 43 police forces in England and Wales, obtained through freedom of information requests). This suggests that the current arrangements for storing these images, and the process for deleting them, are not fit for purpose.

The open nature of this kind of technology has consequences far beyond law enforcement and other regulated bodies. Invasion of privacy can have a serious effect when it is committed by a company, but a devastating one when the technology is used by individuals to commit crimes such as stalking or harassment. It remains to be seen whether lawmakers around the world can keep up with the speed of innovation from the likes of Amazon and other large product developers.

Any facial recognition technologies developed for use by, or on behalf of, public agencies should be open, transparent and accountable. It is vital that the design, development and deployment stages of technological innovation and adoption are open to both internal and external evaluators, so that results can be validated independently. Technology should only be deployed when it:

  • complies with data protection and human rights laws, ethical considerations, and administrative law
  • is tied directly to a long-standing and ethical policy
  • operates in line with its initial problem statement.

 

Views expressed in our blogs are those of the authors and do not necessarily reflect those of the Law Society.

Read more and download the Algorithms in the justice system report

Explore our map showing what we know today about where complex algorithms are being used in the justice system in England and Wales.


Visit the Technology and Law Policy Commission for details of the experts and to watch the evidence sessions

Read blogs by the co-commissioners:

  • Professor Sofia Olhede: Can algorithms ever be fair?
  • Professor Sylvie Delacroix: Will data + algorithms change what we can expect from law (and lawyers)? and Ask an AI: what makes lawyers "professional"?

Our Lawtech Report highlights key developments and what they mean for the work of the profession and the business of law

Read Rafie Faruq, CEO of Genie AI, on 10 factors to consider before procuring legal AI

Tags: equality | human rights | technology | artificial intelligence

