Technology is an integral and essential part of everyday life. We have become accustomed to, and rely heavily on, the smartphones we carry wherever we go and the ubiquitous interactive digital assistants (Siri, Alexa, Cortana and Google) that handle an ever-increasing number of everyday tasks.
Technology is increasingly found in the legal system, and ‘lawtech’ is the term used by the Law Society to describe technologies that aim to support, supplement or replace traditional methods of delivering legal services, or that improve the way the justice system operates.
These tools include:
- document automation
- predictive artificial intelligence
- smart legal contracts
- knowledge management
- research systems
The Technology and Law Public Policy Commission published a report on algorithm use in the criminal justice system on 4 June 2019. Algorithms currently used in the criminal justice system include facial recognition systems, DNA profiling, predictive crime mapping, mobile phone data extraction, data mining and social media intelligence.
The report concludes: “An uncritical reliance on technology could lead to wrong decisions that threaten human rights and undermine public trust in the justice system.” It went on to state that there are “consequences for personal dignity, such as loss of individuality and autonomy and human rights such as privacy and freedom from discrimination.”
The 2018 report Is Britain Fairer? by the Equality and Human Rights Commission (EHRC) found that fewer disabled than non-disabled people have confidence that the criminal justice system is effective. On 1 May 2019 the EHRC launched a legal inquiry asking whether people with mental health conditions, cognitive impairments and neuro-diverse conditions including autism and ADHD are experiencing discrimination and being put at risk of miscarriages of justice due to a lack of support in the criminal justice system.
David Isaac, the Commission’s chair, commented: “Technology can often assist and empower disabled people, but we must also ensure it is used appropriately and doesn’t inadvertently end up isolating disabled people or jeopardising their ability to participate in person.
“If disabled people’s needs aren’t properly identified from the outset they are at risk of not understanding the charges they face, the advice they receive or the legal process. In some cases, this can mean disabled people could be wrongly convicted or receive inappropriate sentences.” The inquiry will conclude by the end of 2019.
Algorithms are already deployed in financial services, recruitment, local government, hospitals, schools, shops and gyms. They provide immense benefit in healthcare where lives can be saved with advanced technology; however, there is also the potential for injustice and unfairness.
A paper submitted for the Public Law Project session in October 2018, entitled Algorithms, apps & artificial intelligence: the next frontier in discrimination law?, highlighted the dangers of automated recruitment software that rejects applicants who differ from current successful employees. Unless the software can be tailored to account for disability and other differences, it will revert to selecting only the mean (non-disabled) candidate. Given that the disability pay gap persists and that disabled people are more likely to work in low-paid occupations, it is increasingly important that software used by recruiters is free from bias.
This corresponds to conclusions in the Technology and Law Public Policy Commission report: Machine learning (algorithmic) systems look for similarities and “do not allow individuals to proclaim their individuality”.
The report also recognises that people may readily delegate responsibility to a decision-support system, and that “human decision makers may lack the confidence and knowledge to question or override an algorithmic recommendation.”
On a positive note, the report’s recommendations include strengthening oversight of algorithms used in the criminal justice system and controlling the procurement of algorithmic systems to ensure maximum control, scope for amendment and public-facing transparency.
The Law Society has developed a comprehensive strategy to:
- encourage technological innovation in the legal sector
- support the growth of UK lawtech
- make sure the ethical issues surrounding lawtech are fully considered
- make sure lawyers have opportunities to learn about and use lawtech in their work
- promote electronic dispute resolution
This is achieved through TechTalks, lawtech research, a partnership with Barclays Eagle Labs and the formation of the Lawtech Delivery Panel (a team of industry experts and leading figures from government and the judiciary).
Written by Antonio DiAngelo, Lawyers with Disabilities Committee member