Using algorithms to deliver justice – bias or boost?
The growing use of algorithms in the justice system raises many questions with few easy answers, the Law Society of England and Wales said as it launched a public policy commission to explore the impact of technology and data on human rights and justice.
Law Society vice president and commissioner Christina Blacklaws said: “Big data and algorithms already augment human capabilities for analysis and prediction beyond anything previous generations could have imagined.
“Their use could - and sometimes does - keep us safer, preserve scarce resources and expand the reach of increasingly stretched law enforcement.
“But the design, sale and use of algorithms to deliver justice or maintain security also raises questions about unconscious bias, ethics and rights. Further potential risks may emerge when an algorithm is developed by a business focused on profit rather than by an organisation focused on delivering justice.”
The commission will initially look at the use of AI in legal practice and more broadly in society, by the police and prison services. For instance:
- Durham Constabulary have used an artificial intelligence system to inform decisions about whether to keep a suspect in custody. They use an algorithm to assess whether a suspect's risk of reoffending is low, medium or high - so that arrestees forecast as a medium risk can be made eligible for a programme designed to reduce reoffending
- Mathematicians and social scientists in the US have developed a crime prediction tool in collaboration with the Los Angeles Police Department. ‘PredPol’ has now been used by Kent Police for crime prediction hotspot mapping
- The Metropolitan Police and South Wales Police use facial recognition technology at public events, music festivals and demonstrations to cross-reference people already on watch-lists.
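The Durham approach described above - banding a predicted reoffending risk and using the middle band to gate programme eligibility - can be illustrated in outline. This is a minimal sketch only: the threshold values, function names and scoring scale are hypothetical, not details of Durham Constabulary's actual system.

```python
# Illustrative sketch of risk banding - hypothetical thresholds, not the real model.

def classify_risk(score: float) -> str:
    """Bucket a reoffending-risk score in [0.0, 1.0] into low/medium/high bands."""
    if score < 0.33:
        return "low"
    if score < 0.66:
        return "medium"
    return "high"

def eligible_for_programme(score: float) -> bool:
    """In this sketch, only medium-risk arrestees qualify for the diversion programme."""
    return classify_risk(score) == "medium"
```

Even in this toy form, the commission's questions are visible: where the thresholds sit, and on what data the score is trained, determine who is offered the programme.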
Christina Blacklaws added: “The questions we will explore include: What are the financial and social costs if algorithms are skewed? When is the use of algorithms and big data appropriate? What kind of oversight do we need? How do we ensure that the data used is correct and free from bias? And how do we ensure decisions are accessible and can be reviewed or appealed?”
The Law Society commissioners will take oral and written evidence from technologists, government, academics, commercial actors and legal and human rights experts to explore an overarching question: what framework for the use of big data and algorithms could protect human rights and trust in the justice system?
The Law Society’s Public Policy Commission launch is on 14 June, 5-7pm at the Law Society.
Notes to editors
Christina Blacklaws is vice president of the Law Society of England and Wales. She has developed and managed law firms, including a virtual law firm, and she was the director of innovation at top 100 firm Cripps LLP.
Sylvie Delacroix is a professor in law and ethics at the University of Birmingham Law School. She takes a particular interest in machine learning and agency and has recently been researching the design of both decision-support and ‘autonomous’ systems, and the effect of personalised profiling and ambient computing on ethical agency.
Sofia Olhede is a professor in the Department of Statistical Science, an honorary professor in the Department of Computer Science, and an honorary research associate in the Department of Mathematics at University College London. Sofia has contributed to the study of stochastic processes, time series, random fields and networks.
About the Law Society
The Law Society is the independent professional body that works globally to support and represent solicitors, promoting the highest professional standards, the public interest and the rule of law.
Press office contact: Harriet Beaumont | email@example.com | 020 8049 3854