Andrew McWhir looks at the COVID-19 tracking app and how it can face up to concerns over privacy.
As the weather improves and governments get increasingly anxious about the effects of lockdown on communities in the UK and elsewhere, we’re seeing the first tentative steps towards reopening societies in Europe, Asia, and the Americas.
Central to this discussion in the UK is the launch of the coronavirus (COVID-19) contact tracing app, currently being trialled on the Isle of Wight. It’s a vital piece of the jigsaw of the government’s wider test and trace programme for managing the pandemic and safely reducing current physical distancing requirements.
The development of the app is led by NHSX, the unit responsible for delivering the digital transformation of the NHS and social care, whose team comprises clinicians, technologists and data scientists.
In developing the app, they’ve been joined by cybersecurity specialists at the UK’s National Cyber Security Centre. Should this reassure us that the cybersecurity and data privacy fundamentals of the app are robust? Probably.
Led by its technical director, Dr Ian Levy, the National Cyber Security Centre (NCSC) has been advising on best practice throughout the development of the app.
Ultimately, as Dr Levy points out in an excellent blog explaining how the app works, there are trade-offs to be made between providing the best possible information for managing the disease and the risks to the privacy of personal information; striking a balance between individual, group, and national privacy is fundamental to the app's functionality and performance.
As well as ensuring that the app will run effectively on a wide range of supported devices, and won’t drain users’ batteries or stop other apps working, Levy assures us that its design "strongly protects" privacy and security.
A major talking point to date about the UK's approach to developing its tracing app has been whether information from personal devices should be uploaded to a central database, which would then push notifications with clinical advice to those who may have had high-risk contact with an unwell user.
A decentralised system, on the other hand, would manage a notification system without providing the NHS with any clinically relevant information about how these contacts may be spreading the virus.
Levy carefully explains the technical operation of the proposed app, and importantly relates the cryptography to the steps in health policy intervention that would kick in once users upload their proximity events to an NHS server.
Combining the anonymous identifiers used by the app with a back end built to NHS data security standards for protecting personal data and limiting access to it should, Levy says, reassure the public.
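For readers curious about the mechanics, the centralised flow Levy describes can be caricatured in a few lines of code. This is purely an illustrative sketch: the class names, the 15-minute risk threshold, and the data structures are all assumptions made for the example, not the actual NHSX design.

```python
# Illustrative sketch of a centralised contact-matching flow.
# All names, thresholds and structures are assumptions, not the NHSX app.
import secrets
from collections import defaultdict

def new_installation_id() -> str:
    """A random, anonymous installation ID with no link to NHS records."""
    return secrets.token_hex(16)

class CentralServer:
    """Central matching service: receives proximity events uploaded by a
    symptomatic user and queues notifications for the devices they were near."""
    def __init__(self):
        self.notifications = defaultdict(list)

    def upload_proximity_events(self, reporter_id, events):
        # events: list of (contact_installation_id, duration_minutes)
        for contact_id, duration in events:
            if duration >= 15:  # illustrative high-risk threshold
                self.notifications[contact_id].append(
                    "Possible high-risk contact: follow clinical advice")

    def poll(self, installation_id):
        """Each device periodically asks whether it has been notified."""
        return self.notifications.pop(installation_id, [])

# Usage: Alice reports symptoms after 30 minutes near Bob.
alice, bob = new_installation_id(), new_installation_id()
server = CentralServer()
server.upload_proximity_events(alice, [(bob, 30)])
print(server.poll(bob))  # Bob's device receives the clinical-advice notification
```

The point of the sketch is that the server only ever sees random installation IDs; linking an ID to a person would require access to elements held elsewhere, which is the separation Levy says has been designed in.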
If the app is used to link a user to a clinical test, that would be done through a privacy preserving gateway, he maintains, which wouldn’t link a device installation ID to a person’s identity or NHS record.
An individual with malevolent intent would need access to each of these elements in order to link the data to an identifiable individual, an outcome which has been designed out, he assures us.
In a nutshell, Dr Levy and others argue strongly that the benefits to disease management, combined with the protections designed into the app, outweigh the risks to data privacy of a centralised system.
Not everyone is convinced, however. Parliament’s Joint Committee on Human Rights has warned that the current trial for implementing the app presents serious risks to privacy and human rights.
In particular, the committee wants new legislation with guaranteed protections for data and human rights, an independent digital contact-tracing rights commissioner to provide oversight (it’s unclear how this role would sit alongside that of the UK’s Information Commissioner), and regular reviews of the app’s use by the Health Secretary.
Similar concerns have been echoed by MPs and peers from all sides of Parliament, demonstrating the scale of the challenge the government faces in inspiring public confidence in the app.
To further underline its recognition of legitimate privacy concerns and the need for transparency, the government has formed an NHS COVID-19 App Ethics Advisory Board to respond to ethical concerns, develop an ethics framework and a model of good practice for the project.
NHSX has also asked the Information Commissioner to conduct an informal review of its data protection impact assessment (DPIA) for the app’s current Isle of Wight trial and for a national roll-out, and that review is currently underway.
For law firms considering a return, in whole or part, to traditional working, there will be a number of issues to consider, in particular relating to personnel and client interaction in the office and, potentially, to legal professional privilege.
The Law Society is monitoring the roll-out of the app and discussing these and other issues with the SRA, and we will update our information and advice on next steps in the coming days.