In their recent statement on digital contact tracing, NHSX, the NHS Unit leading digital transformation of the NHS, acknowledge that the public will need to trust this new technology for it to be effective in tackling COVID-19, and that NHSX will seek to earn that trust by working with ‘transparent standards of privacy, security and ethics’.
(Read the NHSX statement)
This app has the potential to transform lives, but its development and use raises a range of questions, some of which are outlined below:
The organisations involved and their interests
1. How much is the development of the app going to cost the NHS?
2. Apple and Google have been involved in the development of the app. Are any other tech companies involved? How did they become involved? Why are they participating? Will they make a profit from participation?
3. Which companies and government agencies will come into contact with which kinds of data collected by the app, and what will they do with that data? How will the data be used for research, by whom, and how will any profits arising from the commercialisation of that research be shared?
4. Will the development team at the University of Oxford receive any funding for their work on the app? What was the process through which they became involved alongside the other participants in development? Was there a tendering process?
Experts and representatives consulted on ethical and social issues
5. NHSX is ‘committed to listening to [our] ideas and concerns’. How will the developers access a broad range of ideas, including from beyond traditional areas of expertise, to inform their work? Will their listening be ‘active’, i.e. result in meaningful change?
6. Who has been consulted about the social and ethical implications of the app? What disciplines and personal backgrounds do these experts come from? Are they diverse?
7. Have experts on the social and ethical aspects of contact tracing apps from other countries where these have been used (e.g. South Korea) been consulted in the development of the UK app?
8. Security and privacy designs and source code will be published so experts can check them. How will these experts be identified? How will their findings be used? Will details of ethical protocols, oversight and considerations also be released for external scrutiny? Which kinds of experts will carry out that scrutiny, and how will their conclusions be taken forward?
9. Have patients and publics been consulted? What were they told about the app, and what did they say? What kinds of patients and publics were involved? Were they diverse? What voices were missing, and how will this be addressed? Will this group be able to have further involvement in oversight as the app is rolled out? How will representatives of communities where uptake is likely to be, or turns out to be, low be included in these processes of evaluation?
Other evidence & learning from other areas
10. Have the performance and the social and ethical aspects of digital contact tracing apps in other countries (e.g. South Korea) been considered, and have plans been put in place to mitigate relevant risks and negative consequences?
11. How has learning from previous NHS data breaches, and from failures in the roll-out of new technologies and infrastructure, including PPE distribution, been applied in the development of this app?
Integration with existing arrangements
12. How will the app integrate with other forms of contact tracing? Will information be shared between them?
13. If physical testing infrastructure is not optimal, which it has not been since the start of this crisis, how will the app be effective?
14. What counselling and support from appropriate professionals will be available to participants identified as ‘at risk’? How will these professionals be involved in monitoring and reporting breaches of compliance, and to whom will they report?
Mitigating risks and negative effects
15. How many people will have to download the app for it to be effective? How will the benefits of participation be realised without the creation of unreasonable social pressures to participate which would undermine the voluntariness of participation?
16. What thought has been given to the dangers of discrimination arising from the use of the app, and how will these be mitigated? What information, evidence and analysis has this involved?
17. How will errors in data, connections or risks identified by the app be identified and rectified?
18. The app will ‘advise [individuals] to self-isolate if necessary’. What will happen if individuals choose to flout this advice? How will the app collect and share data about non-compliance? Will any actions be taken by the authorities?
19. The statement ‘Millions of us are going to have to trust the app and follow the advice it provides’ suggests individuals will have to put doubts to one side and to be compliant with advice for the app to be effective. How will developers and providers of the app ensure injunctions that we ‘have to trust’ and ‘follow the advice’ do not precipitate forms of social censorship, denouncement and condemnation of those who do not act as required, particularly already marginalised groups?
20. How will the app reach areas/communities where uptake is likely to be poor? Will this involve targeted communications and/or any particular inducements? Will rates of participation be published and subject to scrutiny and comment in the public domain, and might this lead to stigmatisation of particular areas/communities?
Being transparent involves answering questions as part of dialogue and engagement with interested others. On matters of privacy, security and ethics there are many different kinds of patients, publics and other experts with a stake in these discussions.
The Nuffield Council on Bioethics is working with a range of partners on this and related topics. We recently held a joint webinar with the Ada Lovelace Institute, ‘Beyond the exit strategy: ethical uses of data-driven technology in the fight against COVID-19’, and the Institute recently published a rapid evidence review of the technical considerations and societal implications of using technology to transition from the COVID-19 crisis.