So, what’s the big deal?
The Government says that the technology will help patients, especially the blind and those who cannot access the internet through traditional means, to get ‘professional, NHS-verified health information in seconds, through simple voice commands.’ The Government believes that the technology has the potential to reduce the pressure on the NHS, GPs and pharmacists by providing information about common illnesses. This all sounds good, but will it work in practice?
The technology streams the NHS’s existing online advice through Alexa, using voice. So, you could say ‘Alexa, what are the symptoms of flu?’ and Alexa will read out the symptoms from the NHS website, and may give you further advice on how to treat yourself. I’m trying it now (it’s unclear whether the service is fully operational) – Alexa credits the NHS website, gives a list of symptoms, notes that this information isn’t medical advice and may not be up-to-date or accurate, and advises you to ‘consult a doctor if you have a medical problem or you are seeking advice for your or someone else’s personal situation, health, or medical problem.’
Empowerment and NHS resources
In terms of empowering patients who are blind, or who cannot access the internet through traditional means, this could certainly help them to lead more independent lives. However, they must first be able to access the technology in order to benefit from it. First and foremost, it requires the user to purchase an Alexa-enabled device, either online or in a store, so the technology is only available to those who can afford it. It also requires at least a basic understanding of how to use it. Those of us reading this on our computers or phones may take this for granted, but it’s important to keep in mind that technology isn’t intuitive for everyone.
The partnership is being heavily marketed as giving people access to expert advice through Alexa, but most people can already access this information through the NHS website. The framing could be misleading – there’s the potential for people to believe that Alexa is providing expert advice itself, or giving them a diagnosis, when it is really just using an algorithm to identify relevant NHS webpages and read them aloud. As a result, people may not feel the need to visit their GP in cases where they should.
While someone with cold symptoms might ask Alexa for advice instead of visiting their GP and leave it at that, there is also the possibility that Alexa’s algorithm could cause others to worry more, by matching their symptoms with more serious illnesses and prompting them to visit their GP or A&E when they otherwise wouldn’t have.
Data and privacy
Critics have raised data protection concerns about the partnership. Amazon has said that the information will remain ‘confidential’, but hasn’t released any further details on how data will be used, stored or accessed. In our 2015 report, The collection, linking and use of data in biomedical research and health care: ethical issues, we recommended that the use of data in biomedical research and health care should be in accordance with a publicly statable set of morally reasonable expectations – that is, one capable of being articulated in a way that is meaningful and understandable to those with interests at stake – and subject to appropriate governance. We set out four principles for the ethical governance of data initiatives:
- Respect for persons: the terms of any data initiative must take into account both private and public interests. Enabling those with relevant interests to have a say in how their data are used, and telling them how their data are, in fact, used, is one way in which data initiatives can demonstrate respect for persons.
- Respect for human rights: the terms of any data initiative should respect people’s basic human rights. This includes limitations on the power of states and others to interfere with the privacy of individual citizens in the public interest.
- Participation of those with morally relevant interests: decision makers should not merely imagine how people ought to expect their data to be used, but should take steps to discover how people do, in fact, expect their data to be used, and engage with those expectations.
- Accounting for decisions: data initiatives should include formal accountability, through regulatory, judicial and political procedures, as well as social accountability through periodic engagement with a broader public, as a way of re-calibrating expectations. Data initiatives must tell affected people what will be done with their data, and must report what actually has been done, including clear reports of any security breaches or other departures from the established policy.
While the NHS-Amazon partnership isn’t (so far) technically a data initiative as defined in our report – projects must involve one or both of the following practices: 1) data collected or produced in one context, or for one purpose, being re-used in another context or for another purpose; and 2) data from one source being linked with data from a different source, or many different sources – it is still possible that it will be dealing with potentially sensitive data: for example, if someone asks ‘Alexa, what are treatment options for cancer?’ and this is saved alongside information that would make that person identifiable. Google searches have been used as evidence before, and Alexa queries could certainly fall into the same category. Therefore, our ethical principles are still relevant.
medConfidential has submitted a Freedom of Information request to the DHSC, requesting a copy of the agreement between Amazon and the DHSC. It’s crucial for the UK Government and Amazon to be transparent about their agreement; what information they are saving; and how that information is being used, stored and accessed. They must also indicate how they have engaged with, and will continue to engage with, those with relevant interests – such as stakeholders and potential users – to ensure that reasonable expectations are identified and met.
Perhaps it’s the Amazon partnership in particular that’s raising a few eyebrows. Earlier this year, Bloomberg broke a story that Amazon employs thousands of people across the world to listen to voice recordings captured by Alexa. The recordings are transcribed, annotated, and then fed back into the software as part of an effort to make Alexa better. If Amazon intends to use the information in a similar way, this must be articulated in a way that is understandable to those whose data will be collected, and it must be easy to opt out of this use. The same goes for any partnership the Government plans on entering – it has said it is currently working with other tech companies to help users better access NHS health information.
As with most technologies, some might find this beneficial and others might not, but if the Government wants this partnership – or any healthcare partnership – to succeed, it must be transparent about the partnership agreement, engage with those with relevant interests to decide on reasonable expectations around data use, and ensure that these expectations are met. People will want to know what data are being collected, and how their data are being used, stored and accessed, and by whom. Without trust in these systems, it is hard to see how the partnership can succeed.