Introduction
Mind reading: it sounds like science fiction, and it is, but the current reality (and near future) is just as fascinating. Implanted and wearable devices that analyse, and even directly alter, our brain patterns are already shaping medical treatments and the ways we work and enjoy ourselves, and they will continue to do so. Unsurprisingly, interest in using and regulating neurotechnologies has grown significantly in recent years. This is set against a background of growing financial investment in a market estimated to be worth some £14 billion by 2026. However, much of the current focus is on the social and ethical implications of these technologies, rather than the privacy challenges they present.
About neurotechnologies
Neurotech encompasses a wide variety of approaches but can be broadly defined as “devices and procedures that are used to access, investigate, assess, manipulate, and emulate the structure and function of neural systems”. This includes:
- established medical devices, from prosthetics such as cochlear implants to diagnostic equipment such as MRI scanners;
- cutting-edge implants and wearable devices that may be used in medical settings, such as treating neurodegenerative conditions; and
- devices increasingly deployed for commercial uses, such as workplace monitoring.
The personal information derived from these technologies, so-called “neural information”, can be described as “information relating to the functioning or structure of the human brain of an identified or identifiable individual that includes unique information about their physiology, health, or mental states”.
Neurotechnologies are delivered and used through a diverse range of devices, which gather information in different ways. Not all of these directly present privacy concerns; the following are some of the key technological approaches and concepts of most interest to us:
- Implanted or semi-implanted devices are predominantly medical devices embedded into the brain, which offer a high degree of accuracy and granularity of information.
- Wearable devices are medical and consumer devices worn on the body, which offer cheaper, lower risk access to neurotechnology.
- Read-only devices are devices which only gather neural information for analysis, although outputs may directly feed into other devices such as VR headsets.
- Read and write devices are devices which gather neural information but also modulate or stimulate brain patterns directly, potentially affecting behaviour and an individual’s responses.
- Closed loop processing is the automated processing of neural information that gathers and analyses personal information and feeds back to either the person or the device without meaningful human intervention. For example, a medical device that automatically stimulates a part of the brain to prevent a serious seizure (see the sketch after this list).
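To make the closed-loop concept concrete, the minimal Python sketch below shows the shape of such a pipeline: a window of signal is acquired, analysed and acted on automatically. All names, thresholds and the simulated signal are illustrative assumptions rather than a real device API; real systems would use multi-channel sensors and far more sophisticated analysis, but the privacy-relevant feature is the same: personal information is collected, interpreted and acted on without meaningful human intervention.

```python
import random
import statistics

# Hypothetical closed-loop sketch: names, thresholds and the simulated
# signal are illustrative assumptions, not a real device API.

SEIZURE_THRESHOLD = 0.8  # assumed risk score above which the device intervenes


def read_sample(window_size: int = 64) -> list[float]:
    """Simulate acquiring a short window of neural signal from a sensor.

    Roughly one window in five is given a higher variance to mimic
    abnormal activity, so the feedback branch is occasionally exercised.
    """
    scale = 3.0 if random.random() < 0.2 else 1.0
    return [random.gauss(0.0, scale) for _ in range(window_size)]


def analyse(window: list[float]) -> float:
    """Toy analysis: map signal variance onto a 0-1 'risk' score."""
    variance = statistics.pvariance(window)
    return min(variance / 3.0, 1.0)


def stimulate(intensity: float) -> None:
    """Stand-in for a 'write' action that modulates brain activity."""
    print(f"Stimulation delivered at intensity {intensity:.2f}")


def closed_loop(cycles: int = 10) -> None:
    """Acquire, analyse and feed back automatically - no human in the loop."""
    for _ in range(cycles):
        window = read_sample()
        risk = analyse(window)
        if risk > SEIZURE_THRESHOLD:
            # The feedback step: the device acts on the person directly,
            # without any meaningful human intervention.
            stimulate(intensity=risk)


if __name__ == "__main__":
    closed_loop()
```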
State of development
The medical and healthcare sector remains at the forefront of neurotech development, with rapid innovation around devices used in the treatment of epilepsy and spinal injuries, to name just a few. More advanced research into potential treatments for neurodegenerative conditions, such as Parkinson’s disease, as well as mental health therapies, is proving promising. If successful, this may result in significant future uptake.
Other sectors that are likely to see market developments in the near future include:
- wellbeing and sports wearables designed to track and enhance sleep and physical performance;
- workplace deployment to track employee safety around heavy machinery, monitor attention levels, or possibly even as part of recruitment processes; and
- neuro-entertainment to enhance e-sport performance or develop games directly controlled or responsive to the user’s brain patterns.
These developments, as well as the general uptake of emerging neurotechnologies, appear to be driven by a variety of factors:
- The increased affordability of sensors and a growing trend towards lightweight, efficient, portable and non-implanted devices.
- The increased sophistication of supporting technologies which enable rapid scaling, such as AI and machine learning, 5G and wireless connectivity and cloud storage.
- The global lack of neurotech-specific regulation, combined with the appetite for the ‘datafication’ of people, may also drive commercial uptake.
In response to these developments, calls for specific neurorights are also emerging, although the need for these remains debated. Considerations of neurotechnologies have also directly impacted emerging legislation. This is most notable in Chile, where constitutional reform has embedded distinct limitations and obligations about the use of neurotechnologies.
Fictional future scenario
Alex is a football player in the middle of negotiating a dream transfer to a top Premier League team. Keen to get the very best analysis of their performance and fitness, Alex agrees to the collection and use of not just biological information like their heart rate, but also their neural information. Using wearable devices, the team can track their sleep patterns, ability to focus and response to physiotherapy. This is then fed into a program of training to enhance their performance.
However, as the contract discussions continue, it becomes clear that the club considers the neural information theirs. They see it as integral to a wider training program they’re developing, one that could give them a competitive edge. Alex is deeply uncomfortable with this. Surely the neural information is personal information, and they should have access to it and some control over it? They’re uncertain and worried about the complexity of the technologies and the information sharing. These concerns only grow when they learn the club has started working with a new insurance company, which plans to use neural information to rate and evaluate players. This doesn’t sound like anything they agreed to. Yet there are rumours that clubs can use the information to pinpoint when a player is no longer at their best. How can they challenge these practices and the potentially huge impact they might have on their career?
Data protection and privacy implications
The data protection risks posed by collecting and using neural information include the following:
- Processing of novel, highly sensitive information: The potential for organisations to collect large-scale, complex sets of neural information about a person presents increased security and privacy risks. It may allow organisations to draw detailed inferences about highly sensitive matters, such as someone’s mental health or sexuality. A related challenge is understanding when neural information counts as special category information under the UK GDPR. There is no explicit definition of neural information in the legislation and, unless it is used for identification or medical purposes, it is not considered special category information. Even when it reveals sensitive information, such as workplace performance, neural information is likely to be treated as personal information without the additional protections given to special category information. Organisations must still process it in line with the requirements of the UK GDPR, and should remain extremely careful about how (and why) they process such information.
- Consent and transparency: Neural information is generated subconsciously. People have no direct control over the specific information that is generated and shared through neurotech devices. This is likely to make consent challenging to rely on as a basis for processing, because people cannot be sure what information they are being asked to provide or what it may reveal about them.
Neurotechnology’s potential not only to observe and collect neural information but also to modulate brain patterns and alter behaviour may inhibit transparency. This may fundamentally hinder the exercise of rights under existing privacy legislation, given the potential to impair a person’s ability to evaluate their own personal information. It may also raise far more fundamental questions about freedom of thought and personal identity. Organisations should embed privacy by design from the beginning of any technological or service development.
- Neurodiscrimination: Neural information might become a new route to discriminate against minority and marginalised communities if inaccuracy and systemic bias remain embedded in these technologies. This would breach the UK GDPR requirement for personal information to be processed fairly. It may also lead to discrimination against new categories of people, defined by their brain patterns, who are not currently recognised under parallel legislation such as the UK Equality Act.
- Regulatory and technical clarity: Further collaboration between privacy regulators and those who oversee consumer protection, medical devices and other relevant areas is key to understanding possible gaps in regulation or guidance and potential clashes of interest. Developing technical clarity, including agreed definitions, will ensure that regulators have the functional knowledge of neurotechnologies needed for effective regulation. A key area is how and when automated processing of non-medical neurodata may be appropriate, and what meaningful intervention would look like, in order to ensure that decisions are made fairly and transparently.
Recommendations and next steps
- We have already published a longer and more in-depth review of neurotechnologies and their privacy challenges: our ICO Tech Futures: Neurotechnologies report.
- We will continue to engage with key stakeholders across industry, regulation, academia and civil society. This will include inviting organisations to work with our Regulatory sandbox to engineer data protection into these technologies.
- Given the potential impact of neurotechnologies on the public, we will carry out public engagement activities to better understand people’s knowledge of, and concerns about, neurotechnologies and privacy.
- We will develop guidance in due course on data protection expectations for neurotechnology. This will help bring regulatory clarity and set clear expectations about the responsible and compliant use of neurodata.
Further reading
Regulatory Horizons Council’s report on Neurotechnology regulation
Royal Society’s report on neurotechnologies in healthcare - iHuman neural interfaces report
Future of Privacy Forum’s Privacy and the connected mind report
The Council of Europe’s report on Common human rights issues raised by applications of neurotechnologies in the biomedical fields
UNESCO's report on Ethical Issues of Neurotechnology
Knowledge Transfer Network’s (UKRI) report on A transformative roadmap for neurotechnology in the UK
Organisation for Economic Cooperation and Development’s (OECD) Recommendation on responsible innovation in neurotechnology