
We found that the use of neurodata is growing and that the short-term horizon will be a critical period for the emergence and potential uptake of neurotechnologies in many sectors. Research and medical uses of neurotechnologies are relatively advanced, but several other sectors are expected to expand significantly. Some uses, such as military applications of neurotechnology, remain beyond the scope of this report. We have identified the following sectors where we anticipate that neurotechnology may have a major impact on UK markets on the near horizon (two to seven years):

  • The health and medical research sectors will continue to deploy invasive and non-invasive neurotechnologies for the treatment of an increasing variety of physical and mental conditions.
  • The wellbeing and sports sector may use neurotechnologies to track and / or modulate sleep, moods and productivity.
  • As employee tracking expands, the workplace may routinely deploy neurotechnology for safety, productivity and recruitment. Recruitment may be a particular ethical and legal concern.
  • Neurotechnology will expand within the entertainment and gaming sectors to offer single-player experiences linked to VR or AR, as well as drone control.
  • Neurodata may be increasingly used in the marketing sector to develop personalised recommendations.
  • Military and educational applications will develop further, but military uses are beyond the scope of this report and educational uses are not expected until the longer term, in part due to ethical and technological challenges.

Before examining the issues they could potentially raise, it is also helpful to explore what the deployment of neurotechnologies within these sectors may (and already does) look like from a data protection perspective.

Please note that these scenarios are intended to explore in brief some possible developments and uses of technology. While the scenarios include high level commentary on relevant data protection compliance issues, you should not interpret this as confirmation that the relevant processing is either desirable or legally compliant. This document does not provide ICO guidance.

In the short term (two to three years), neurotechnologies are likely to have the greatest impact in the following sectors:

The medical sector

This is an area that is likely to see increased use of invasive (surgically implanted) neurotechnologies. This may include implanted devices designed to deliver applications such as deep brain stimulation (DBS) to treat epilepsy and Parkinson’s disease.8 Other applications may focus on physical needs, with the development of advanced neuroprosthetics: for example, retinal implants that seek to provide visual information directly to the brain, or spinal implants that may assist in recovering mobility. As these potential uses involve special category health data, significant protections under the UK GDPR will apply to the processing of this information. The presence and influence of other regulatory bodies, such as the Medicines and Healthcare products Regulatory Agency (MHRA) and the General Medical Council (GMC), will also increase the regulatory oversight of this potentially high-risk area of processing.

New medical data: speech decoding

In the longer term, medical neurotechnologies may enable new forms of assistive communication. Cortical implants are already being explored as a means of electronic communication. Beyond the need to handle special category data appropriately, these technologies are likely to support people in highly vulnerable situations. While there are key benefits to these technologies, there are also risks. Speech translation, for example, presents the potential to misrepresent what a person has said, or to reveal thoughts that might otherwise have been private or meant to be edited before sharing. In both cases, highly sensitive information could be revealed with no way to recover it, leaving a person at significant risk of mental and emotional harm.

Neuromodulation may be used as a treatment for people dealing with addiction or complex psychological needs for which no current treatment is available or has been successful. Because these interventions are designed to affect a person’s behaviour on a long-term basis, they may present risks to both accuracy and fairness. If neurodata is gathered or interpreted inaccurately because of device faults or algorithmic bias, it may lead to significant harm through inappropriate treatment, or even treatment being withheld.

Non-medical data that provides medical insight

It is also likely that non-invasive wearable neural devices will become increasingly accessible to consumers. These will have the ability to gather, and potentially share, increasingly granular data with healthcare providers. These devices may be marketed as wellbeing and fitness trackers. Sharing this data with healthcare providers may allow both physical and psychological medical care to become further personalised. While this offers the opportunity for targeted and cost-effective treatment, it also raises the prospect of complex data sharing. This could lead to challenges around transparency and access to data-driven decisions, and increased pressure to repurpose data for research. While the distinction between physical and mental data may remain highly debated, the key categorisation for privacy remains whether the information is special category personal data or personal data.

In addition, the broader wellbeing sector may develop further with consumer-targeted devices. These could be used to monitor a user’s mood and responses, and even to modulate neural patterns on a more general basis than the medical devices described above. This is likely to blur the line between wellbeing and health devices. In turn, it may shift the category of data from personal (consumer) data to special category (health) data, and therefore mean different requirements for processing.9

Wellbeing neurotechnologies are likely to raise issues of transparency and could generate complex inferences that people may interpret as medical advice. Issues of automated processing via closed-loop devices may also leave people uncertain about how to exercise their information rights. We explore these challenges further in the ‘Issues’ section.

Professional sports

This sector may also see increased use of non-invasive neurotechnologies outside medical treatment. Initially, organisations may use devices and neurodata to analyse professional athletes’ responses to stimuli and their concentration levels. They could also be used to track concussive injuries and their long-term effects. However, significant questions already exist about the ownership and use of such information in professional sport, where players’ contracts are sold and values negotiated, even before considering issues such as the appropriate use and purpose of the information.

In the longer term, uses may move towards devices seeking neuroenhancement by improving reaction times and the muscular response to neural activity, potentially allowing athletes to run faster, jump higher and throw further. These uses raise significant ethical and social concerns as possible precursors to broader public neuroenhancement, concerns that go well beyond the scope of data protection legislation such as the UK GDPR. However, under the UK GDPR, issues of fairness and transparency, as well as appropriate lawful bases for processing, will continue to be relevant to personal information collected using such novel technologies.

In the medium term (four to five years):

Future of work: employer access or ownership of neurodata

Future human resources departments may be faced with another task: processing neurodata. In all the scenarios below, organisations need to consider compliance with all applicable data protection rules, including:

  • their lawful bases for processing;
  • any power imbalance between employer and employee; and
  • the need for clear purpose limitation.

Workplace safety

The employment sector is likely to make increasing use of non-invasive neurotechnology to measure, record and process a variety of personal information.10 While employee monitoring is already a contentious area of processing, EEG systems may be integrated into health and safety or risk management schemes. This could see helmets or safety equipment that measure an employee’s attention and focus rolled out in high-risk environments, for example around heavy machinery or large vehicles, especially when combined with long working hours.

Workplace wellness

Our research indicates that employee monitoring with the stated purpose of enhancing and enabling workplace wellness within the office environment is already being explored. Employees are already wearing neurotechnology devices to give them and their employers greater awareness of engagement and stress. However, biometric monitoring technologies, such as gaze and gait tracking, may be perceived as a cheaper, more accurate and easier-to-deploy alternative.

Employee hiring

Finally, workplaces could see increased use of neurodata recording techniques as part of the recruitment process. This would aid organisations that want to identify people who fit desirable patterns of behaviour or perceived traits, such as executive function.11 Research that combines biometric measures and organisational psychology has been called ‘neuromanagement’ by some.12

Workplace use of neurotechnology presents numerous risks and challenges for data protection. Conclusions drawn from the information may be based on highly contested definitions and scientific analyses of traits, as we explore in the section on regulatory issues below. They may embed systemic bias in the processing, discriminating against those who are neurodivergent. Finding an appropriate basis for processing is likely to be complex, and organisations will need to consider fairness, transparency and data retention.

Consumer data from the gaming industry

The entertainment sector has already begun to use neurotechnology for home entertainment. Games now exist that allow a player to remotely control drones via read-only neurotechnology that analyses and interprets information from the player’s brain. While these may make limited use of sensitive personal information, given the inputs required, they may nevertheless increase the risk of excessive information collection and retention.

Neurodata-led gaming is likely to emerge rapidly in the medium term. Single-player games may develop in highly limited formats with basic gameplay, but they will face the challenge of meeting customer expectations when players are used to complex, high-fidelity, online systems, as well as the additional cost of specialised equipment. Neurodata-led games at this stage are likely to focus on simple puzzle mechanics rather than the sharing of neurodata between participants. Alternatively, other EEG-based entertainment devices may focus on hands-free control of devices such as drones. Key data protection challenges for organisations offering consumer neurotechnology will lie in providing clear, intelligible descriptions of complex information gathering and automated processing, as well as ensuring that people’s information rights are accessible and implementable.

While the development of read-based neurotechnologies is likely to be limited in the medium term, there may be a significant uptake in modulating technologies aimed at gamers. These devices may claim to boost response times and improve players’ concentration and multi-tasking capacity. Given the size of the professional gaming economy, this is likely to generate questions of fairness and competitiveness. In addition, data protection concerns will remain about how this information is held and analysed, and what risks may be posed should people choose to share it without fully understanding its potential uses and inferences.

In the long term (five to seven years):

Student neurodata

The increasing enthusiasm for integrating neuroscience into the design of educational programmes has more recently extended to wearable neurotechnologies for children. Initial uses have received mixed receptions, including the termination of a project in China after the public demanded the removal of wearable brain-monitoring devices from schoolchildren.13 The higher education sector may seek to make use of wearable BCIs, such as EEGs, to measure students’ concentration and stress levels, as well as offering neuromodulation of cognitive processes to boost student performance. These technologies are likely to build on those already developed for the wellbeing sector, though they may use different software interfaces and far more long-term tracking of information linked to academic performance. Devices may offer increasingly personalised approaches to learning, highlighting areas where students excel or struggle.

The expected delay in deploying neurotechnologies in the education sector is likely to be based on ethical concerns, rather than technological barriers. In particular, there is the question of whether the technology should be used with children at all, rather than only with adult students. Issues of consent, financial accessibility and potential discrimination are likely to be critical in developing appropriate uses of neurotechnologies within education settings.

Consumer insights data

Another area that may see initial market development is direct-to-consumer neuromarketing. Neuromarketing is a well-established practice in which market researchers use recordings of brain activity to inform product development and advertising within tightly controlled environments.

In the future, non-invasive devices capable of reading responses may be used at home to build profiles of consumer preferences. This could include neurotechnology-enabled headphones that target advertising for a variety of goods, similar to cookie-enabled tracking online, with the resulting information used to populate tailored recommendations based on people’s use of search engines. Alternatively, these technologies may integrate with virtual reality devices, seeking to tailor advertising in virtual environments.

However, these approaches may remain on the fringes of the market due to the following factors:

  • They require a significant investment of time by users to generate accurate information for relatively modest returns on recommendations.
  • Users may face issues of accuracy and a lack of transparency, given the potential for opaque algorithmic systems to be implemented.
  • Outputs and decisions may be presented in a way that makes it difficult for customers to understand how their personal information has been used.
  • Issues of consent and privacy may arise in shared spaces, where selected adverts may inadvertently reveal sensitive information about a person. This may occur in a private space on a smart TV, in a public space with devices such as a smart advertising board, or even in a shared digital space such as the metaverse.

8 The latter approach is considered a high-risk procedure by some and has been seen to cause impulsive behaviour, which may limit its uptake on the near horizon.

9 Lawful basis for processing | ICO

10 Given the associated risks and relatively early-stage development, it is highly unlikely that invasive BCIs would be used in an employment context.

11 https://www.sciencedirect.com/science/article/abs/pii/S0167923623000052

12 Frontiers | Job Assessment Through Bioelectrical Measures: A Neuromanagement Perspective (frontiersin.org)

13 Brainwave-tracking start-up BrainCo in controversy over tests on Chinese schoolchildren | South China Morning Post (scmp.com)