Executive summary
Introduction
Why neurotechnology and neurodata?
Defining neurodata and neurotechnology
Neurotechnologies: Key definitions
Sector scenarios
Regulatory issues
Next steps
Annex A: Market size
Annex B: Methodology and response
Annex C: Legal, regulatory and historical context
Annex D: Key technologies

Executive summary

Neurotechnologies have continued to proliferate in the health and research sector over the past decade and may soon become part of our daily life. Our workplaces, home entertainment and wellbeing services may use neurotechnology to provide more personalised services in the years to come.

As the UK’s data protection regulator, the Information Commissioner’s Office (ICO) aims to increase public trust in how organisations process personal information through responsible practice. We want to empower people to safely share their information and use innovative products and services that will drive our economy and our society. In our ICO25 strategy, we committed to set out our views on emerging technologies to reduce burdens on businesses, support innovation and prevent harms.

This report specifically considers gathering, analysing and using information that is directly produced by the brain and nervous system, referred to as neurodata. This ranges from monitoring concentration levels at work to more distant concepts such as smart prosthetics that can mimic brain patterns for greater responsivity. This report is a short introductory guide for those who wish to know more about neurotechnologies from a regulatory perspective. It does not consider the implications of neurodata inferred from broader biometric information, such as eye movements, gait or heart rate tracking. This formed part of our earlier work around biometric technologies.

We examine neurotechnologies and neurodata and analyse their impact on privacy. We explore plausible scenarios and use cases for emerging neurotechnologies, and through these, raise the following issues:

We will address these areas of concern through:

We will address some other issues elsewhere, as we build on our Artificial Intelligence (AI) Framework and forthcoming guidance on workplace surveillance. This will include potential neurodiscrimination arising through inaccurate information or inappropriate processing and decision-making.

Introduction

The term neurotechnology can often bring to mind science fiction narratives, where machines can read our thoughts or harness our brain power to control robotic devices. Yet, this narrative obscures the powerful reality of how neurotechnologies are already revolutionising people’s daily lives.

In recent years, we have seen rapid advances in devices and methodologies that can predict, diagnose, and treat complex physical and mental illnesses by using information directly taken from the brain. Such capabilities have led to complex scientific research on how the human brain functions. This knowledge can be harnessed to provide support in the workplace, education and sport, as well as identifying new ways to entertain ourselves.

There is significant government, private sector, and public interest in the field of neurotechnology and a clear sense of the opportunities and benefits that are emerging.1 We also recognise the need to proactively scope for future potential risk and harms that could undermine the benefits of neurotechnological advances. Our work monitoring the speed and breadth of development and anticipating future applications will help us respond in a timely and proactive manner.

This analysis aims to support our ability to protect people, provide clarity for businesses and enable privacy-positive innovation. It is aimed at:

To increase understanding of possible future uses of neurodata, this report explores likely scenarios and use cases for emerging technologies and solutions in this space. These illustrate potential deployments across a number of sectors including health, employment and education. The scenarios raise key issues about gathering and using neurodata, which we examine to better understand critical challenges around emerging neurotechnologies and privacy.

We intend to address these issues through continuing proactive work with stakeholders and the public, as well as further cross-regulatory work. We are asking for views from interested organisations at the end of this report and in the longer term we are also aiming to create guidance.


1 Global Neuroethics Summit Delegates; Rommelfanger KS, Jeong SJ, Ema A, Fukushi T, Kasai K, Ramos KM, Salles A, Singh I. Neuroethics Questions to Guide Ethical Research in the International Brain Initiatives. Neuron. 2018 Oct 10;100(1):19-36. doi: 10.1016/j.neuron.2018.09.021. PMID: 30308169; Pfotenhauer SM, Frahm N, Winickoff D, Benrimoh D, Illes J, Marchant G. Mobilizing the private sector for responsible innovation in neurotechnology. Nat Biotechnol. 2021 Jun;39(6):661-664. doi: 10.1038/s41587-021-00947-y. PMID: 34099907; and iHuman Neural Interfaces Perspective (royalsociety.org)

Why neurotechnology and neurodata?

Neurotechnologies can offer significant benefits and opportunities for people, organisations, and societies. The development of neurotechnologies can:

While technological innovations can offer opportunities and challenge the status quo, they can also present new issues and risks that undermine the progress they promise. We will monitor the speed and breadth of development and anticipate future deployment in order to respond in a timely and proactive manner. It is therefore important that we consider both the potential benefits and harms of these rapidly evolving technologies.

Processing neurodata poses a significant and specific risk to people’s information rights in three distinct ways:

There are a large number of analyses of neurotechnologies. For example, those produced by the Royal Society, the Future of Privacy Forum, the Law Society, the Council of Europe, UKRI’s Knowledge Transfer Network, UNESCO, the IEEE and the UK’s Regulatory Horizons Council.2 In many of these, the focus has been either medical applications of neurotechnologies or the broader ethical issues raised by the future uses of neurodata. This includes reports that explore the potential creation of new neurorights in legislation, such as the right to neuroprivacy (see Annex C for details).

Neurorights are not directly analogous to data protection, and the requirements of UK data protection law are distinct. However, there is overlap between the two areas, and ethical issues such as neuroprivacy remain important. It is only recently that reports have begun to explore data protection issues directly, often on a global basis.3 We have written this report within this context.

In addition to briefly examining the legal, regulatory and ethical context, we also consider emerging market indicators about neurotechnologies, such as funding and patents. Understanding the broader market is important in assessing which sectors are likely to see markets develop first and what issues may emerge.

At a national level, there is clear evidence that the UK private sector is investing in neurotechnology, with some 34 companies focusing on this sector. On a global scale, investment in neurotechnologies and the creation of related patents continues to increase significantly across a variety of sectors. This growth reflects the potential to develop and deploy neurotechnologies in regions where data protection regimes differ significantly from the UK GDPR. In certain cases, use of these neurotechnologies may not adhere to the expectations we have for fairness and transparency in the way they use personal information. In turn, this may pose significant challenges if these devices become common in the UK or are used by those with data rights under the UK GDPR.

In recent years, consumer devices have become increasingly prominent alongside the historically dominant medical devices. This includes devices focused on psychiatric research, wristband-based neural interfaces, and the development of surgically implanted (invasive) and wearable (non-invasive) devices designed to enhance cognition and access to online spaces.4 Read Annex A for further analysis.


2 iHuman Neural Interfaces Perspective (royalsociety.org), FPF-BCI-Report-Final.pdf, https://rm.coe.int/report-final-en/1680a429f3, A-transformative-roadmap-for-neurotechnology-in-the-UK.pdf (ktn-uk.org), 384185eng.pdf (unesco.org) and Neurotechnology Regulation The Regulatory Horizons Council (publishing.service.gov.uk).

3 https://rm.coe.int/report-final-en/1680a429f3

4 Gartner Maverick-_Research__731765_ndx.pdf and 1680a429f3 (coe.int)

Defining neurodata and neurotechnology

There isn’t an agreed definition of neurodata. The recent UNESCO International Bioethics Committee Report on Neurotechnology uses the term neurodata to describe personal brain data. It states that neurodata is “data relating to the functioning or structure of the human brain of an identified or identifiable individual that includes unique information about their physiology, health, or mental states”. This is a definition drawn from the Organisation for Economic Cooperation and Development’s (OECD) Recommendation on Responsible Innovation in Neurotechnology.5

Neurotechnology, another widely debated term, is also defined by the OECD as “devices and procedures that are used to access, investigate, assess, manipulate, and emulate the structure and function of neural systems.”6

There is no explicit description or definition of neurodata (or neurotechnologies) under the UK GDPR or other data protection legislation. Neurodata is likely to link to the ‘mental identity’ definition of personal data under Article 4(1). However, neurodata is not specified in the Article’s text, unlike biometric data (which is discussed in our Biometrics: Insight report). The broad category of ‘mental’ identity may only cover aspects of neurodata gathered directly from the brain and may not include data gathered from the nervous system by devices that gather information from the spinal cord, for example. However, for the purposes of this report, neurodata includes information gathered from both the brain and nervous system.

There are contexts and uses that place neurodata clearly within the definition of special category data under Article 9 of the UK GDPR. For example, where neurodata relates to health, ethnicity or sexuality. When processing special category data, the UK GDPR puts additional safeguards in place. However, as we will explain in this report, in many cases whether neurodata is also special category data is likely to depend not on the specific technology, but on the purpose of processing.

ICO definitions

For the purposes of this report, we therefore define neurodata as:

“first order data gathered directly from a person’s neural systems (inclusive of both the brain and the nervous systems) and second order inferences based directly upon this data”.

This helps us to define the scope for this report. We will consider information drawn from both the brain and the neural system, as well as morphological data (data allowing identification as well as classification), but exclude neurodata inferred via biometric technologies and their data.

We then define neurotechnology as:

“consumer, enterprise and healthcare devices and procedures, both invasive and non-invasive, that directly record and process neurodata for the purposes of gathering data, controlling interfaces or devices, or modulating neural activity”.

This definition does not directly include approaches that ‘emulate’ neural activity at this time. This is due to the significant overlap with algorithmic processing that mirrors neural activity without being directly drawn from the source. However, if appropriate, we will consider this in specific circumstances, such as smart prosthetics.

For example, a wearable device such as a headband (a neurotechnology) may gather raw information on brain patterns (neurodata). Through algorithmic analysis, it may indicate how alert a person is (first order data). Following this, wider inferences about future performance or behavioural patterns might be extrapolated and even combined with additional data (second order inferences).

While we are focusing on first order data captured from brains, this does not amount to ‘mind reading’. The information produced by devices is often binary, a categorisation of neural responses as ‘either / or’ rather than a detailed picture of a person’s thoughts. This is particularly likely when discussing wearable or non-invasive devices that are not surgically implanted, such as headbands. Long term plans for neurotechnologies may seek to capture phenomenological responses, such as memories evoked by sight, sound or taste, or even images from a person’s mind. However, these remain largely theoretical, lab-based approaches at present. In either case, at this time, more granular information from the brain is largely obtained from invasive technologies that are not accessible to the broader population. Most people are more likely to gain access to wearable brain activity sensing and recording devices in the near term.

We have identified relevant use cases for neurodata that illustrate potential data protection concerns in sectors including employment, entertainment, healthcare and education. Through our research approach set out in Annex B, we have developed the scenarios presented below. Annex C sets out the contemporary legal and regulatory context of this report.

For further supporting information on definitions about neurotechnologies, Annex D sets out a brief chronological and technical exploration of developments central to our analysis.


5 UNESCO International Bioethics Committee. Report of the International Bioethics Committee of UNESCO (IBC) on the Ethical Issues of Neurotechnology Dec. 2021

6 OECD Recommendation on Responsible Innovation in Neurotechnology - OECD

Neurotechnologies: Key definitions

Invasive and non-invasive neurotechnologies

The most fundamental difference between neurotechnologies for the purpose of this report is between invasive and non-invasive devices:

Read and write neurotechnologies

The distinction between non-invasive and invasive neurotechnologies is an important one, both technically and medically. However, it is also critical to note that so-called ‘non-invasive’ devices can still interact in quite intimate ways with the brain. Another way to distinguish between devices is between those that record and analyse the neurodata they receive and those that stimulate or modulate neural patterns. Essentially, we can consider both invasive and non-invasive technologies under the following divisions:

The ability to modulate brain activity can apply to both invasive and non-invasive technologies and may significantly increase the risk of processing personal information, which we will consider in this report.

Additional definitions

There are a variety of ways to differentiate neurotechnologies that may influence the way data can be processed and the level of involvement given to a person using a technology. Active devices require a deliberate task or stimulus to generate a neural response, such as finger movement, mental arithmetic or music imagery. Reactive devices require an external cue to record a specific response, such as music, imagery, pain or even a question. Passive devices record subconscious, unprompted and more generalised responses from a person, such as fatigue levels, attention span or arousal.

Synchronous and asynchronous devices also differ: synchronous devices read on a predefined schedule, while asynchronous devices allow the users of a device to interact and communicate at a time of their choosing. Linked to this definition are closed loop and open loop systems. Closed loop neurotechnologies operate on an autonomous basis, reacting or inputting on the strength of their programming and algorithmic processing. Open loop systems are ‘open’ in the sense that the people wearing or implanted with the device can choose when to make an intervention or action via the device.

See examples of technologies deploying these various approaches in Annex D.


7 Although this issue will reduce with improved miniaturisation.

Sector scenarios

We found that the use of neurodata is growing and that the short-term horizon will be a critical period for the emergence and potential uptake of neurotechnologies in many sectors. Research and medical uses of neurotechnologies are relatively advanced. However, several other sectors are expected to expand significantly. Some sectors remain beyond our scope in this report, such as the military uses of neurotechnology. Therefore, we have identified the following sectors where we anticipate that neurotechnology may have a major impact on UK markets on the near horizon (two-seven years):

It is also helpful to explore what the deployment of neurotechnologies within these sectors may (and in some cases already does) look like from a data protection perspective, before examining the issues they could potentially raise.

Please note that these scenarios are intended to explore in brief some possible developments and uses of technology. While the scenarios include high level commentary on relevant data protection compliance issues, you should not interpret this as confirmation that the relevant processing is either desirable or legally compliant. This document does not provide ICO guidance.

In the short term (two-three years), the following sectors are where neurotechnologies are likely to have the greatest impact:

The medical sector

This is an area that is likely to see increased uses of invasive (surgically implanted) neurotechnologies. This may include implanted devices designed to deliver applications including deep brain stimulation (DBS) to treat epilepsy and Parkinson’s disease.8 Other applications may focus on physical needs, with the development of advanced neuroprosthetics. For example, retinal implants that seek to provide visual information directly to the brain, or spinal implants that may assist in recovering mobility. As these potential uses will involve special category health data, significant protections under the UK GDPR will apply to processing this information. The presence and influence of other regulatory bodies, such as the Medicines and Healthcare products Regulatory Agency (MHRA) and General Medical Council (GMC), will also increase the regulatory oversight of this potentially high risk area of processing.

New medical data: speech decoding

In the longer term, medical neurotechnologies may enable new forms of assistive communication. Cortical implants are already being explored as a means of electronic communication. Beyond the need to handle special category data appropriately, these technologies are likely to support people in highly vulnerable situations. While there are key benefits to these technologies, there are also risks. Speech translation, for example, presents the potential to misrepresent what a person has said, or to reveal thoughts that might otherwise have been private or meant to be edited before sharing. In both cases, highly sensitive information could be revealed with no way to recover it, leaving a person at significant risk of mental and emotional harm.

Neuromodulation may be used as a treatment for people dealing with addiction or complex psychological needs for which no current treatment is available or has been successful. Designed to impact a person’s behaviour on a long-term basis, this may present risks to both accuracy and fairness. If neurodata is inaccurately gathered or interpreted due to issues with a device or algorithmic bias, it may lead to significant harms due to inappropriate treatment or even treatment being withheld.

Non-medical data that provides medical insight

It is also likely that non-invasive wearable neural devices will become increasingly accessible to consumers. These will have the ability to gather, and potentially share, increasingly granular data with healthcare providers. These devices may be marketed as wellbeing and fitness trackers. By sharing this data with healthcare providers, this may allow both physical and psychological medical care to become further personalised. While this offers the opportunity for targeted and cost-effective treatment, it also raises the prospect of complex data sharing. This could lead to challenges around transparency and access to data-driven decisions, and an increased pressure to repurpose data for research purposes. While the distinction between physical and mental data may remain highly debated, the key privacy categorisation remains whether the information is personal data or special category personal data.

In addition, the broader wellbeing sector may develop further with consumer targeted devices. These could be used to monitor a user’s mood, responses, and even to modulate neural patterns on a more general basis than the medical devices described above. This is likely to blur the line between wellbeing and health devices. In turn, it may alter the category of data from personal (consumer) data to special category (health) data and therefore mean different requirements for processing.9

Wellbeing neurotechnologies are likely to raise issues of transparency and could present possible complex inferences that people may interpret as medical advice. Issues of automated processing via closed loop devices may also leave people uncertain about how to exercise their information rights. We explore these challenges further in the ‘Issues’ section.

Professional sports

This sector may also see increased uses of non-invasive neurotechnologies outside medical treatment. Initially, organisations may use devices and neurodata to analyse professional athletes’ responses to stimuli and their concentration levels. They could also be used to track concussive injuries and their long-term effects. However, significant questions already exist about the ownership and use of such information in professional sports, as players’ contracts are sold and values negotiated, even before considering issues such as the appropriate use and purpose of information.

In the longer term, uses may move towards devices seeking neuroenhancement by improving reaction times and muscular response to neural activity; potentially allowing athletes to run faster, jump higher and throw further. These uses raise significant ethical and social concerns, as possible precursors towards broader public neuroenhancement that go well beyond the scope of data protection legislation, such as the UK GDPR. However, under the UK GDPR issues of fairness and transparency, as well as appropriate lawful bases for processing, will continue to be relevant to personal information collected using such novel technologies.

In the medium term (four-five years):

Future of work: employer access or ownership of neurodata

Future human resources departments may be faced with another task; processing neurodata. In all the scenarios below, organisations need to consider compliance with all applicable data protection rules, including:

Workplace safety

The employment sector is likely to make increasing use of non-invasive neurotechnology to measure, record and process a variety of personal information.10 While employee monitoring is already a contentious area of processing, EEG systems may be integrated as part of a health and safety or risk management scheme. This could see helmets or safety equipment that measure the attention and focus of an employee rolled out in high risk environments. For example, around heavy machinery or a large vehicle, especially combined with long working hours.

Workplace wellness

Our research has indicated that employee monitoring with the stated purpose of enhancing and enabling workplace wellness within the office environment is already being explored. Wearable neurotechnologies are being worn by employees to help them and their employers have greater awareness of employee engagement and stress. However, biometric based monitoring technologies, such as gaze and gait tracking, may be perceived as a cheaper, more accurate and easier-to-deploy alternative.

Employee hiring

Finally, workplaces could see increased use of neurodata recording techniques as part of the recruitment process. This will aid organisations that want to identify people who fit desirable patterns of behaviour or perceived traits, like executive function.11 Research that combines biometric measures and organisational psychology has been called by some ‘neuromanagement’.12

Workplace use of neurotechnology presents numerous risks and challenges for data protection. Conclusions drawn from information may be based on highly contested definitions and scientific analyses of traits, as we explore in the below section on regulatory issues. They may embed systemic bias in the processing, discriminating against those who are neurodivergent. Finding an appropriate basis for processing is likely to be complex, and organisations will need to consider fairness, transparency and data retention.

Consumer data from the gaming industry

The entertainment sector has already begun to use neurotechnology for home entertainment. Games now exist allowing a player to remotely control drones via read-only neurotechnology that analyses and interprets information from the player’s brain. While these may make limited use of sensitive personal information due to the inputs required, they may nevertheless increase the risk of excessive information collection and retention.

Neurodata-led gaming is likely to emerge rapidly in the medium term. There is the possibility that single-player games will develop in highly limited formats with basic gameplay. But the challenge facing these will be meeting customer expectations when players are used to complex, high fidelity and online systems, as well as the additional costs of specialised equipment. Neurodata-led games at this stage are likely to focus on simple puzzle mechanisms rather than a sharing of neurodata between participants. Alternatively, other EEG based entertainment devices may focus on the control of devices such as drones, offering hands-free control of the device. Key data protection challenges for organisations offering consumer neurotechnology will lie in providing clear, intelligible descriptions of complex information gathering and automated processing, as well as ensuring that people’s information rights are accessible and implementable.

While the development of read-based neurotechnologies is likely to be limited in the medium term, there may be a significant uptake in the use of modulating technologies aimed at gamers. These devices may claim to boost response times and improve players’ concentration and multi-tasking capacity. Given the size of the professional gaming economy, this is likely to generate questions of fairness and competitiveness. In addition, data protection concerns will remain about how this information is held and analysed, and what risks may be posed should people choose to share it without fully understanding its potential uses and inferences.

In the long term (five-seven years):

Student neurodata

The increasing enthusiasm for integrating neuroscience into the design of educational programming has more recently included wearable neurotechnologies for children. Initial uses have received mixed receptions, including the termination of a project in China after the public demanded the removal of wearable brain monitoring devices from children.13 The higher education sector may seek to make use of wearable BCIs, such as EEGs, to measure students’ concentration and stress levels, as well as offering neuromodulation of cognitive processes to boost student performance. These technologies are likely to build on those already developed for the wellbeing sector. They may use different software interfaces and far more long-term tracking of information linked to academic performance. Devices may offer increasingly personalised approaches to learning; highlighting areas where students excel or struggle.

The expected delay in deploying neurotechnologies to the education sector is likely to be based on ethical concerns, rather than technological barriers. In particular, whether there should be any attempts to use the technology for children, rather than adult students. Issues of consent, financial accessibility and potential discrimination are likely to be critical in developing appropriate uses of neurotechnologies within education settings.

Consumer insights data

Another area that may see initial market development in the medium term is direct to consumer neuromarketing. Neuromarketing is a well-established practice of market researchers, who use recordings from the brain to inform product development and advertising within tightly controlled environments.

In the future, non-invasive devices capable of reading responses may be used at home to tailor consumer preferences. This could include neurotechnology-enabled headphones that might target advertising and commercials for a variety of goods, similar to cookie-enabled tracking online. This could be used to populate new tailored responses based on people’s use of search engines. Alternatively, these technologies may integrate with virtual reality devices, seeking to tailor advertising in virtual environments.

However, these approaches may remain on the fringes of the market due to the following factors:


8 The latter approach is considered a high-risk procedure by some and has been seen to cause impulsive behaviour which may limit its uptake on the near horizon.

9 Lawful basis for processing | ICO

10 Given the associated risks and relatively early-stage development, it is highly unlikely that invasive BCIs would be used in an employment context.

11 https://www.sciencedirect.com/science/article/abs/pii/S0167923623000052

12 Frontiers | Job Assessment Through Bioelectrical Measures: A Neuromanagement Perspective (frontiersin.org)

13 Brainwave-tracking start-up BrainCo in controversy over tests on Chinese schoolchildren | South China Morning Post (scmp.com)

Regulatory issues

Issue 1: Regulatory definitions

Personally identifiable neurodata is always considered to be personal information irrespective of purpose. However, there is no explicit definition of neurodata as either a specific form of personal information or special category data under the UK GDPR. Therefore, organisations need to carefully consider both:

The key challenges include:

Medical neurodata

When neurodata is collected and processed for medical purposes, for example, it is likely to be special category health data (SCD) under Article 9(1) of the UK GDPR. It therefore requires a lawful basis for processing under Article 6 and satisfaction of a condition for processing special category data under Article 9. Organisations must identify an appropriate basis for processing; in some cases, explicit consent may be the appropriate lawful basis and special category condition.

Some groups, such as the private enterprise Neurorights Foundation, have recommended that explicit consent is provided before neurodata is processed in every case.14 We should handle such calls carefully; while medical consent remains a distinct and important issue, explicit consent for data processing is only one of a variety of appropriate special category conditions under the UK GDPR. It is not inherently ‘better’ than other conditions; organisations should consider carefully what is most appropriate.

Any wider automatic reliance on consent for using personal information for consumer purposes could also cause confusion and may well be inappropriate under the UK GDPR. The wider dialogue and calls for the use of consent may lead people to assume they have the right to automatically withdraw consent to organisations using their information. In fact, organisations may use other appropriate bases for processing, and it is for them to be transparent about which basis they have used and what rights are applicable. Instead of always focusing on consent, transparency of processing may prove more effective in helping people understand how organisations are using their information.

Personal, but rarely special category biometric data

In rare cases, organisations may use neurodata directly to identify or verify a natural person. Where this is the purpose, it is special category biometric data that falls under Article 9(1) of the UK GDPR. However, while technically feasible, most uses are likely to be classificatory, as explored in the scenarios. This is because identifying people in this way is expensive and complex compared to other robust biometric methodologies. Where the information may allow organisations to identify people, neurodata may still be biometric data under Article 4(14) of the UK GDPR. In that case it is personal information but not special category data. (Special category biometric data requires organisations to process personal information for the purpose of unique identification.) Organisations processing personal information need to consider when and how the information they are using may allow a person to be identified and what the likely impact may be.

Classificatory neurodata

By contrast, organisations may extensively use some large scale neurodata without applying the additional safeguards for processing special category data.

For example, many of the above scenarios discuss classifying people emotionally and behaviourally, for purposes including employment, wellbeing, or entertainment. There is therefore a risk of further profiling or even de-pseudonymisation. This is due to the complexity of the information gathered and the increased ease with which information can be associated with a person. Organisations could purposefully link information to a person post-identification or verification in order to realise its maximum benefit.

In these cases, organisations may have information that does not meet the Article 9 UK GDPR definition of special category data, but could still cause substantial harm if misused (in particular, loss of autonomy, discrimination, chilling effects and personal distress).15 Large scale processing of such information is likely to pose a challenge to encouraging best practice. This highlights the need to treat neurodata as high impact and high risk even when used in contexts that do not explicitly count as special category data. Finally, organisations also need to remain aware of whether personal information may become special category data. For example, tracking employee information such as concentration levels could reveal mental health data.

High risk neurodata

There are robust protections in place for processing all personal information under the UK GDPR. For organisations processing neurodata it is important to be clear about when neurodata is considered data about health for the purposes of Article 9 UK GDPR. This is an issue we have explored further in our recent Technology Horizons Report. Organisations should not assume that neurodata is immediately health data simply because it derives from a person’s physiology. Organisations must also be clear about when complex processing involves processing biometric data and the situations when biometric data is special category data.

Issue 2: Neurodiscrimination

Processing neurodata is particularly novel and poses a significant risk because of the intimate nature of the personal information that it could reveal. Neurotechnology can collect information that people are not aware of. Neurodata can include estimations of emotional states, workplace or educational effectiveness and engagement, and medical information about mental health, among many other types of data. Organisations can significantly risk people’s data protection rights by collecting and further processing these categories of personal information.

Neurotechnologies pose a particular risk if they analyse emotion or complex behaviour (rather than the level of concentration or indication of a neurodegenerative pathology for example). The science underpinning the analysis of human emotion is highly debated (as we have explored in our Biometrics Foresight report). Many stakeholders and scholars have significant concerns about the ability of algorithms to accurately detect emotional cues. The process of drawing such complex inferences from sets of quantitative human brain data is expected to remain enormously challenging.

As organisations derive and analyse increasingly large data sets, new forms of discrimination may emerge that have not previously been recognised under associated legislation, such as the Equality Act 2010. Without robust and independent verification of these models, there is a risk that these approaches will be rooted in systemic bias and likely to provide inaccurate and discriminatory information about people and communities. This information may then feed into automated systems in many instances, raising further questions over processing and transparency under Article 22 of the UK GDPR (which sets out rights related to automated processing and profiling, discussed above). In particular, neurodivergent people may be at risk of discrimination from inaccurate systems and databases that have been trained on neuro-normative patterns.

Alternatively, active, rather than systemic, discrimination may emerge. Organisations may view specific neuropatterns and information as undesirable where these are not considered a protected characteristic under current legislation, such as the Equality Act 2010. People may experience unfair treatment in the work or services they are offered on the basis of their perceived emotional states, or even previously unrecognised or undiagnosed physical or mental conditions.

Discrimination may also occur through devices themselves, not just through organisations collecting and using people’s personal data (described above). Experts have noted that risks can emerge when devices are not trialled and assessed on a wide variety of people to ensure that data collection remains accurate and reliable. This may be as simple as ensuring that devices sit appropriately and comfortably in order to gather accurate and appropriate information. If this does not take place, there is an increased risk that data sets become biased and incomplete due to device calibration issues.

As noted above, in non-medical contexts neurodata may not be classified as special category data. This reduces the legal safeguards and restrictions around its processing, and may result in organisations failing to implement best practice, for example around technical security, to ensure that neurodata remains safe from loss or theft. This risk around the classificatory nature of the information is also discussed above.

Issue 3: Consent; neurodata and appropriate bases of processing

Are there any circumstances in which a person can provide fully informed consent to organisations to use their personal information when they are not aware of the exact nature of that information? This is the fundamental question when considering whether organisations can obtain valid consent for processing neurodata. Even when using neurodata that does not meet the threshold for special category data, organisations must still identify a lawful basis for processing personal data under Article 6 of the UK GDPR. Potentially relevant bases organisations should consider for commercial purposes are consent, legitimate interests and performance of a contract.

For example, if a person is using an EEG headset to improve their online gaming performance, can they truly be aware of and understand the precise nature of the information that they are likely to reveal? Can the organisation also know this? Further heightening the risks of using consent is the fact that many people are unlikely to possess the technical knowledge about collecting and using neurodata to fully understand the information flows. However, organisations may consider whether they can provide specific guarantees about the inferences that they intend to draw from the information they gather in order to obtain valid consent. Where organisations rely on this, they should review our guidance on the use of consent as a basis for processing.

Even within scenarios about employment, organisations must demonstrate a clear need for using neurodata over other techniques for gathering the information. Given the power imbalance between employer and employee, it is likely that consent is not the appropriate basis for processing in most cases. 

When consent for processing is inappropriate, organisations also need to consider whether legitimate interests or performance of a contract is an appropriate basis. This may prove particularly important for entertainment or wellbeing processing. However, it may prove difficult to pass the three-part test for legitimate interests in such cases. This is because of the high risk and intimate nature of the information derived by devices, as well as the difficulty in setting out clear expectations and understandings for people about what information they may provide.

As noted above, we already provide guidance on the bases for processing under the UK GDPR that any organisation planning on processing neurodata should review.

Issue 4: Closed-loop processing poses heightened risks around shifts in purpose and in automated processing

Expert stakeholders have raised concerns with us that closed-loop processing will become increasingly prevalent across emerging neurotechnology devices. These devices will use automated algorithmic processing that assesses personal information in the form of electrical patterns from the brain, and will take automated action unprompted by the user and without significant human intervention. Closed-loop processing is being explored to enhance the clinical function of neurotechnologies, particularly implantable devices. Closed-loop neurotechnology, which often uses AI or machine learning (ML), can heighten the risk of inappropriate automated processing. Under Article 22 of the UK GDPR, people “have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her” unless the appropriate conditions for processing are met under Article 22(2). While one of these conditions is explicit consent, as noted above, this is not without its own challenges.

Our guidance about automated decision-making and profiling sets out that a decision that has a ‘similarly significant’ effect is something that has an equivalent impact on a person’s circumstances, behaviour or choices. As explored through the scenarios above, neurotechnologies and their associated processing may have a significant impact on people’s behaviour (eg by affecting concentration, productivity or sleep). Even where appropriate conditions for the solely automated processing of information exist, this presents a significant challenge. Meaningful human intervention under the UK GDPR must be able to challenge and, if necessary, reverse decisions. This may not be possible with neurostimulation or brain-to-speech outputs, for example. Organisations must consider what appropriate intervention may look like for neurodata and neurotechnologies.

For example, device parameters may have been previously set (and altered in the future) by users to define how their information is processed. They may be reviewed at intervals by the organisation for quality and research purposes. However, this is unlikely to meet the requirements for meaningful intervention under Article 22. This is because it occurs before the processing, rather than after it. Organisations also need to consider the role of the person in the data flows; inputting data or parameters alone is still likely to lead to solely automated processing.

In other uses, such as wellbeing, employment or entertainment, organisations may need to implement appropriate human involvement as an alternative to solely automated processing.

In addition, there is potential for processing for neurostimulation or neuromodulation to fundamentally alter a person’s capacity to evaluate their personal information and make decisions about this. Perhaps, more broadly relevant, is the fact that many people may feel they lack expertise to understand and make decisions on how to interact with a complicated system, especially when the device is the only or best treatment available to them.

Finally, the complexity of closed loop processing may affect both the transparency and accuracy of personal information. Organisations using solely automated decision-making should ensure that they do not breach Article 22. Even where meaningful human intervention is present in a system, organisations should consider our AI guidance. This explains that sufficiently complex algorithmic processing may be considered as solely automated processing due to its complexity and lack of transparency for the device’s users.

Issue 5: Accuracy and data minimisation surrounding neurodata

Gathering and using neurodata could challenge organisations’ ability to comply with the accuracy requirements under the UK GDPR. Reduced accuracy may result from:

Under Article 5(1)(d) of the UK GDPR, personal data must be “accurate and, where necessary, kept up to date; every reasonable step must be taken to ensure that personal data that are inaccurate, having regard to the purposes for which they are processed, are erased or rectified without delay (‘accuracy’)”.

This raises an important question for neurodata and neuroplasticity: how long does neurodata remain accurate? Some information is permanent or intrinsic, such as a date of birth, genetic information or certain hard biometric data. However, neurodata is in flux from one moment to the next. Neurodata may become more detailed and accurate with advancements in recording and sensing capabilities. If combined with other types of biometric data, it may also reveal new insights that are currently not possible. Organisations should consult our AI guidance when considering thresholds of accuracy compared to the impact of the inferences they are drawing.

Because of this, organisations using neurodata need to ensure that they do not base decisions on singular instances or snapshots of neurodata. This is because many, if not most, techniques for interpreting neurodata rely on significant quantities of comparative data gathered over time to achieve accuracy.

While organisations should gather sufficient information for processing purposes, they need to make it clear that they take decisions at a specific point in time. For example, any decisions or outputs that organisations reach at a particular time may have been accurate at that stage, but they may not be accurate at a later date because of the brain’s neuroplasticity.

There may be an appropriate reason to retain this information, particularly concerning health data and medical treatment. But this connects to the requirements for data minimisation. Organisations should try to retain as little information as they require to provide accurate and fair outputs. They need to achieve a balance between retaining information to ensure accuracy and fairness, while not retaining excessive information.

Issue 6: Neurodata and research

Stakeholders have informed us that neuroscience research requires longitudinal information from additional sources, and especially from outside of the laboratory, due to its developing state. Medical researchers in particular are eager to gain access to information from commercial devices in order to better understand neurodegenerative conditions and especially mental health. This presents potentially complex data flows that could make it challenging for organisations to provide transparency information.

Organisations looking to share their information for this purpose should read our guidance on the research provisions under the UK GDPR.

Issue 7: Information rights (including to be forgotten, portability, of access)

Emerging neurotechnologies may create new challenges to people exercising their information law rights. This is something that any organisation that processes personal information using these technologies must be aware of and responsive to. The following examples highlight some of the issues linked to data rights under the UK GDPR:


14 They propose to demand a 'Hippocratic oath' from technologists (lavanguardia.com)

15 regulatory-policy-methodology-framework-version-1-20210505.pdf (ico.org.uk)

Next steps

We understand the need for further work in this area from a regulatory perspective due to the range of potential uses of neurotechnologies on the near horizon that we have identified in this report. As part of this process, we will continue to scrutinise the market and identify key stakeholders who are seeking to develop or deploy technology in this area. This will help us to continue building our knowledge and understanding of the issues raised. We will also continue to work with stakeholders and others to explain the importance of privacy by design and compliant use of personal information.

We will work with the public to better understand their concerns and questions about emerging neurotechnologies and their personal information.

In the longer term, we are developing specific neurodata guidance as a core part of our ongoing work in this area. By 2025, it will consider the interpretation of core definitions and approaches and key links to our existing guidance, set out our views on emergent risks, and provide use-based and sector-specific case studies to highlight good practice.

In support of this work, we also want to continue to work with critical stakeholders. We want to hear from organisations working in this sector, whether in the development of neurotechnologies, their deployment, or through thinking about their implications in a policy-based or regulatory context. We would very much like to hear from you as we continue to develop our knowledge and thinking in this area. We can be reached at: [email protected].

Annex A: Market size

The global market for neurotechnology is expected to grow in the coming years, with the potential to generate significant economic benefits. The UK’s Regulatory Horizons Council (RHC) notes that:

“Neurotechnology is predicted to become a significant market with the potential to generate substantial economic benefits, valued at US$17.1 billion globally by 2026, with the largest segments being neuromodulation, neuroprosthetics and neurosensing.”16

While there are a range of estimates available on the overall size of the neurotechnology market,17 you should treat these with a degree of caution.

An OECD paper18 highlights the potential economic value of neurotechnology in the healthcare sector. For example, neurotechnology has the potential to deliver treatments for diseases causing progressive decline in brain functionality, including dementia. Dementia currently affects 50 million people worldwide at an estimated cost of $1 trillion per year.19 Other health conditions that could potentially benefit from neurotechnology include:

Public and private sector activity

Public Sector

Between 2011 and 2020, the UK publicly invested around £98 million ($111 million) in research funding for neurotechnology.21 This funding was split amongst 251 research projects across all regions of the UK. It largely went to academic institutions such as universities, with the largest concentration in London and the South East (22%).

Although the UK funded neurotechnology research at an earlier stage, international comparisons show that UK public investment lags behind other developed economies in both absolute and relative terms.

Table 1 shows the scale of public investment in neurotechnology across a range of countries. This includes eight of the 10 largest economies in the world and therefore is likely to capture a high proportion of global public investment in neurotech.22

Table 1: Public sector investment by country

Country | Value of public investment | Duration of the investment | Annually (to nearest million) | Investment as a percentage of GDP23
USA | $3.3 billion | 2013 to 2023 | $300 million | 0.0014
EU | $1.08 billion | 2013 to 2023 | $98 million | 0.0006
Korea, Republic of | $42 million | 2021 | $42 million | 0.0025
Australia | $350 million | 2016 to 2020 | $70 million | 0.0050
Japan | $307 million | 2014 to 2024 | $28 million | 0.0005
Canada | $171 million | 2011 to 2019 | $19 million | 0.0011
UK | $111 million | 2011 to 2020 | $11 million | 0.0004

Source: KTN (2021), ICO analysis
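The derived columns in Table 1 follow from simple arithmetic: the annualised figure spreads the total evenly over the investment period, and the GDP share divides that annual figure by 2019 GDP (per footnote 23). A minimal sketch checking the USA row — note this is our illustration, not the KTN methodology, and the 2019 US GDP figure of roughly $21.4 trillion is our assumption from World Bank data:

```python
def annualise(total_usd: float, years: int) -> float:
    """Spread a multi-year investment total evenly across its duration."""
    return total_usd / years

def share_of_gdp(annual_usd: float, gdp_usd: float) -> float:
    """Annual investment expressed as a percentage of GDP."""
    return annual_usd / gdp_usd * 100

# USA row: $3.3 billion over 2013 to 2023 (11 years inclusive),
# against an assumed 2019 GDP of ~$21.4 trillion.
usa_annual = annualise(3.3e9, 11)
print(round(usa_annual / 1e6))                        # 300 ($ million per year)
print(round(share_of_gdp(usa_annual, 21.4e12), 4))    # 0.0014 (% of GDP)
```

The same two steps reproduce the remaining rows of Table 1 from the total, duration and each country's 2019 GDP.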

UKRI’s 2021 Knowledge Transfer Network (KTN) report also identified a programme of public investment in China. While the size and duration of this investment is unknown, KTN expect it to be ‘substantial’.24

Private Sector

Limited data is available on the number of private sector companies involved in neurotechnology, particularly in the UK. To address this, we have produced estimates using FactSet, a database that compiles information on companies globally from publicly available sources. We were also unable to source data on the relevant organisations in China.

We defined a number of tags, or search terms, to identify the business activities of firms that are likely to be involved in the neurotechnology sector.25 Table 2 lists the 10 countries with the greatest number of firms operating in the sector, based on FactSet data. The United States has the largest number of domestic companies involved in neurotechnology; the UK comes second, with 39 companies working on neurotechnology development. We were not able to find data on the size of these companies.

Please note that no data was available on the value of private sector investment.

Table 2: Number of firms by country, top 10 globally

Country | Number of firms
United States | 386
United Kingdom | 39
Canada | 34
Germany | 22
Israel | 20
Switzerland | 19
France | 17
Italy | 14
Australia | 12
Spain | 10

Source: ICO analysis of FactSet data

Patents

Patents are an alternative measure of private sector activity. They are often used as a proxy for innovation, as well as an indicator of investment in research and development. We gathered data from Lens.org to provide up-to-date analysis.

The database search uses a range of parameters. We considered the country of the inventor, rather than the jurisdiction in which the patent was filed, to be the best indicator of activity within a country, as the same invention can be patented in multiple jurisdictions and inventors choose where to file based on a myriad of factors.

We used a broad range of search criteria and captured patents from all sectors. As a result, we found a greater number of patents than studies focused on health.26 We examined data from 2013 to 2022, in order to show the current state of the market and recent growth in activity. We searched the following terms:

We’ve set out the three metrics below that gave us insight into private sector activity.

We found that over the past 10 years there has been a three-fold increase in the number of patents granted for neurotechnology annually. In that period, UK inventors were granted 1,780 patents (2.8%), as evidenced in Table 3 below.

Table 3: Patents granted by country of inventor globally, 2013 to 2022

Country | 2013 | 2014 | 2015 | 2016 | 2017 | 2018 | 2019 | 2020 | 2021 | 2022 | Total
United States | 2,709 | 3,085 | 3,214 | 3,765 | 4,176 | 4,450 | 5,916 | 6,345 | 6,566 | 6,661 | 46,887
Germany | 114 | 139 | 148 | 177 | 235 | 280 | 324 | 343 | 340 | 323 | 2,423
Canada | 156 | 143 | 135 | 191 | 192 | 205 | 299 | 316 | 327 | 341 | 2,305
United Kingdom | 111 | 114 | 94 | 128 | 139 | 143 | 217 | 264 | 291 | 279 | 1,780
Korea, Republic of | 20 | 56 | 91 | 124 | 158 | 206 | 248 | 307 | 316 | 398 | 1,924
Japan | 117 | 141 | 136 | 139 | 173 | 171 | 259 | 277 | 329 | 289 | 2,013
China | 24 | 36 | 37 | 72 | 102 | 105 | 176 | 222 | 250 | 342 | 1,366
Israel | 109 | 107 | 139 | 125 | 154 | 197 | 192 | 220 | 262 | 303 | 1,808
Switzerland | 39 | 53 | 53 | 71 | 87 | 108 | 128 | 154 | 156 | 175 | 1,024
Netherlands | 47 | 87 | 63 | 80 | 86 | 111 | 136 | 159 | 131 | 163 | 1,063
Rest of the world | 45 | 68 | 91 | 67 | 105 | 138 | 123 | 122 | 160 | 76 | 995
Total | 3,491 | 4,029 | 4,201 | 4,939 | 5,607 | 6,114 | 8,018 | 8,729 | 9,128 | 9,350 | 63,606

To consider the importance of the sector to each country, we calculated the number of patents granted by country on a per capita basis. This allowed us to compare countries with different populations.

Figure 1 below shows the number of patents granted by the country of the inventor per 1,000,000 of population. There has been a small increase in UK inventors working on patents for neurotechnology. However, on a per capita basis, the UK’s contribution to this field is small in comparison. This suggests the UK is not attracting the same level of private sector interest as other countries.

Figure 1: Patents granted by country of inventor per million population, 2013 to 2021

Source: ICO analysis of lens.org data.

Although China’s contribution on a per capita basis is very small, in absolute terms it is significant, at 342 patents in 2022. This is the third highest number amongst this cohort.

The following metric captures the number of inventors by country of residence. This may be a better indicator of the quality of human capital available to each country than the patent counts used in metrics 1 and 2.

By this measure, the UK lags behind its competitors, suggesting that comparable countries may have greater human capital in the sector.

Table 4: Inventors of patents around neurotechnology by country, 2013 to 2022

Country | Number of inventors | Population (2021) | Inventors per 100,000 population
United States | 53,794 | 331,894,000 | 16.2
Germany | 4,200 | 83,196,000 | 5.0
Canada | 3,710 | 38,246,000 | 9.7
United Kingdom | 3,393 | 67,327,000 | 5.0
Korea, Republic of | 3,318 | 51,745,000 | 6.4
Japan | 2,980 | 125,682,000 | 2.4
China | 2,743 | 1,412,360,000 | 0.2
Israel | 2,274 | 9,364,000 | 24.3
Switzerland | 1,903 | 8,703,000 | 21.9
Netherlands | 1,827 | 17,345,000 | 10.5

Source: ICO analysis of lens.org data
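The per-capita columns in Table 4 (and, with a denominator of 1,000,000, the rates behind Figure 1) come from a single rate calculation. A minimal sketch of that calculation, which is our illustration rather than the lens.org methodology, checked against the UK and Israel rows of Table 4:

```python
def per_capita(count: int, population: int, per: int = 100_000) -> float:
    """Rate per `per` people, rounded to one decimal place."""
    return round(count / population * per, 1)

# UK: 3,393 inventors across a 2021 population of 67,327,000.
print(per_capita(3_393, 67_327_000))   # 5.0 inventors per 100,000 people
# Israel: 2,274 inventors across a population of 9,364,000.
print(per_capita(2_274, 9_364_000))    # 24.3
```

Passing `per=1_000_000` gives the patents-per-million-population figures used in Figure 1.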


16 RHC (2022)

17 For example see: The Market for Neurotechnology: 2022-2026 - Research and Markets; Global Neurotechnology Market Report 2022: Products will be $8.4 Billion in 2018 and will Reach $13.3 Billion in 2022 - Forecast to 2026 - ResearchAndMarkets.com | Business Wire; Global Neurotechnology Market Report 2022-2026 - New Product Categories in Neurorehabilitation and Neurosensing Such as Brain Analysis Systems and Neurorobotics Systems (prnewswire.com)

18 9685e4fd-en.pdf (oecd-ilibrary.org)

19 ibid, p12

20 KTN (2021): A-transformative-roadmap-for-neurotechnology-in-the-UK.pdf (ktn-uk.org)

21 URKI: https://iuk.ktn-uk.org/programme/neurotechnology-landscape/

22 KTN (2021): A-transformative-roadmap-for-neurotechnology-in-the-UK.pdf (ktn-uk.org)

23 Annual investment as a percentage of 2019 GDP: GDP (current US$) | Data (worldbank.org)

24 KTN (2021): A-transformative-roadmap-for-neurotechnology-in-the-UK.pdf (ktn-uk.org), p12

25 We cross-referenced the following tags against business activities in FactSet to identify companies likely to be involved in neurotech: neurotechnology; neurosensing; neuroimaging; neuromodulation;  neuroprosthetics and neurorehabilitation.

26 We noted that terminology shifts across time and countries, meaning that we were not able to completely survey all relevant patents relating to neurotechnology and the shift in the function of associated terminology.

Annex B: Methodology and response

We issued a closed call for views to identified organisations in August 2022. We drew up a list of over 40 organisations across central government, the private sector, civil society, academia and global regulators. We used desk-based research and internal engagement to identify appropriate consultees and we received responses across all the sectors.

We set up interviews to develop our understanding where responses were particularly informative or raised issues we felt it would help to explore further. These gave us insight into a range of issues, including the priorities of key stakeholders, as well as emerging public and regulatory concerns about the use of neurotechnologies and BCIs.

Stakeholders identified the following areas as key challenges to the effective and appropriate use of neurotechnologies (we may only address some as part of our regulatory remit):

Alongside this engagement, we conducted bibliometric research using tools such as Lens and Google Scholar to identify quantitative data and understand the organisations and trends driving neurotechnology now and in the future, as presented in Annex A.

Key drivers include:

Using the above, we developed initial scenarios and then shared them with an external panel of experts. This external workshop drew upon red teaming methodology to critically examine the scenarios and their assumptions, from the drivers used, to the sectors and technologies focused on. We used these to develop the scenarios presented in this report.


27 For example, some have suggested using a risk-based approach for guidance suggesting that this would allow flexibility regarding purpose. What might be high risk of misidentification under security purposes could differ significantly from inaccurate data for advertising purposes.

Annex C: Legal, regulatory and historical context

Understanding the broader legal context relevant to neurodata is important. As noted, the UK GDPR does not explicitly define neurodata as a particular category of personal information or special category personal data. However, Article 4 does set out that ‘mental identity’ is a core aspect of personal information. No explicit definition is provided of the limits or boundaries of what may constitute mental identity, which remains a complex and culturally embedded philosophical concept.

Beyond the GDPR, Article 3 of the European Charter of Fundamental Rights (the Charter) sets out “the right to respect for his or her physical and mental integrity”.28 However, the Charter has not been incorporated into UK legislation following the UK’s departure from the European Union (EU), so it is best considered for the impact it may have on organisations that are subject to the EU GDPR. The UNESCO International Bioethics Committee argues in its Report on Ethical Issues of Neurotechnology that explicit consent will always be required to write neurodata. However, as discussed below, consent as a basis for processing neurodata (rather than for associated medical procedures) may prove complex at the best of times and inappropriate at others, and a multi-layered approach may be needed to ensure rights are met.

Early approaches to neurodata governance and rights began to emerge in 2018. Proposals were made to amend the Brazilian General Personal Data Protection (GPDP) law, and neurodata-specific sections were included in Spain’s Digital Rights Charter,29 building on an existing right to the free development of personality. In December 2019, the OECD issued a Recommendation on Responsible Use of Neurotechnology.30 While not legally binding, it offers a beginning to what many in the industry seek: an international standard for research, innovation and deployment in and around neurotechnologies. Key recommendations are phrased as broad principles highlighting priorities for inclusivity, responsible innovation, building trust and safeguarding data.

In December 2020, the first explicit piece of legislation directly about neurodata passed in the amendment to Article 19 of the Chilean Constitution. Following consultation with international experts, the explicit right to neuroprotection was signed into law, preceding the development of a neuroprotection bill. This bill sets out five essential rights,31 including the right to personal integrity, free will, mental privacy and fair and equal access to technologies that can enhance or alter neurological states. As of December 2021, it is under consideration by the Chilean House of Representatives. Please note that the bill refers solely to medical uses of neurotechnology. It also forbids uses in situations where vulnerable communities are placed at risk or when a person’s behaviour may be altered without explicit consent.

The reception of the bill has been varied. Some groups have welcomed both the legislation and the constitutional reform, while others have argued that the new law undermines existing rights and opportunities for those it seeks to protect and could inhibit innovative research into neurological conditions.32 We consider the details of these rights and their intersection with data protection below. But this brief review highlights the broader lack of direct global regulation of non-medical uses of neurodata at this time.

Another parallel area of legislation that may have a significant impact on the development and deployment of neurotechnologies is the EU’s AI Act.33 The proposed legislation requires that algorithmic processing be evaluated on a risk-based approach, and its explanatory text (section 5.2.2) states that practices posing an ‘unacceptable risk’ cover:

“… practices that have a significant potential to manipulate persons through subliminal techniques beyond their consciousness or exploit vulnerabilities of specific vulnerable groups such as children or persons with disabilities in order to materially distort their behaviour in a manner that is likely to cause them or another person psychological or physical harm.”

This focus on subliminal aspects of processing may intersect with commercial neurotechnologies in non-medical sectors. Any organisation seeking to process personal information will need to consider the thresholds set out and the additional expectations that may exist within and alongside EU GDPR requirements.

A further indication of growing recognition is the increasing number of key publications considering the legal, ethical and practical implications of emerging neurotechnologies on a global scale. In the last five years alone, these include reviews across America, Japan and Europe covering ethical, economic and scientific analyses and high-level principles of use.34 However, it is notable that the majority of these focus on scientific and medical communities, and that only two reports, from the Future of Privacy Forum (FPF) and the Regulatory Horizons Council (RHC), have given privacy and personal information significant consideration.35

Neurorights

There is little current legislation directly about gathering and using neurodata. However, increasing calls to consider specific (and fundamental) neurorights have emerged, as reflected in Chilean legislation.36 The Neurorights Foundation in particular has advocated for the creation of the following five rights, which raise some specific connections to existing rights and obligations under the UK GDPR:37

This right calls for significant restrictions on the collection, storage and commercial use of neurodata, as well as an absolute right to request its deletion. There is therefore potential conflict with the UK GDPR, which does not set out an absolute right to deletion and seeks an appropriate purposive basis for processing rather than an immediate restriction on the use of any one category of data.

In terms of data protection, this right raises issues about meaningful consent, should a neurodata-focused algorithm have the means to alter our very thoughts. It also links to a growing need for clarification about the regulatory interpretation of key terms, such as ‘mental identity’, under Article 4 of the UK GDPR. Beyond the horizon and regulatory scope of this report, there are questions about ownership of and responsibility for our actions, and how the impact of neurotechnologies will compare to existing cognitive treatments.

This right would link to the already established expectation of fairness of processing under the UK GDPR, as well as other legislation such as the Equality Act 2010.

The specific intersection of human rights and mental rights continues to be debated across the neurotechnology community. There appears to be an emerging call by some for a clearly defined and wide-ranging piece of legislation to formally establish how they fit together. This is driven by the argument that neurodata is key to a sense of self and that the integrity and privacy of neuroprocesses should be fundamentally maintained. Others, however, argue that while the issues raised are critical, further risks are posed by ‘rights inflation’: excessive legislation that ignores the existing powers already available to regulatory and legislative regimes.40

While more proposals emerge, two organisations have each independently concluded that, instead of creating novel human rights, existing human rights law and regulations should be tailored to the specific contexts of neurotechnology.41 These were the Council of Europe, an organisation whose mandate is to uphold human rights law, and UNESCO’s International Bioethics Committee, whose mandate is to ensure respect for human dignity and freedom. In October 2022, the UN Human Rights Council unanimously adopted a proposal to look more deeply into how existing human rights law might address potential infringements of human rights related to neurotechnology; a full report will be available in autumn 2023.42

Many neurotechnologies are expected to be developed and deployed at the intersection of multiple complex and emerging technologies. This may include the use of biometric technologies, such as pupillometry, to support complex inferences. Devices may also be designed for use within digital environments such as the metaverse, or to link to additional next-generation internet of things (IoT) devices. These will raise additional privacy and data protection concerns not explicitly explored in this report, but our recent Tech Horizons Report does cover many of the additional issues and considerations that may arise in this area.

History

The modern history of neurotechnology can be traced as far back as the 1780s, when Italian physicist Luigi Galvani demonstrated that muscle contractions and nerve conduction are electrically based. By 1875, Dr Richard Caton had used a galvanometer to identify electrical impulses from the surfaces of living brains in monkeys. By the early twentieth century, the physiologist Vladimir Pravdich-Neminsky had published the first animal electroencephalography (EEG), with Hans Berger recording the first human EEG in 1924. In 1957, the first direct stimulation of a human auditory system was conducted by André Djourno and Charles Eyriès, with William House implanting the first neuroprosthetic, a cochlear implant, in 1969.

By 1973, the term ‘brain-computer interface’ (BCI) had been coined by Professor Jacques Vidal as part of his research analysing EEG signals, defining a device that allows brain patterns to control a computer. By 1988, these EEG signals could control a robot. Less than 10 years later, deep brain stimulation (DBS) was approved in the US for the treatment of Parkinson’s disease. The twenty-first century has seen even more rapid progress, with Matt Nagle becoming the first person to use a BCI to control a robotic limb in 2005, and retinal implant systems being developed in 2013. In the past decade, new means of analysing brain functions have also emerged, such as lab-based optogenetics, in which light is used to control modified neurons rather than studying blood oxygenation or electrical impulses. Further miniaturisation of previously large-scale devices, along with greater use of batteries and wireless technologies, is allowing greater mobility and accessibility of neurodata as we move towards what has been termed ‘pervasive neurotechnology’.43 By 2019, a lab-based invasive BCI could decode neurodata into text with a vocabulary of up to 300 words and an error rate of just 3%.44


28 UNESCO International Bioethics Committee, Report on Ethical Issues of Neurotechnology, and Charter of Fundamental Rights of the European Union (europa.eu)

29 GPDP Art 2 (VII)

30 OECD, Recommendation on Responsible Innovation in Neurotechnology (OECD-LEGAL-0457)

31 Mind the Gap: Lessons Learned from Neurorights | Science & Diplomacy (sciencediplomacy.org)

32 Novel Neurorights: From Nonsense to Substance - PMC (nih.gov)

33 The Artificial Intelligence Act

34 SSRN-id4035992.pdf

35 https://fpf.org/wp-content/uploads/2021/11/FPF-BCI-Report-Final.pdf and Neurotechnology regulation (publishing.service.gov.uk)

36 Natural and Artificial Neural Networks: The Chilean Legal Framework | Carlos Amunategui Perello (Academia.edu)

37 Mission — The Neurorights Foundation

38 https://rm.coe.int/report-final-en/1680a429f3

39 Hallucinations implanted in mouse brains using light (nature.com)

40 Novel Neurorights: From Nonsense to Substance - PMC (nih.gov)

41 "Neurotechnologies and Human Rights: Do we need new rights? " The report of the Council of Europe and OECD round table is published - Human Rights and Biomedicine (coe.int)

42 A/HRC/51/L.3 Vote Item 3 - 40th Meeting, 51st Regular Session Human Rights Council | UN Web TV

43 Towards new human rights in the age of neuroscience and neurotechnology | Life Sciences, Society and Policy | Full Text (biomedcentral.com)

44 High-performance brain-to-text communication via handwriting | Nature

Annex D: Key technologies

Much of the focus of this report is on emerging technologies and potential future uses. However, it is worth considering the current state of the art to better understand what is meant by some of the definitions and which technologies are likely to be discussed in the scenarios.45 For example, non-invasive (non-implanted) read technologies may include the following:

Invasive (implanted) neurotechnologies that can both read and stimulate may include the following:


45 This is a sample of available technologies; many others exist in both research and treatment settings but are not dealt with here due to the lack of novel privacy implications. For a wider survey see:

46 iHuman Neural Interfaces Perspective (royalsociety.org)