Understanding the broader legal context relevant to neurodata is important. As noted, the UK GDPR does not explicitly define neurodata as a particular category of personal information or as special category personal data. However, Article 4 does set out that ‘mental identity’ is a core aspect of personal information. It provides no explicit definition of the limits or boundaries of what may constitute mental identity, a concept that sits amongst complex and culturally embedded philosophical debates.
Beyond the GDPR, Article 3 of the European Charter of Fundamental Rights (the Charter) sets out “the right to respect for his or her physical and mental integrity”.28 However, the Charter has not been incorporated into UK legislation following the UK’s departure from the European Union (EU), so it is best considered for the impact it may have on organisations that are subject to the EU GDPR. The UNESCO International Bioethics Committee argues in its Report on Ethical Issues of Neurotechnology that explicit consent will always be required to write neurodata. However, as discussed below, consent as a basis for processing neurodata (rather than for associated medical procedures) may prove to be complex at the best of times and inappropriate at others, and in need of a multi-layered approach to ensure rights are met.
Early approaches to neurodata governance and rights began to emerge in 2018. Proposals were made to amend the Brazilian General Personal Data Protection (GPDP) law, and neurodata-specific sections were included in Spain’s Digital Rights Charter.29 The latter built on an existing right to the free development of personality. In December 2019, the OECD issued a Recommendation on Responsible Use of Neurotechnology.30 While not legally binding, it offers a beginning to what many in the industry seek: an international standard for research, innovation and deployment in and around neurotechnologies. Key recommendations are phrased as broad principles highlighting priorities for inclusivity, responsible innovation, building trust and safeguarding data.
In December 2020, the first explicit piece of legislation directly about neurodata was passed as an amendment to Article 19 of the Chilean Constitution. Following consultation with international experts, the explicit right to neuroprotection was signed into law, preceding the development of a neuroprotection bill. This bill sets out five essential rights,31 including the right to personal integrity, free will, mental privacy and fair and equal access to technologies that can enhance or alter neurological states. As of December 2021, it is under consideration by the Chilean House of Representatives. Notably, the bill refers solely to medical uses of neurotechnology. It also forbids uses in situations where vulnerable communities are placed at risk or where a person’s behaviour may be altered without explicit consent.
The reception of the bill has been varied. Some groups have welcomed both the legislation and the constitutional reform, while others have argued that the new law undermines existing rights and opportunities for those it seeks to protect and could inhibit innovative research into neurological conditions.32 We consider the details of these rights and their intersection with data protection below. But this brief review highlights the broader lack of direct global regulation of non-medical uses of neurodata at this time.
Another parallel area of legislation that may have a significant impact on the development and deployment of neurotechnologies is the EU’s AI Act.33 Requiring that algorithmic processing be evaluated under a risk-based approach, section 5.2.2 of the proposed legislation states that practices defined as posing an ‘unacceptable risk’ cover:
“… practices that have a significant potential to manipulate persons through subliminal techniques beyond their consciousness or exploit vulnerabilities of specific vulnerable groups such as children or persons with disabilities in order to materially distort their behaviour in a manner that is likely to cause them or another person psychological or physical harm.”
This focus on subliminal aspects of processing may intersect with commercial neurotechnologies in non-medical sectors. Any organisation seeking to process personal information will need to consider the thresholds set out and the additional expectations that may exist within and alongside EU GDPR requirements.
A further indication of growing recognition is the increasing number of key publications considering the legal, ethical and practical applications of emerging neurotechnologies on a global scale. In the last five years alone, these include reviews across America, Japan and Europe covering ethical, economic and scientific analyses and high-level principles of use.34 However, it is notable that the majority of these focus on scientific and medical communities, and that only in two reports, from the Future of Privacy Forum (FPF) and the RHC, have privacy and personal information been given significant consideration.35
There is little current legislation directly about gathering and using neurodata. However, increasing calls to recognise specific (and fundamental) neurorights have emerged, as reflected in Chilean legislation.36 The Neurorights Foundation in particular has advocated the creation of the following five rights, which raise some specific connections to existing rights and obligations under the UK GDPR:37
- The right to mental privacy. As noted above, generating and processing neural data relates to subconscious patterns, beliefs and responses. This potentially includes information that we are either unaware of or may never choose to voluntarily disclose. However, because the information is gathered directly from neural patterns, this choice is removed. This presents two critical issues. Firstly, people may unwillingly and reflexively reveal highly sensitive information. Secondly, they may reveal inaccurate information, leading to complexities around their rights of correction, as well as the social, societal and psychological impacts of a ‘slip of the mind’. Revealing semantic information (such as a specific memory response to a scent, for instance) is not yet possible. However, the accuracy of invasive techniques and technologies can already provide significant insight.38
This right calls for significant restrictions on the collection, storage and commercial use of neurodata, as well as an absolute right to request its deletion. There is therefore potential conflict with the UK GDPR, which does not set out an absolute right to deletion and seeks an appropriate purposive basis for processing rather than an immediate restriction on the use of any one category of data.
- The right to mental integrity (also defined as the right to psychological continuity). While at an early stage of development, BCIs are currently able to modulate neural patterns and affect processes, such as concentration and multi-tasking. Longer-term development may lead to the ability to impact mental states in a precise and focused manner. Laboratory tests have already demonstrated the ability to implant hallucinations within the brains of mice, eliciting responses to these images. This raises long term questions about the read-write nature of neurotechnology.39
In terms of data protection, this right raises issues about meaningful consent: should a neurodata-focused algorithm have the means to alter our very thoughts? It also links to a growing need for clarification of the regulatory interpretation of key terms, such as ‘mental identity’ under Article 4 of the UK GDPR. Beyond the horizon and regulatory scope of this report, there are questions about ownership of, and responsibility for, our actions, and how the impact of neurotechnologies will compare to existing cognitive treatments.
- The right to freedom from bias. This occurs when systemic bias arises from the use of algorithms to analyse neurodata, or from future research which may identify particular patterns of thinking, mental health states or behaviour that could further generate means to discriminate against neurodivergent people.
This right would link to the already established expectation of fairness of processing under the UK GDPR, as well as other legislation such as the Equality Act 2010.
- The right to fair access to neuroaugmentation for all. While raising significant ethical issues, this right does not fundamentally apply to personal information at this level.
- The right to cognitive liberty (literal freedom of thought rather than the manifestation of thought as currently expressed by European and UN human rights legislation). This right may intersect with requirements under the UK GDPR, such as the need for transparency.
The specific intersection of human rights and mental rights continues to be debated across the neurotechnology community. There appears to be an emerging call by some for a clearly defined and wide-ranging piece of legislation to formally establish how they fit together. This is driven by the argument that neurodata is key to a sense of self and that the integrity and privacy of neuroprocesses should be fundamentally maintained. Others, however, argue that while the issues raised are critical, further risks are posed by ‘rights inflation’: excessive legislation that ignores the existing powers already available to regulatory and legislative regimes.40
While more proposals emerge, two organisations have each independently concluded that, instead of novel human rights, existing human rights law and regulations should be tailored to the specific contexts of neurotechnology.41 These were the Council of Europe, an organisation whose mandate is to uphold human rights law, and UNESCO’s International Bioethics Committee, whose mandate is to ensure respect for human dignity and freedom. In October 2022, the UN Human Rights Council unanimously adopted a proposal to look more deeply into how existing human rights law might address the potential infringements to human rights related to neurotechnology, with a full report to be available in autumn 2023.42
Many neurotechnologies are expected to be developed and deployed at the intersection of multiple complex and emerging technologies. This may include the use of biometric technologies, such as pupillometry, to support complex inferences. Devices may also be designed for use within digital environments such as the metaverse, or to link to additional next-generation internet of things (IoT) devices. These will raise additional privacy and data protection concerns not explicitly explored in this report, but our recent Tech Horizons Report does cover many of the additional issues and considerations that may arise in this area.
The modern history of neurotechnology can be traced as far back as the 1780s when Italian physicist Luigi Galvani demonstrated that muscle contractions and nerve conduction are electrically based. By 1875, Dr Richard Caton used a galvanometer to identify electrical impulses from the surfaces of living brains in monkeys. By the early twentieth century, the physiologist Vladimir Pravdich-Neminsky had published the first animal electroencephalography (EEG), with Hans Berger recording the first human EEG in 1924. In 1957, the first direct stimulation of a human auditory system was conducted by Charles Eyries and Andre Djourno, with William House implanting the first neuroprosthetic, a cochlear implant, in 1969.
By 1973, the term ‘brain-computer interface’ had been coined by Professor Jacques Vidal as part of his research to analyse EEG signals. This defined a device that allowed brain patterns to control a computer. By 1988, these EEG signals could control a robot. Less than 10 years later, deep brain stimulation (DBS) was approved in the US for the treatment of Parkinson’s disease. The twenty-first century has presented even more rapid progress, with Matt Nagle becoming the first person to use a BCI to control a robotic limb in 2005 and retinal implant systems being developed in 2013. In the past decade new means of analysing brain functions have also emerged, such as lab-based optogenetics: rather than studying blood oxygenation or electrical impulses, light is used to control modified neurons. Further miniaturisation of previously large-scale devices, with greater use of batteries and wireless technologies, is allowing greater mobility and accessibility of neurodata as we move towards what has been termed ‘pervasive neurotechnology’.43 By 2019, a lab-based invasive BCI could decode neurodata into text with a vocabulary of up to 300 words and an error rate of just 3%.44
28 UNESCO International Bioethics Committee, Report on Ethical Issues of Neurotechnology; Charter of Fundamental Rights of the European Union (europa.eu)