
Issue 1: Regulatory definitions

Personally identifiable neurodata is always considered to be personal information irrespective of purpose. However, there is no explicit definition of neurodata as either a specific form of personal information or special category data under the UK GDPR. Therefore, organisations need to carefully consider both:

  • when and why neurodata may be special category data; and
  • what risks large scale classificatory uses of even non-special category personal data may pose.

The key challenges include:

Medical neurodata

When neurodata is collected and processed for medical purposes, for example, it is likely to be special category health data (SCD) under Article 9(1) of the UK GDPR. It therefore requires a lawful basis for processing under Article 6 and satisfaction of a condition for processing special category data under Article 9. Organisations must identify an appropriate basis for processing; in many cases, consent may be the appropriate lawful basis and special category condition.

Some groups, such as the private enterprise Neurorights Foundation, have recommended that explicit consent be obtained before neurodata is processed in every case.14 We should handle such calls carefully; while medical consent remains a distinct and important issue, explicit consent for data processing is only one of a variety of appropriate special category conditions under the UK GDPR. It is not inherently ‘better’ than other conditions; organisations should consider carefully what is most appropriate.

Any wider automatic reliance on consent for using personal information for consumer purposes could also cause confusion and may well be inappropriate under the UK GDPR. The wider dialogue and calls for the use of consent may lead people to assume they automatically have the right to withdraw consent to organisations using their information. In fact, organisations may use other appropriate bases for processing, and it is for them to be transparent about which basis they have used and what rights are applicable. Instead of always focusing on consent, transparency of processing may prove more effective in helping people understand how organisations are using their information.

Personal, but rarely special category biometric data

In rare cases, organisations may use neurodata directly to identify or verify a natural person. In these cases, it is special category biometric data that also falls under Article 9(1) of the UK GDPR. However, while technically feasible, it is likely that most uses will be classificatory, as explored in the scenarios. This is due to the expense and complexity of identifying people in this way compared to other robust biometric methodologies. Where the information may allow organisations to identify people, neurodata may also be biometric data under Article 4(14) of the UK GDPR. It is therefore personal information but not special category data. (Special category biometric data requires organisations to process personal information for the purpose of unique identification.) Organisations processing personal information need to consider when and how the information they are using may allow a person to be identified and what the likely impact may be.

Classificatory neurodata

By contrast, organisations may make extensive use of some large scale neurodata to which the additional safeguards for processing special category data do not apply.

For example, many of the above scenarios discuss classifying people emotionally and behaviourally, for purposes including employment, wellbeing, or entertainment. There is therefore a risk of further profiling or even de-pseudonymisation. This is due to the complexity of the information gathered and the increased ease with which information can be associated with a person. Organisations could purposefully link information to a person post-identification or verification in order to realise its maximum benefit.

In these cases, organisations may hold information that does not meet the Article 9 UK GDPR definition of special category data, but that could still cause substantial harm if misused (in particular, loss of autonomy, discrimination, chilling effects and personal distress).15 Large scale processing of such information is likely to pose a challenge to encouraging best practice. This highlights the need to consider neurodata as high impact and high risk, even when used in contexts that do not explicitly count as special category data. Finally, organisations also need to remain alert to whether personal information may become special category data. For example, tracking employee information, such as concentration levels, could reveal mental health data.

High risk neurodata

There are robust protections in place for processing all personal information under the UK GDPR. For organisations processing neurodata, it is important to be clear about when neurodata is considered data about health for the purposes of Article 9 of the UK GDPR. This is an issue we have explored further in our recent Technology Horizons Report. Organisations should not assume that neurodata is automatically health data simply because it derives from a person’s physiology. Organisations must also be clear about when complex processing involves processing biometric data and the situations when biometric data is special category data.

Issue 2: Neurodiscrimination

Processing neurodata is particularly novel and poses a significant risk because of the intimate nature of the personal information it could reveal. Neurotechnology can collect information that people themselves are not aware of. Neurodata can include estimations of emotional states, workplace or educational effectiveness and engagement, and medical information about mental health, among many other types of data. Collecting and further processing these categories of personal information can pose a significant risk to people’s data protection rights.

Neurotechnologies pose a particular risk if they analyse emotion or complex behaviour (rather than, for example, levels of concentration or indications of a neurodegenerative pathology). The science underpinning the analysis of human emotion is highly debated (as we have explored in our Biometrics Foresight report). Many stakeholders and scholars have significant concerns about the ability of algorithms to accurately detect emotional cues. The process of drawing such complex inferences from sets of quantitative human brain data is expected to remain enormously challenging.

As organisations derive and analyse increasingly large data sets, new forms of discrimination may emerge that have not previously been recognised under associated legislation, such as the Equality Act 2010. Without robust and independent verification of these models, there is a risk that these approaches will be rooted in systemic bias and provide inaccurate and discriminatory information about people and communities. In many instances, this information may then feed into automated systems, raising further questions over Article 22 processing and transparency (which sets out rights related to automated processing and profiling discussed above). In particular, neurodivergent people may be at risk of discrimination from inaccurate systems and databases that have been trained on neuro-normative patterns.
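To illustrate what even a basic element of such verification could involve, the short sketch below (in Python, using entirely hypothetical group labels and placeholder predictions) compares a classification model’s accuracy across groups in a labelled validation set. It is a minimal sketch of one possible check, not a complete fairness audit or an endorsed methodology.

```python
from collections import defaultdict

def accuracy_by_group(records):
    """records: iterable of (group_label, true_label, predicted_label) tuples."""
    totals = defaultdict(int)
    correct = defaultdict(int)
    for group, truth, predicted in records:
        totals[group] += 1
        correct[group] += int(truth == predicted)
    return {group: correct[group] / totals[group] for group in totals}

# Placeholder validation results; the group labels and values are illustrative only.
validation = [
    ("group_a", "focused", "focused"),
    ("group_a", "distracted", "distracted"),
    ("group_b", "focused", "distracted"),
    ("group_b", "distracted", "distracted"),
]
print(accuracy_by_group(validation))
# A marked gap in accuracy between groups would be one signal of systemic bias.
```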

Alternatively, active, rather than systemic, discrimination may emerge. Organisations may view specific neuropatterns and information as undesirable if these are not considered protected characteristics under current legislation, such as the Equality Act 2010. People may experience unfair treatment in the work or services they are offered on the basis of their perceived emotional states, or even of previously unrecognised or undiagnosed physical or mental conditions.

Discrimination may also occur through devices themselves, not just through organisations collecting and using people’s personal data (described above). Experts have noted that risks can emerge when devices are not trialled and assessed on a wide variety of people to ensure that data collection remains accurate and reliable. This may be as simple as ensuring that devices sit appropriately and comfortably in order to gather accurate and appropriate information. If this does not take place, there is an increased risk that data sets become biased and incomplete due to device calibration issues.

As noted above, in non-medical contexts neurodata may not be classified as special category data. This reduces the legal safeguards and restrictions around its processing. It may also result in organisations failing to implement best practice, for example around technical security, to ensure that neurodata remains safe from loss or theft. This risk, arising from the classificatory nature of the information, is also discussed above.

Issue 3: Consent, neurodata and appropriate bases of processing

Are there any circumstances in which a person can provide fully informed consent to organisations to use their personal information when they are not aware of the exact nature of this information? This is the fundamental question when considering whether organisations can obtain valid consent for processing neurodata. When using neurodata that does not meet the threshold for special category data, organisations must still identify a lawful basis for processing personal data under Article 6 of the UK GDPR. Potentially relevant bases organisations should consider for commercial purposes are consent, legitimate interests and performance of a contract.

For example, if a person is using an EEG headset to improve their online gaming performance, can they truly be aware of and understand the precise nature of the information that they are likely to reveal? Can the organisation also know this? Further heightening the risks of using consent is the fact that many people are unlikely to possess the technical knowledge about collecting and using neurodata to fully understand the information flows. However, organisations may consider whether they can provide specific guarantees about the inferences that they intend to draw from the information they gather in order to obtain valid consent. Where organisations rely on this, they should review our guidance on the use of consent as a basis for processing.

Even within scenarios about employment, organisations must demonstrate a clear need for using neurodata over other techniques for gathering the information. Given the power imbalance between employer and employee, it is likely that consent is not the appropriate basis for processing in most cases. 

When consent for processing is inappropriate, organisations also need to consider whether legitimate interests or performance of a contract may be appropriate. This may prove particularly important for entertainment or wellbeing processing. However, it may be difficult to pass the three-part test for legitimate interests in such cases. This is because of the high-risk and intimate nature of the information derived by devices, as well as the difficulty of setting out clear expectations for people about what information they may provide.

As noted above, we already provide guidance on the bases for processing under the UK GDPR, which any organisation planning to process neurodata should review.

Issue 4: Closed-loop processing poses heightened risks around shifts in purpose and in automated processing

Expert stakeholders have raised concerns with us that closed-loop processing will become increasingly prevalent across emerging neurotechnology devices. These devices will use automated algorithmic processing that assesses personal information in the form of electrical patterns from the brain. They will take automated action unprompted by the user and without significant human intervention. Closed-loop processing is being explored to enhance the clinical function of neurotechnologies, particularly implantable devices. Closed-loop neurotechnology, which often uses AI or machine learning (ML), can heighten the risk of inappropriate automated processing. Under Article 22 of the UK GDPR, people “have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her”, unless the appropriate conditions for processing are met under Article 22(2). While one of the conditions is explicit consent, as noted above, this is not without its own challenges.

Our guidance about automated decision-making and profiling sets out that a decision that has a ‘similarly significant’ effect is something that has an equivalent impact on a person’s circumstances, behaviour or choices. As explored through the scenarios above, neurotechnologies and their associated processing may have a significant impact on people’s behaviour (eg by affecting concentration, productivity or sleep). Where appropriate conditions for the solely automated processing of information exist, this presents a significant challenge. Meaningful human intervention under the UK GDPR must be able to challenge and, if necessary, reverse decisions. This may not be possible with neurostimulation or brain-to-speech outputs, for example. Organisations must consider what appropriate intervention may look like for neurodata and neurotechnologies.

For example, device parameters may have been set in advance (and altered later) by users to define how their information is processed, and may be reviewed at intervals by the organisation for quality and research purposes. However, this is unlikely to meet the requirements for meaningful intervention under Article 22, because it occurs before the processing rather than after it. Organisations also need to consider the role of the person in the data flows; inputting data or parameters alone is still likely to lead to solely automated processing.
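As a purely illustrative sketch of this distinction, the Python below uses hypothetical functions (read_eeg_window, classify_state, apply_stimulation) to show the difference between parameters a user sets before processing and a review hook placed after each automated decision. It is a minimal sketch only, not a statement of what would satisfy Article 22 in any particular case.

```python
from dataclasses import dataclass

@dataclass
class UserParameters:
    # Set by the user in advance; on its own this is unlikely to amount to
    # meaningful intervention, because it happens before the processing.
    stimulation_threshold: float = 0.7

def read_eeg_window():
    """Hypothetical stand-in for reading a window of neurodata from a device."""
    return [0.6, 0.8, 0.9]  # placeholder signal values

def classify_state(window):
    """Hypothetical automated model output: a score between 0 and 1."""
    return sum(window) / len(window)

def apply_stimulation(score):
    """Hypothetical automated action taken by the device."""
    print(f"Stimulation applied at intensity {score:.2f}")

def closed_loop_step(params, reviewer=None):
    score = classify_state(read_eeg_window())
    decision = score >= params.stimulation_threshold
    # A review hook placed here, after the automated decision, is where human
    # involvement could challenge or reverse the outcome before action is taken.
    if reviewer is not None:
        decision = reviewer(score, decision)
    if decision:
        apply_stimulation(score)

# Without a reviewer, the loop is solely automated despite the user-set parameters.
closed_loop_step(UserParameters())
```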

In other uses, such as wellbeing, employment or entertainment, organisations may need to implement appropriate human involvement as an alternative to solely automated processing.

In addition, there is potential for processing for neurostimulation or neuromodulation to fundamentally alter a person’s capacity to evaluate their personal information and make decisions about it. Perhaps more broadly relevant is the fact that many people may feel they lack the expertise to understand and make decisions about how to interact with a complicated system, especially when the device is the only or best treatment available to them.

Finally, the complexity of closed-loop processing may affect both the transparency and accuracy of personal information. Organisations using solely automated decision-making should ensure that they do not breach Article 22. Even where meaningful human intervention is present in a system, organisations should consider our AI guidance. This explains that sufficiently complex algorithmic processing may be considered solely automated processing due to its complexity and lack of transparency for the device’s users.

Issue 5: Accuracy and data minimisation surrounding neurodata

Gathering and using neurodata could challenge organisations’ ability to comply with the accuracy requirements under the UK GDPR. Reduced accuracy may result from:

  • decisions being based on limited information, due to an organisation’s desire to minimise the data it uses for cost or regulatory reasons; or
  • people and organisations using third party services that may treat historic neurodata as current and base decisions on it.

Under Article 5(1)(d) of the UK GDPR, personal data must be “accurate and, where necessary, kept up to date; every reasonable step must be taken to ensure that personal data that are inaccurate, having regard to the purposes for which they are processed, are erased or rectified without delay (‘accuracy’)”.

This raises an important question for neurodata and neuroplasticity: how long does neurodata remain accurate? Some information is permanent or intrinsic, such as a date of birth, genetic information or certain hard biometric data. However, neurodata is in flux from one moment to the next. Neurodata may become more detailed and accurate with advancements in recording and sensing capabilities. If combined with other types of biometric data, it may also reveal new insights that are currently not possible. Organisations should consult our AI guidance when considering thresholds of accuracy compared to the impact of the inferences they are drawing.

Because of this, organisations using neurodata need to ensure that they do not base decisions on singular instances or snapshots of neurodata. This is because many, if not most, techniques for interpreting neurodata rely on significant quantities of comparative data gathered over time to achieve accuracy.

While organisations should gather sufficient information for processing purposes, they need to make it clear that they take decisions at a specific point in time. For example, any decisions or outputs that organisations reach at a particular time may have been accurate at that stage, but they may not be accurate at a later date because of the brain’s neuroplasticity.
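As a simple illustration of this point, the hypothetical sketch below attaches to each output the time window of neurodata it was derived from and the moment the decision was taken, so that anyone relying on the output later can see that it reflects a specific point in time rather than a permanent fact about the person. All names and values are illustrative only.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class TimestampedInference:
    value: float            # the inference itself, eg an estimated concentration score
    window_start: datetime  # start of the span of neurodata it was derived from
    window_end: datetime    # end of that span
    decided_at: datetime    # when the decision was taken; it should not be treated as current later

def infer_concentration(samples):
    """Hypothetical model: averages a window of samples into a single score."""
    return sum(samples) / len(samples)

def make_inference(samples, window_start, window_end):
    # Base the output on a window of comparative data, not a single snapshot,
    # and record exactly when and over what period it was produced.
    return TimestampedInference(
        value=infer_concentration(samples),
        window_start=window_start,
        window_end=window_end,
        decided_at=datetime.now(timezone.utc),
    )

# Example usage with placeholder values.
start = datetime(2024, 1, 1, tzinfo=timezone.utc)
end = datetime(2024, 1, 8, tzinfo=timezone.utc)
print(make_inference([0.4, 0.6, 0.5], start, end))
```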

There may be an appropriate reason to retain this information, particularly concerning health data and medical treatment. But this connects to the requirements for data minimisation. Organisations should retain only as much information as they require to provide accurate and fair outputs, balancing the retention of information needed for accuracy and fairness against the risk of holding excessive information.
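One way of sketching this balance, under the purely hypothetical assumption that an organisation keeps recent raw neurodata but replaces older samples with a coarser aggregate, is shown below. The retention window, names and structures are illustrative only, not a recommended approach or period.

```python
from datetime import datetime, timedelta, timezone

RETENTION_WINDOW = timedelta(days=90)  # illustrative cut-off, not a recommendation

def prune_raw_data(samples, now=None):
    """samples: list of (timestamp, value) pairs of raw neurodata.

    Keeps recent raw samples and collapses older ones into a single aggregate,
    so enough context remains for accuracy without retaining excessive detail.
    """
    now = now or datetime.now(timezone.utc)
    recent = [(ts, v) for ts, v in samples if now - ts <= RETENTION_WINDOW]
    older = [v for ts, v in samples if now - ts > RETENTION_WINDOW]
    aggregate = sum(older) / len(older) if older else None
    return recent, aggregate

# Example usage with placeholder values.
now = datetime.now(timezone.utc)
history = [(now - timedelta(days=120), 0.3), (now - timedelta(days=5), 0.6)]
print(prune_raw_data(history, now))
```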

Issue 6: Neurodata and research

Stakeholders have informed us that, because the field is still developing, neuroscience research requires longitudinal information from additional sources, especially from outside the laboratory. Medical researchers in particular are eager to gain access to information from commercial devices in order to better understand neurodegenerative conditions and especially mental health. This presents potentially complex data flows that could make it challenging for organisations to provide transparency information.

Organisations looking to share their information for this purpose should read our guidance on the research provisions under the UK GDPR.

Issue 7: Information rights (including rights to be forgotten, portability and access)

Emerging neurotechnologies may create new challenges to people exercising their information law rights. This is something that any organisation that processes personal information using these technologies must be aware of and responsive to. The following examples highlight some of the issues linked to data rights under the UK GDPR:

  • The right of access: People are entitled to request all personal information held about them by organisations (subject to certain exemptions). Organisations should provide this information as long as it is identifiable, even if it is complex and may be difficult to interpret, such as raw neurodata.
  • A right to correction: It is likely to be increasingly difficult for people to understand when personal information held about them is inaccurate or when organisations have made inaccurate inferences about them. This is because neurodata is highly complex and is likely to require significant technical knowledge to interpret, combined with the challenges of ‘black box’ style algorithmic processing. Organisations should follow our guidance on the lawfulness, fairness and transparency principle to ensure they meet expectations.
  • A right to portability: There is a significant risk that multiple commercial standards of interpreted neurodata may emerge, designed to be used within specific commercial ecosystems and devices. This could make it harder for people to move and use their information across the systems they choose. Organisations must ensure that neurodata can be taken and transferred to another service by a person, where appropriate under the UK GDPR.
  • A right to erasure: If and when a person asks an organisation to delete their information, it raises questions about how this will impact algorithmic processing when aggregated data is altered. This could impact accuracy, if the personal information can be recognised and removed. While not an issue unique to neurodata, the intimate nature of the information does heighten this risk.

14 They propose to demand a 'Hippocratic oath' from technologists (lavanguardia.com)

15 regulatory-policy-methodology-framework-version-1-20210505.pdf (ico.org.uk)