The ICO exists to empower you through information.

  • Neurotech poses a major risk of bias if it is not developed and used correctly
  • Neurodivergent people are particularly at risk of discrimination
  • ICO is set to develop guidance for developers of neurotech

The Information Commissioner’s Office (ICO) is warning that newly emerging neurotechnologies risk discriminating against groups of people if those groups are not put at the heart of their development.

The regulator predicts that the use of technology to monitor neurodata, the information coming directly from the brain and nervous system, will become widespread over the next decade.

Neurotech is already used in the healthcare sector, where there are strict regulations. It can predict, diagnose, and treat complex physical and mental illnesses, transforming a person’s responses to illnesses such as dementia and Parkinson’s disease. In May, Gert-Jan Oskam, a 40-year-old Dutch man who was paralysed in a cycling accident 12 years ago, was able to walk again thanks to electronic implants in his brain.

But neurotechnologies are rapidly developing for use in the personal wellbeing, sports and marketing sectors, and even for monitoring people in the workplace. If these technologies are not developed and tested on a wide enough range of people, there is a risk that inherent bias and inaccurate data become embedded in them, negatively affecting people and communities in the UK.

“To many, the idea of neurotechnology conjures up images of science fiction films, but this technology is real and it is developing rapidly.

“Neurotechnology collects intimate personal information that people are often not aware of, including emotions and complex behaviour. The consequences could be dire if these technologies are developed or deployed inappropriately.

“We want to see everyone in society benefit from this technology. It’s important for organisations to act now to avoid the real danger of discrimination.”

- Stephen Almond, Executive Director of Regulatory Risk

Discrimination could occur where neurotechnology models are developed with embedded bias, leading to inaccurate data and assumptions about people and communities.

The risk of inaccurate data emerges when devices are not trialled and assessed on a wide enough variety of people to ensure that data collection is accurate and reliable.

Neurodivergent people may be particularly at risk of discrimination from inaccurate systems and databases that have been trained on neuronormative patterns.

The use of neurotech in the workplace could also lead to unfair treatment. For example, if specific neuropatterns or information come to be seen as undesirable due to ingrained bias, people exhibiting those patterns may be overlooked for promotions or employment opportunities.

The ICO is developing specific neurodata guidance in the medium term. By 2025, it will consider the interpretation of core legislative and technical neurotechnology definitions, highlight links to existing ICO guidance, set out our views on emergent risks, and provide sector-specific case studies to highlight good practice.

The new report – ICO tech futures: neurotechnology – details possible future avenues of development for neurotechnology, including in the workplace and employee hiring, the sports sector, personal health and wellbeing, and even marketing and video games.

Notes to editors
  1. The ICO has specific responsibilities set out in the Data Protection Act 2018 (DPA2018), the United Kingdom General Data Protection Regulation (UK GDPR), the Freedom of Information Act 2000 (FOIA), the Environmental Information Regulations 2004 (EIR), the Privacy and Electronic Communications Regulations 2003 (PECR) and a further five acts and regulations.
  2. The ICO can take action to address and change the behaviour of organisations and individuals that collect, use and keep personal information. This includes criminal prosecution, non-criminal enforcement and audit. 
  3. To report a concern to the ICO, telephone our helpline on 0303 123 1113 or go to ico.org.uk/concerns.