The ICO exists to empower you through information.

Neurotechnologies have continued to proliferate in the health and research sectors over the past decade and may soon become part of our daily lives. Our workplaces, home entertainment and wellbeing services may use neurotechnology to provide more personalised services in the years to come.

As the UK’s data protection regulator, the Information Commissioner’s Office (ICO) aims to increase public trust in how organisations process personal information through responsible practice. We want to empower people to safely share their information and use innovative products and services that will drive our economy and our society. In our ICO25 strategy, we committed to set out our views on emerging technologies to reduce burdens on businesses, support innovation and prevent harms.

This report specifically considers the gathering, analysis and use of information that is directly produced by the brain and nervous system, referred to as neurodata. This ranges from monitoring concentration levels at work to more distant concepts such as smart prosthetics that mimic brain patterns for greater responsivity. The report is a short introductory guide for those who wish to know more about neurotechnologies from a regulatory perspective. It does not consider the implications of neurodata inferred from broader biometric information, such as eye movements, gait or heart rate tracking, which formed part of our earlier work on biometric technologies.

We examine neurotechnologies and neurodata and analyse their impact on privacy. We explore plausible scenarios and use cases for emerging neurotechnologies and, through these, raise the following issues:

  • a significant risk of discrimination emerging in non-medical sectors, such as the workplace, as complex systems and potentially inaccurate information become embedded in neurotechnology products and services. There may also be an increasing risk that, even when accurate information is used, unfair decisions are made that discriminate in ways not previously defined;
  • the need for people to clearly understand the technology and terminology. This enables organisations to meet their requirements for transparency, and enables people to understand their individual rights. Without this, people will be unable to provide clear consent for processing when appropriate and organisations may struggle to address the challenges of automated processing of neurodata; and
  • a need for regulatory co-operation and clarity in an area that is scientifically, ethically and legally complex.

We will address these areas of concern through:

  • ongoing engagement with key stakeholders across industry, regulation, academia and civil society. This will include inviting organisations to work with our Regulatory Sandbox to engineer data protection into these technologies;
  • engagement with the public to better understand their knowledge and concerns about neurotechnologies and privacy; and
  • producing neurotechnology specific guidance in the longer term. This will address the need for regulatory clarity and set clear expectations about the responsible and compliant use of neurodata.

We will address other issues elsewhere as we build on our Artificial Intelligence (AI) Framework and our forthcoming guidance on workplace surveillance. This includes potential neurodiscrimination arising from inaccurate information or inappropriate processing and decision-making.