The ICO exists to empower you through information.

We issued a closed call for views to identified organisations in August 2022. We drew up a list of over 40 organisations across central government, the private sector, civil society, academia and global regulators. We used desk-based research and internal engagement to identify appropriate consultees, and we received responses from all of these sectors.

Where responses were particularly informative or raised issues we felt would benefit from further exploration, we set up interviews to develop our understanding. These gave us insight into a range of issues, including the priorities of key stakeholders, as well as emerging public and regulatory concerns about the use of neurotechnologies and BCIs.

Stakeholders identified the following areas as key challenges to the effective and appropriate use of neurotechnologies (we may only be able to address some of these within our regulatory remit):

  • Stakeholders perceive a lack of regulatory coherence across various UK regulators such as the ICO, the MHRA, the Competition and Markets Authority (CMA) and the Office for Product Safety and Standards (OPSS), as well as a lack of legislative coherence across different data protection regimes on a global scale.
  • Linked to the above, they see technology- and sector-specific guidance as desirable for building both public and stakeholder understanding of data protection compliance best practice around the use of neurotechnologies.27
  • The research and the call for views focused on the means of gathering neurodata. However, stakeholders have continued to highlight the need to address the risks associated with processing data via AI, algorithms and machine learning. In particular, they identified the risk that systemic and active bias will be overlooked as technologies are presented as ‘new’ alternatives to previously flawed means of processing, without those underlying issues being addressed.
  • They noted that the complexities of matching established user bases to emerging markets are likely to inhibit the uptake of commercially focused neurotechnologies in the consumer space.
  • They highlighted the risk that broader ethical discourses on the potential need for new neurorights may obscure the current and near-future need for effective regulatory action.
  • They believe that commercial uses of neurotechnology will become major sources of data, not only for consumer applications but also for medical research. This will potentially create complex data flows and a lack of transparency for the public about the use and re-use of their data.

Alongside this engagement, we conducted bibliometric research using tools such as Lens and Google Scholar to identify quantitative data and to understand the organisations and trends driving neurotechnologies now and in the future. This research is presented in Annex A.

Key drivers include:

  • the increased affordability of sensor technology and a move towards lightweight, portable and non-invasive (and minimally invasive) devices capable of reading viable information;
  • increased sophistication of supporting technologies such as AI and machine learning, 5G and wireless connectivity, VR and AR devices and cloud storage;
  • a lack of neurodata specific regulation globally, allowing a significant breadth of approaches across technologies and sectors; and
  • ‘datafication’ of people driving new markets as the potential for further large scale data sets emerges.

Using the above, we developed initial scenarios and shared them with an external panel of experts. This external workshop drew on red teaming methodology to critically examine the scenarios and their assumptions, from the drivers used to the sectors and technologies focused on. We used the workshop's findings to develop the scenarios presented in this report.


27 For example, some suggested a risk-based approach to guidance, arguing that this would allow flexibility regarding purpose. What counts as a high risk of misidentification for security purposes could differ significantly from inaccurate data used for advertising purposes.