The ICO exists to empower you through information.


In August 2023 we published our draft guidance on biometric data and biometric technologies.

This draft guidance explained how data protection law applies when biometric data is used in biometric recognition systems.

Alongside the draft guidance we launched a public consultation seeking views on the guidance, and the draft summary economic impact assessment that accompanied it.

The consultation was open for nine weeks between Friday 18 August and Friday 20 October 2023.

We received a total of 49 responses, the majority of which came from employers, suppliers of biometric solutions or industry or trade associations.

We are grateful to those who took the time to respond, and those respondents who were willing to discuss with us the points they made in further detail.

| Category of respondent | Responses |
| --- | --- |
| An organisation or person employing workers | 12 |
| A supplier of biometric solutions | 10 |
| A representative of a professional, industry or trade association | 9 |
| An individual acting in a private capacity (eg someone providing their views as a member of the public) | 3 |
| An individual acting in a professional capacity | 3 |
| An academic | 3 |
| A legal representative of a developer or adopter of biometric systems | 3 |
| An organisation representing the interests of employees, workers or the self-employed (eg charity, employment advocacy organisation) | 3 |
| Civil society group | 2 |
| Regulator | 1 |

About the consultation

The main findings of the consultation are summarised below:

  • We received clear feedback that the draft guidance set out clearly what data protection law defines as biometric data, with 81% of respondents agreeing or strongly agreeing with that statement.
  • A clear majority agreed that this clarity extended to the purpose test for special category biometric data (65%), and the concept of unique identifiability (61%). We also received welcome feedback on where confusion remained on this issue.
  • Over a quarter of all respondents disagreed with our use of industry terms to indicate where they align with data protection law. We have taken this feedback on board.
  • We received clear feedback that the guidance needed to provide more context-specific commentary on compliance with all data protection obligations when using biometric recognition systems.
  • We received positive feedback on the case studies, with 63% of respondents saying these were clear and realistic.

Key themes

Data protection law definitions

Summary of stakeholder response:

A high proportion of responses agreed that the draft guidance clearly set out what data protection law defines as biometric data. Further comments on this question focused on the following areas:

  • requests for examples of the use of behavioural biometrics;
  • further detail on when a photograph or a voice recording could be considered biometric data; and
  • critique of the legal definition of biometric data in data protection law.

Our response

  • We have clarified that this guidance is specifically for biometric recognition. The next phase of our work on biometric data and biometric technologies will focus on biometric classification, which will include behavioural biometrics.
  • Our guidance states that photographs should not systematically be treated as biometric data. We have included more detail on how biometric recognition systems work to explain how the elements of the biometric data definition apply.
  • Our decision to provide guidance on biometric data is to assist organisations in interpreting their legal obligations under data protection law. Any change to these obligations is the responsibility of Parliament.

Further stakeholder responses on definitions: 

Most respondents agreed that the draft guidance clearly set out the different tests of identifiability (for personal information) and unique identification (for biometric data). However, agreement dropped from 81% to just under two-thirds (65%). We saw a similar reduction in the proportion of respondents who said the draft guidance clearly set out the purpose test for when biometric data becomes special category data. Further comments on this question raised the following points:

  • There was confusion over unique identification, and to what extent this referred to verification scenarios.
  • There were questions about why the purpose test was relevant to the special category designation, when it isn’t relevant to other forms of special category information.
  • There were requests for scenarios where biometric data would not be considered special category data.

Our response

  • We have provided further detail on the concept of unique identification and how it applies to biometric data and to all biometric recognition use cases.
  • The further detail on unique identification now links into the purpose test, where we have reviewed our drafting to clarify how this test applies.
  • We have clarified in the opening section that the focus of this guidance is on biometric recognition. This involves all scenarios where the purpose test is engaged, and special category biometric data will be used. Phase 2 of this project, guidance on biometric classification, will focus on scenarios that do not engage the purpose test.

Use of industry terminology

Summary of stakeholder response:

While a significant majority of respondents agreed with our approach of using industry terminology, where appropriate, to clarify the scope of data protection law, over a quarter of responses disagreed with this approach.

Respondents requested further clarity on where specific industry terms came from, and asked that any mention of a standard explicitly reference the standard in question.

There was also a concern that our use of industry terms risked undermining or narrowing the interpretation of data protection definitions, specifically where biometric data should be considered special category biometric data.

Our response

  • We have carefully considered our use of industry terms in this guidance, specifically the use of the term ‘biometric recognition’.
  • We are confident that our use of this term to describe scenarios where special category biometric data will be used does not constrain our interpretation of data protection law.
  • This approach has been adopted as a device to assist an audience who may have more experience with industry standards and terms than data protection.
  • We have included the full title of industry standards, as well as hyperlinks to them where relevant, for organisations wishing to undertake further reading.

Data protection obligations when using biometric data

Summary of stakeholder response:

We received clear feedback that the audience for the draft guidance required further clarity on data protection obligations when using biometric data. Of all respondents, 41% disagreed that the draft guidance was clear on wider data protection obligations, while 43% reported it was clear.

We received criticism for our reference to the guidance not being a ‘comprehensive’ guide to data protection compliance.

We also received several questions on the scope of data protection rights, and specific examples of appropriate technical and organisational measures organisations could consider.

Our response

  • We have clarified that this guidance refers to data protection concepts and principles addressed in detail in other guidance that forms part of our guide to the UK GDPR.
  • As data protection law is principles-based, new readers may benefit from the further reading references.
  • Following feedback, we have included further references to some of these products within the biometric data guidance (for example considerations around lawfulness, transparency and consent as a legal basis and condition for processing).
  • We have included a new section on the application of data subject rights, with specific reference to biometric data, and some of the challenges this can pose when responding to rights requests.
  • There is also a further section in the guidance on the risks to rights and freedoms, with specific reference to biometric recognition scenarios.
  • We have included further detail on keeping biometric data secure, including specific references to privacy enhancing technologies (PETs) for biometric data. This has included references to international standards relating to information security and biometric systems.
  • We have made clear that adopting PETs does not remove accountability obligations such as DPIAs. However, appropriate implementation of PETs will be a relevant factor when considering mitigations to identified risks to individuals.