The ICO exists to empower you through information.

Latest update - 9 April 2024

We have updated our guidance on inferred special category data. The guidance no longer focuses on the certainty of an inference as a relevant factor to decide whether it counts as special category data. Our underlying policy position has not changed, but we’re explaining it in a different way to make our position clearer.

This update can be found under the heading ‘What about inferences?’ in the ‘What is special category data’ section of this detailed guidance.

In detail

What are the ‘special categories of personal data’?

The UK GDPR singles out some types of personal data as likely to be more sensitive, and gives them extra protection:

  • personal data revealing racial or ethnic origin;
  • personal data revealing political opinions;
  • personal data revealing religious or philosophical beliefs;
  • personal data revealing trade union membership;
  • genetic data;
  • biometric data (where used for identification purposes);
  • data concerning health;
  • data concerning a person’s sex life; and
  • data concerning a person’s sexual orientation.

In this guidance we refer to this as ‘special category data’.

Most of the special categories are not defined and are fairly self-explanatory. However, specific definitions are provided for genetic data, biometric data and health data.

Why is this data special?

It’s not just that this type of information might be seen as more sensitive or ‘private’. The recitals to the UK GDPR explain that these types of personal data merit specific protection. This is because use of this data could create significant risks to the individual’s fundamental rights and freedoms. For example, the various categories are closely linked with:

  • freedom of thought, conscience and religion;
  • freedom of expression;
  • freedom of assembly and association;
  • the right to bodily integrity;
  • the right to respect for private and family life; or
  • freedom from discrimination.

The presumption is that this type of data needs to be treated with greater care because collecting and using it is more likely to interfere with these fundamental rights or open someone up to discrimination. This is part of the risk-based approach of the UK GDPR.

Whilst other data, such as an individual’s financial data, may also be sensitive, it does not raise the same fundamental issues and so does not constitute special category data for the purposes of the UK GDPR. And while data about criminal allegations or convictions may raise some similar issues, it does not constitute special category data as it is covered by separate rules. However, you always need to ensure that when you are processing other types of data, it is fair and meets other UK GDPR requirements (including the separate rules on criminal offence data).

It is also important to be aware that some of the protected characteristics outlined in the Equality Act are classified as special category data. These include race, religion or belief, and sexual orientation. They may also include disability, pregnancy, and gender reassignment in so far as they may reveal information about a person’s health.

These special categories of personal data are framed broadly and may also catch information that is not seen as particularly sensitive. For example, details about an individual’s mental health are likely to be much more sensitive than whether they have a broken leg – but both are data concerning health. Given the potential risks to fundamental rights, it is important that you identify any special category data and approach it carefully, even if you don’t think it is particularly sensitive.

What is genetic data?

The UK GDPR defines genetic data in Article 4(13):

“‘genetic data’ means personal data relating to the inherited or acquired genetic characteristics of a natural person which give unique information about the physiology or the health of that natural person and which result, in particular, from an analysis of a biological sample from the natural person in question”.

Recital 34 says this includes chromosomal, DNA or RNA analysis, or any other type of analysis that enables you to obtain equivalent information. (Ribonucleic acid (RNA) plays an essential part in the coding, decoding, regulation and expression of genes).

Not all genetic information constitutes genetic data. The first question is always whether the genetic information is personal data. A genetic sample itself is not personal data until you analyse it to produce some data. And genetic analysis data is only personal data (and so genetic data) if you can link it back to an identifiable individual.

In most cases, you process genetic information to learn something about a specific identified individual and to inform you about taking some action in relation to them. This is clearly personal data – and special category genetic data – for the purposes of the UK GDPR.

However, the definition of personal data also includes identification by reference to “one or more factors specific to the genetic identity of that natural person”, even without their name or other identifier. So, in practice, genetic analysis which includes enough genetic markers to be unique to an individual is personal data and special category genetic data, even if you have removed other names or identifiers. And any genetic test results which are linked to a specific biological sample are usually personal data, even if the results themselves are not unique to the individual, because the sample is by its nature specific to an individual and provides the link back to their specific genetic identity.

However, there are cases where genetic information is not identifiable personal data. For example, where you have anonymised or aggregated partial genetic sequences or genetic test results (eg for statistical or research purposes), and they can no longer be linked back to a specific genetic identity, sample or profile; a patient record; or to any other identifier.


Further reading – ICO guidance

What is personal data?


Further reading – European Data Protection Board (EDPB)

The EDPB, which has replaced the Article 29 Working Party (WP29), includes representatives from the data protection authorities of each EU member state. It adopts guidelines for complying with the requirements of the GDPR. EDPB guidelines are no longer directly relevant to the UK regime and are not binding under the UK regime. However, they may still provide helpful guidance on certain issues.

The EDPB has not yet adopted guidelines on genetic data under the UK GDPR, but you may find it useful to read the 2004 WP29 working document on genetic data (WP91) and WP29 Opinion 6/2000 on the Genome issue (WP35).


What is biometric data?

Article 9(1) includes in the list of special categories of data:

“biometric data for the purpose of uniquely identifying a natural person”.

The UK GDPR defines biometric data in Article 4(14):

“‘biometric data’ means personal data resulting from specific technical processing relating to the physical, physiological or behavioural characteristics of a natural person, which allow or confirm the unique identification of that natural person, such as facial images or dactyloscopic data”.

The term ‘dactyloscopic data’ means fingerprint data.


A gym introduces an electronic fingerprint scanning system. Members scan their fingerprint in order to get through the entrance turnstiles. This system is processing biometric data to identify individual members, so the gym needs a valid condition for processing that special category data. 


A school introduces an electronic fingerprint scanning system to charge students for their school meals. This system is processing biometric data to identify the individual students, so the school needs a valid condition for processing that special category data.

Facial images and fingerprint data are just two examples; the definition is not exhaustive. Many other types of physical, physiological or behavioural ‘fingerprinting’ fall within it.

Examples of physical or physiological biometric identification techniques:

  • facial recognition;
  • fingerprint verification;
  • iris scanning;
  • retinal analysis;
  • voice recognition; and
  • ear shape recognition.

Examples of behavioural biometric identification techniques:

  • keystroke analysis;
  • handwritten signature analysis;
  • gait analysis; and
  • gaze analysis (eye tracking).

If you process digital photographs of individuals, this is not automatically biometric data, even if you use them for identification purposes. Although a digital image may allow for identification using physical characteristics, it only becomes biometric data if you carry out “specific technical processing”. Usually this involves using the image data to create an individual digital template or profile, which you then use for automated image matching and identification.
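The two-step pattern described above – “specific technical processing” that derives a template from an image, followed by automated matching against enrolled templates – can be pictured in code. The sketch below is a purely conceptual illustration, not a real biometric algorithm: the “images” are toy pixel arrays and the brightness-histogram “template” is invented for illustration only.

```python
# Conceptual illustration only -- NOT a real biometric algorithm.
# Step 1: "specific technical processing" derives a numeric template
# from raw image data. Step 2: automated matching compares that
# template against enrolled templates to identify an individual.

def make_template(pixels, bins=4):
    """Toy 'specific technical processing': reduce raw pixel values
    (0-255) to a normalised brightness histogram."""
    counts = [0] * bins
    for p in pixels:
        counts[min(p * bins // 256, bins - 1)] += 1
    total = len(pixels)
    return [c / total for c in counts]

def match(template, enrolled, threshold=0.1):
    """Toy automated matching: return the enrolled identity whose
    template is closest, if within the distance threshold."""
    best_id, best_dist = None, threshold
    for identity, stored in enrolled.items():
        dist = sum(abs(a - b) for a, b in zip(template, stored))
        if dist < best_dist:
            best_id, best_dist = identity, dist
    return best_id

# Enrolment: storing templates linked to identities is the point at
# which biometric data is processed for unique identification.
enrolled = {
    "member_001": make_template([10, 20, 200, 210, 90, 100]),
    "member_002": make_template([240, 250, 245, 230, 235, 220]),
}

probe = make_template([12, 22, 198, 212, 92, 98])
print(match(probe, enrolled))
```

A photograph held on file involves no such processing; it is the creation and use of the template for matching that brings the data within the biometric data definition.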

All biometric data is personal data, as it relates to an identified or identifiable individual. Biometric data is also special category data whenever you process it “for the purpose of uniquely identifying a natural person”. This means that biometric data will be special category data in many cases.

If you use biometric data to learn something about an individual, authenticate their identity, control their access, make a decision about them, or treat them differently in any way, it is likely that this will be processing for the purpose of uniquely identifying that individual. It will therefore involve processing special category data, which requires compliance with Article 9.

If you believe you have a specific use case where you are processing biometric data for one of the purposes outlined above, but not for the purpose of uniquely identifying a natural person (so that you are not processing special category data), you should document your organisation’s rationale in your DPIA, alongside a risk-based analysis and evidence for this decision. In doing so, you should be able to clearly demonstrate how you comply with applicable data protection law and why your processing should not be seen as being for the purposes of unique identification. If you identify a high risk that you cannot mitigate, you must consult the ICO before starting the processing.

In more detail

We are planning to produce more detailed ICO guidance on processing biometric data.

We have published an Opinion on the use of live facial recognition technology by law enforcement in public places following the High Court judgment R (Bridges) v Chief Constable of South Wales Police & others [2019] EWHC 2341 (Admin).

EDPB guidelines are no longer directly relevant to the UK regime and are not binding under the UK regime. However, they may still provide helpful guidance on certain issues. The EDPB has not yet adopted guidelines on biometrics under the GDPR, but you may find it useful to read WP29 Opinion 03/2012 on developments in biometric technologies (WP 193) and WP29 Opinion 02/2012 on facial recognition in online and mobile services (WP 192).

What is health data?

The UK GDPR defines health data in Article 4(15):

“‘data concerning health’ means personal data related to the physical or mental health of a natural person, including the provision of health care services, which reveal information about his or her health status”.

Health data can be about an individual’s past, current or future health status. It not only covers specific details of medical conditions, tests or treatment, but includes any related data which reveals anything about the state of someone’s health.

Health data can therefore include a wide range of personal data, for example:

  • any information on injury, disease, disability or disease risk, including medical history, medical opinions, diagnosis and clinical treatment;
  • medical examination data, test results, data from medical devices, or data from fitness trackers;
  • information collected from the individual when they register for health services or access treatment;
  • appointment details, reminders and invoices which tell you something about the health of the individual. These fall under ‘the provision of health care services’ but must reveal something about a person’s health status. For example, a GP or hospital appointment in isolation will not tell you anything about a person’s health as it may be a check-up or screening appointment. However, you could reasonably infer health data from an individual’s list of appointments at an osteopath clinic or from an invoice for a series of physiotherapy sessions; and
  • a number, symbol or other identifier assigned to an individual to uniquely identify them for health purposes (eg an NHS number, or Community Health Index (CHI) number in Scotland), if combined with information revealing something about the state of their health.

What about criminal offence data?

Personal data about criminal allegations, proceedings or convictions is not special category data. However, there are similar rules and safeguards for processing this type of data, to deal with the particular risks associated with it. For more information, see our separate guidance on criminal offence data.

What about inferences?

The UK GDPR is clear that special category data includes not only personal data that specifies relevant details, but also personal data revealing or concerning these details.

If the information in itself clearly reveals or concerns something specific and definite about one of the special categories, that counts as special category data. For example, a statement that Mr X is married to Mr Y clearly reveals information about Mr X and Mr Y’s sexual orientation.

If the information itself does not clearly reveal or concern something about one of the special categories, it may still be possible to infer or guess details about someone that do fall within those categories. For instance, you may be able to infer a person’s religion or ethnicity from their name or images of them. This is because many surnames or modes of dress can be associated with a particular ethnicity or religion.

However, you do not have to treat all such names or images as special category data in every instance. For example, you do not need a special category condition just to hold these names or images on a customer database.

Whether or not inferred data counts as special category data and triggers Article 9 depends on whether:

  • your processing intends to make an inference linked to one of the special categories of data; or
  • you intend to treat someone differently on the basis of inferred information linked to one of the special categories of data.

If this is the case, then you are processing special category data regardless of how confident you are that the inference is correct.
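The two-limb test above can be expressed as a simple decision rule. The function below is a conceptual sketch of the test as described in this guidance, not an ICO-provided tool; the category labels and parameter names are invented for illustration.

```python
# Conceptual sketch of the inference test in this guidance: inferred
# data triggers Article 9 if you intend to infer something linked to a
# special category, OR you treat someone differently on the basis of
# such an inference -- regardless of how confident you are that the
# inference is correct.

SPECIAL_CATEGORIES = {
    "racial or ethnic origin", "political opinions",
    "religious or philosophical beliefs", "trade union membership",
    "genetic data", "biometric data (for identification)",
    "health", "sex life", "sexual orientation",
}

def is_special_category_inference(intends_inference, inferred_category,
                                  treats_differently):
    """Return True if the processing of inferred data falls within
    Article 9 under the two-limb test described in this guidance."""
    linked = inferred_category in SPECIAL_CATEGORIES
    return linked and (intends_inference or treats_differently)

# Merely holding a name on a customer database, drawing no inference
# and treating no one differently, does not trigger the test:
print(is_special_category_inference(
    False, "religious or philosophical beliefs", False))

# Targeting adverts based on inferred political opinions does:
print(is_special_category_inference(True, "political opinions", True))
```

Note that the confidence or accuracy of the inference does not appear anywhere in the rule: the trigger is the intention to infer, or the differential treatment, not the correctness of the result.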

If you carry out any form of profiling which infers things like ethnicity, beliefs, politics, health status (condition or risks), sexual orientation or sex life, you will be processing special category data and must identify an Article 9 condition for processing.


A social media user posts a status update on their profile saying that they are at a place of worship which is typically visited by people of a particular religious belief.

By itself, the user’s statement is unlikely to be considered special category data. This is because the visit alone doesn’t necessarily mean that the user holds a particular religious belief.

However, if the social media platform chooses to infer something about the user’s religious beliefs, or decides to treat them differently on this basis (eg to send them targeted advertising), then the processing will involve special category data.



For the purposes of content targeting, a social media provider uses its platform activity data to make inferences about the political beliefs of its users.

Regardless of whether its inferences about the users’ political beliefs are correct, assigning inferred political interests to users and processing their information on this basis (eg to target them with specific adverts and news stories) means that the provider is processing special category data. The provider must identify a condition to process its users’ special category data in this way.

If you are using inferred data to influence how you deal with people, remember that you still have a duty under the accuracy and data minimisation principles to ensure that you are not using personal data that is inaccurate, inadequate or irrelevant in relation to the purposes for which it is being processed.

Even if you’re not drawing an inference or treating people differently, there may still be an obvious risk that other people could (rightly or wrongly) infer something from the data that might be considered sensitive or private. If this is the case (especially if this relates to the special categories of data), you should think carefully about how you ensure fairness and whether there’s anything more you can do to minimise those privacy risks.