Article 35(1) says that you must do a DPIA where a type of processing is likely to result in a high risk to the rights and freedoms of individuals:
“Where a type of processing in particular using new technologies, and taking into account the nature, scope, context and purposes of the processing, is likely to result in a high risk to the rights and freedoms of natural persons, the controller shall, prior to the processing, carry out an assessment of the impact of the envisaged processing operations on the protection of personal data. A single assessment may address a set of similar processing operations that present similar high risks.”
What does ‘high risk’ mean?
Risk in this context is about the potential for any significant physical, material or non-material harm to individuals. See What is a DPIA? for more information on the nature of the risk.
To assess whether something is ‘high risk’, the GDPR is clear that you need to consider both the likelihood and severity of any potential harm to individuals. ‘Risk’ implies a more than remote chance of some harm. ‘High risk’ implies a higher threshold, either because the harm is more likely, or because the potential harm is more severe, or a combination of the two. Assessing the likelihood of risk in that sense is part of the job of a DPIA.
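The likelihood-and-severity idea is often visualised as a risk matrix. A minimal sketch of that technique follows; the three-point scales, the scoring and the thresholds are illustrative assumptions for this example only, not values taken from the GDPR or from ICO guidance:

```python
# Illustrative risk matrix: combine likelihood and severity of potential harm
# into an overall risk rating. The scales and thresholds are assumptions made
# for this sketch, not prescribed by the GDPR or the ICO.

LEVELS = {"low": 1, "medium": 2, "high": 3}

def risk_level(likelihood: str, severity: str) -> str:
    """Rate overall risk from the likelihood and severity of potential harm."""
    score = LEVELS[likelihood] * LEVELS[severity]
    if score >= 6:       # e.g. high likelihood x medium severity, or high x high
        return "high"
    if score >= 3:       # e.g. medium x medium, or high likelihood x low severity
        return "medium"
    return "low"

# 'High risk' can arise because harm is more likely, more severe, or both:
print(risk_level("high", "medium"))   # high
print(risk_level("low", "high"))      # medium
print(risk_level("low", "low"))       # low
```

The point of the matrix is simply that either axis can push a risk into the ‘high’ band; a low-likelihood but catastrophic harm may still warrant the same treatment as a likely, moderate one.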
However, the question for these initial screening purposes is whether the processing is of a type likely to result in a high risk.
What does ‘likely to result in a high risk’ mean?
The GDPR doesn’t define ‘likely to result in high risk’. However, the important point here is not whether the processing is actually high risk or likely to result in harm – that is the job of the DPIA itself to assess in detail. Instead, the question is a more high-level screening test: are there features which point to the potential for high risk? You are screening for any red flags which indicate that you need to do a DPIA to look at the risk (including the likelihood and severity of potential harm) in more detail.
Article 35(3) lists three examples of types of processing that automatically require a DPIA, and the ICO has published a list under Article 35(4) setting out ten more. There are also European guidelines with criteria to help you identify other likely high-risk processing.
This does not mean that these types of processing are always high risk, or are always likely to cause harm – just that there is a reasonable chance they may be high risk and so a DPIA is required to assess the level of risk in more detail.
If your intended processing is not described in Article 35(3) of the GDPR, the ICO list or the European guidelines, then ultimately it’s up to you to decide whether your processing is of a type likely to result in high risk, taking into account the nature, scope, context and purposes of the processing. If in any doubt, we would always recommend that you do a DPIA to ensure compliance and encourage best practice.
What types of processing automatically require a DPIA?
Article 35(3) sets out three types of processing which always require a DPIA:
Systematic and extensive profiling with significant effects:
“(a) any systematic and extensive evaluation of personal aspects relating to natural persons which is based on automated processing, including profiling, and on which decisions are based that produce legal effects concerning the natural person or similarly significantly affect the natural person.”
Large scale use of sensitive data:
“(b) processing on a large scale of special categories of data referred to in Article 9(1), or of personal data relating to criminal convictions and offences referred to in Article 10.”
Public monitoring:
“(c) a systematic monitoring of a publicly accessible area on a large scale.”
What other factors might indicate likely high risk?
The Article 29 working party of EU data protection authorities (WP29) published guidelines with nine criteria which may act as indicators of likely high risk processing:
Evaluation or scoring.
Automated decision-making with legal or similar significant effect.
Systematic monitoring.
Sensitive data or data of a highly personal nature.
Data processed on a large scale.
Matching or combining datasets.
Data concerning vulnerable data subjects.
Innovative use or applying new technological or organisational solutions.
Preventing data subjects from exercising a right or using a service or contract.
For more guidance on these factors, read the WP29 guidelines (WP248). They give background on the reasoning for the high-risk indicators, and examples of processing likely to result in high risk.
In most cases, a combination of two of these factors indicates the need for a DPIA. However, this is not a strict rule.
You may be able to justify a decision not to carry out a DPIA if you are confident that the processing is nevertheless unlikely to result in a high risk, but you should document your reasons.
On the other hand, in some cases you may need to do a DPIA if only one factor is present – and it is good practice to do so.
What does the ICO consider likely to result in high risk?
The ICO is required by Article 35(4) to publish a list of processing operations that require a DPIA. This list complements and further specifies the criteria referred to in the European guidelines. Some of these operations require a DPIA automatically, and some only when they occur in combination with one of the other items, or any of the criteria in the European guidelines referred to above:
Innovative technology: processing involving the use of innovative technologies, or the novel application of existing technologies (including AI). A DPIA is required where this processing is combined with any of the criteria from the European guidelines.
Denial of service: decisions about an individual’s access to a product, service, opportunity or benefit that are based to any extent on automated decision-making (including profiling), or that involve the processing of special category data.
Large-scale profiling: any profiling of individuals on a large scale.
Biometrics: any processing of biometric data. A DPIA is required where this processing is combined with any of the criteria from the European guidelines.
Genetic data: any processing of genetic data, other than that processed by an individual GP or health professional for the provision of health care direct to the data subject. A DPIA is required where this processing is combined with any of the criteria from the European guidelines.
Data matching: combining, comparing or matching personal data obtained from multiple sources.
Invisible processing: processing of personal data that has not been obtained direct from the data subject in circumstances where the controller considers that compliance with Article 14 would prove impossible or involve disproportionate effort. A DPIA is required where this processing is combined with any of the criteria from the European guidelines.
Tracking: processing which involves tracking an individual’s geolocation or behaviour, including but not limited to the online environment. A DPIA is required where this processing is combined with any of the criteria from the European guidelines.
Targeting of children or other vulnerable individuals: the use of the personal data of children or other vulnerable individuals for marketing purposes, profiling or other automated decision-making, or if you intend to offer online services directly to children.
Risk of physical harm: where the processing is of such a nature that a personal data breach could jeopardise the physical health or safety of individuals.
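Taken together, the screening rules above (Article 35(3) types always require a DPIA; some ICO-list items do too, while others only in combination with a criterion from the European guidelines; and meeting two or more of the WP29 criteria normally indicates a DPIA) can be sketched as a simple checklist. This is a hypothetical simplification for illustration: the feature names are abbreviations of the lists above, and real screening involves judgement, not a lookup:

```python
# Hypothetical DPIA screening sketch, simplified from the rules above.
# Feature names are shorthand for the list items; this is not a substitute
# for assessing the actual nature, scope, context and purposes of processing.

ART_35_3 = {"systematic_extensive_profiling", "large_scale_special_category",
            "large_scale_public_monitoring"}          # always require a DPIA

ICO_STANDALONE = {"denial_of_service", "large_scale_profiling", "data_matching",
                  "targeting_children_or_vulnerable", "risk_of_physical_harm"}

# ICO-list items requiring combination with a European guidelines criterion:
ICO_COMBINED = {"innovative_technology", "biometrics", "genetic_data",
                "invisible_processing", "tracking"}

def dpia_required(features: set, wp29_criteria_met: int) -> bool:
    """Screen a proposed processing operation for the DPIA requirement."""
    if features & ART_35_3:
        return True
    if features & ICO_STANDALONE:
        return True
    if features & ICO_COMBINED and wp29_criteria_met >= 1:
        return True
    # WP29 rule of thumb: two or more criteria usually indicate a DPIA,
    # though this is not a strict rule either way.
    return wp29_criteria_met >= 2

print(dpia_required({"biometrics"}, wp29_criteria_met=0))  # False - needs combination
print(dpia_required({"biometrics"}, wp29_criteria_met=1))  # True
print(dpia_required(set(), wp29_criteria_met=2))           # True
```

Even where this sort of checklist returns ‘no DPIA required’, you may still choose to do one as good practice, and you should document your screening reasons either way.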
You should also be aware that the data protection authorities in other EU member states will publish lists of the types of processing that require a DPIA in their jurisdiction.
Recital 91 says innovative technology concerns new developments in technological knowledge in the world at large, rather than technology that is new to you, and its use can trigger the need to carry out a DPIA. This is because using such technology can involve novel forms of data collection and use, possibly with a high risk to individuals’ rights and freedoms. The personal and social consequences of deploying a new technology may be unknown, and a DPIA can help the controller to understand and control such risks.
Examples of processing using innovative technology include:
artificial intelligence, machine learning and deep learning;
connected and autonomous vehicles;
intelligent transport systems;
smart technologies (including wearables);
market research involving neuro-measurement (e.g. emotional response analysis and brain activity);
some ‘internet of things’ applications, depending on the specific circumstances of the processing.
It is not just cutting-edge technology that might be classed as innovative. If a controller implements existing technology in a new way, this could result in high risks that, unless a DPIA is done, may not be identified and dealt with. For example, doing a DPIA as part of a project to design and deploy a large-scale database system that processes customer details could:
help in deciding what proportionate security measures to implement (e.g. protective monitoring); and
act as a reminder that GDPR-compliant contracts need to be in place with any processors.
The ICO list of high-risk processing operations requires a DPIA if your processing involves innovative technology in combination with another criterion from the European guidelines (e.g. evaluation or scoring, or sensitive data).
However, in some cases you may decide that your intended use of innovative technology requires a DPIA without any other factors. As controller, if no mandatory obligation applies, you are responsible for assessing whether your intended processing is ‘likely to result in high risk’.
The GDPR does not define the concept of a legal or similarly significant effect. However, Article 29 working-party guidelines on this phrase in the context of profiling provisions give some further guidance.
In short, it is something that has a noticeable impact on an individual and can significantly affect their circumstances, behaviour or choices.
A legal effect is something that affects a person’s legal status or legal rights. A similarly significant effect might include something that affects a person’s financial status, health, reputation, access to services or other economic or social opportunities.
Decisions that have little impact generally could still significantly affect more vulnerable people, such as children.
‘Invisible processing’ occurs when you obtain personal data from somewhere other than directly from the individual themselves, and you don’t provide them with the privacy information required by Article 14. The processing is ‘invisible’ because the individual is unaware that you are collecting and using their personal data, even if you publish a privacy notice on your website.
This processing results in a risk to the individual’s interests as they cannot exercise any control over your use of their data. In particular, they are unable to use their data protection rights if they are unaware of the processing. This is true even if the processing itself is unlikely to have any negative effect.
You may also be at risk of breaching the fairness and transparency requirements of the first data protection principle if the processing, or any outcome from it, may not be reasonably foreseen by the individual.
For these reasons, processing in this way is only permitted by the GDPR in limited circumstances. These include where providing the privacy information proves impossible or would involve a disproportionate effort.
Circumstances when it is impossible to provide privacy information will only arise rarely, for example where you have no contact details for individuals and no reasonable means of obtaining them.
It is important that you can demonstrate compliance with individuals’ right to be informed. So, if you are proposing processing operations that involve the use of data obtained from third parties, you must first carefully consider whether you can provide privacy information to the individuals. If you intend to rely on the exception for disproportionate effort, you must be able to justify this, and you must take other measures to protect people’s rights. In particular, you must still publish your privacy information, and carry out a DPIA.
Your DPIA will help you assess and demonstrate whether you are taking a proportionate approach. It will help you consider how best to mitigate the impact on individuals’ ability to exercise control over their data, and whether you can take other measures to support the exercise of their rights. It will also help you demonstrate how you comply with fairness and transparency requirements.
In more detail – ICO guidance
Read the ICO guidance on the right to be informed, which includes a section on disproportionate effort and other exceptions and exemptions.
Further reading – European Data Protection Board
See the WP29 guidelines on Transparency, which have been endorsed by the EDPB.
What does ‘vulnerable individual’ mean?
Individuals can be vulnerable where circumstances may restrict their ability to freely consent or object to the processing of their personal data, or to understand its implications.
Most obviously, children are regarded as vulnerable to the processing of their personal data since they may be less able to understand how their data is being used, anticipate how this might affect them, and protect themselves against any unwanted consequences. This can also be true of other vulnerable sections of the population such as elderly people, or those with certain disabilities.
Even if the individuals are not part of a group you might automatically consider vulnerable, an imbalance of power in their relationship with you can cause vulnerability for data protection purposes if they believe that they will be disadvantaged if the processing doesn’t go ahead.
One group who may count as vulnerable in this sense are employees. The European guidelines on DPIAs (WP248) explain why employees could be considered vulnerable data subjects where a power imbalance means they cannot easily consent or object to the processing of their data by an employer. This type of vulnerability could also arise due to an individual’s financial situation (e.g. credit rating) or the specific context of the processing (e.g. patients receiving medical care).
Processing the data of individuals who may be deemed vulnerable is one of the criteria in the European guidelines for processing likely to result in high risk. If you think your processing will involve vulnerable individuals, then a DPIA will be required if any of the other criteria, or any of the operations on our list, also apply.
A sales firm provides company cars for its employees, and intends to deploy vehicles with location-tracking features, allowing managers to monitor the movement and whereabouts of employees at all times. Employees are also permitted to use the vehicles for private purposes outside working hours.
The processing involves tracking each individual’s geolocation, in a context where they are vulnerable to a power imbalance with the controller. So this engages the requirement for a DPIA to identify and mitigate the risks to the employees’ rights and freedoms.
A DPIA may not be required if you are processing on the basis of legal obligation or public task. However, this exception only applies if:
you have a clear statutory basis for the processing;
the legal provision or a statutory code specifically provides for and regulates the processing operation in question;
you are not subject to other obligations to complete DPIAs derived from specific legislation, such as the Digital Economy Act 2017; and
a data protection risk assessment was carried out as part of the impact assessment when the legislation was adopted. This may not always be clear. So in the absence of any clear and authoritative statement on whether such an assessment was done, we recommend you err on the side of caution and do a DPIA to ensure you consider how best to mitigate any high risk.
A DPIA is also not required if you have already done a substantially similar DPIA. You need to be confident that you can demonstrate that the nature, scope, context and purposes of the processing are all similar.
The ICO can also issue a list of processing operations which do not require a DPIA. We have the power to establish this type of list, but we have not done so yet. We may consider a list in future in the light of our experience of how the DPIA provisions are being interpreted in practice.