What else do we need to consider if Article 22 applies?

In detail

This type of processing is high risk, so you need to carry out a DPIA and introduce other appropriate safeguards, such as providing specific information to individuals about the processing and the rights available to them.

These safeguards and rights are highlighted in the UK GDPR and explained in more detail in the WP29 Guidelines on automated decision-making and profiling.

What’s a DPIA?

A DPIA is a tool to help you assess the risks to individuals from a processing operation and identify ways to address those risks. DPIAs are mandatory for processing that is likely to result in a high risk to individuals. For information about the types of processing requiring a DPIA, please read our guide to the UK GDPR.

A DPIA can help you decide whether the intended processing will be subject to the provisions of Article 22. If you already know that this is the case, you must carry out a DPIA.

Even if Article 22 doesn’t apply (because the processing isn’t solely automated), you are still required to carry out a DPIA if the processing constitutes:

“a systematic and extensive evaluation of personal aspects relating to natural persons which is based on automated processing, including profiling, and on which decisions are based that produce legal effects concerning the natural person or similarly significantly affect the natural person”

Article 35(3)(a)

Carrying out a DPIA is also a good way to meet your accountability obligations, by showing that you have:

  • considered the risks involved in any profiling or automated individual decision-making process; and
  • put procedures in place to mitigate those risks and comply with the UK GDPR requirements.

What do we need to tell individuals and why?

You must inform individuals if you are using their data for solely automated decision-making processes with legal or similarly significant effects. This applies whether you have received the data directly from the individuals concerned or from another source. 

You must also provide meaningful information about the logic involved and what the likely consequences are for individuals.

This type of processing can be invisible to individuals, so in circumstances where it can have a significant impact on them you need to make sure they understand what’s involved, why you use these methods and what the likely results are.

How can we explain complicated processes in a way that people will understand?

Providing ‘meaningful information about the logic’ and ‘the significance and envisaged consequences’ of a process doesn’t mean you have to confuse people with over-complex explanations of algorithms. You should focus on describing:

  • the type of information you collect or use in creating the profile or making the automated decision;
  • why this information is relevant; and
  • what the likely impact is going to be and how it’s likely to affect them.

Example

An online retailer uses automated processes to decide whether or not to offer credit terms for purchases. These processes use information about previous purchase history with the same retailer, together with information held by credit reference agencies, to provide a credit score for an online buyer.

The retailer explains that the buyer’s past behaviour and account transaction history indicate the most appropriate payment mechanism for the individual and the retailer.

Depending on the score, customers may be offered credit terms or have to pay upfront for their purchases.
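To make this concrete, here is a minimal sketch (in Python) of how a simple rule-based decision like the retailer’s could record the inputs it used and generate a plain-language explanation for the buyer. The weights, threshold and field names are invented for the illustration and are not drawn from this guidance or from any real system.

```python
# Illustrative only: a hypothetical, simplified rule-based credit-terms decision.
# The weights, threshold and field names are invented for this sketch.
from dataclasses import dataclass


@dataclass
class Applicant:
    purchase_history_score: int   # 0-100, based on past orders with the retailer
    credit_reference_score: int   # 0-100, supplied by a credit reference agency


def decide_credit_terms(applicant: Applicant) -> dict:
    """Return a decision plus a plain-language explanation of the logic used."""
    combined = round(0.4 * applicant.purchase_history_score
                     + 0.6 * applicant.credit_reference_score)
    offer_credit = combined >= 60  # illustrative threshold

    return {
        "decision": "credit terms offered" if offer_credit else "payment upfront required",
        "inputs_used": {
            "purchase_history_score": applicant.purchase_history_score,
            "credit_reference_score": applicant.credit_reference_score,
        },
        "explanation": (
            "We combined your purchase history with this retailer (40%) and your "
            "credit reference agency score (60%) to produce an overall score of "
            f"{combined}. Scores of 60 or more qualify for credit terms."
        ),
    }


print(decide_credit_terms(Applicant(purchase_history_score=70, credit_reference_score=55)))
```

The point of the sketch is the shape of the output rather than the scoring rule itself: it tells the buyer what information was used, why it is relevant and what the outcome means, which is what ‘meaningful information about the logic’ is aiming at.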

What’s the best way to provide privacy information?

If you carry out the type of automated decision-making described in Article 22(1), you must provide specific information, namely information on:

“the existence of automated decision-making, including profiling, referred to in Article 22(1) and (4) and, at least in those cases, meaningful information about the logic involved, as well as the significance and envisaged consequences of such processing for the data subject.”

Articles 13(2)(f) and 14(2)(g)

Our guidance on the right to be informed explains the different techniques you can use, but it is good practice to deliver privacy information through the same medium you use to collect the personal data.

If you’re operating in an online environment, this shouldn’t be too difficult. Layered privacy notices or just-in-time notifications that tell people what you’re going to do with their data at the point you collect it are good approaches to adopt.

Privacy dashboards are another useful tool. They can inform people how their data is used, give them access to the information you hold about them (including details of any profiles and the data fed into them), and allow them to manage what happens with it.

If you plan to use personal data for any new purposes, you must update your privacy information and proactively bring any changes to people’s attention.

What other rights do individuals have?

If someone is unhappy with a decision you’ve made using a solely automated process, they can ask for a review. It makes sense for you to explain how they can do this at the point you provide the decision.

You need to show how and why you reached the decision, so it’s important that you understand the underlying business rules that apply to automated decision-making techniques.

You must be able to verify the results and provide a simple explanation for the rationale behind the decision.

The systems you use should be able to deliver an audit trail showing the key decision points that formed the basis for the decision. If your system considered any alternative decisions, you need to understand why these were not preferred.
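As an illustration only, the sketch below shows one way key decision points could be captured in an append-only audit trail so that a reviewer can reconstruct how a decision was reached. The step names and fields are hypothetical.

```python
# A minimal sketch of an audit trail for an automated decision, assuming an
# append-only log of key decision points. Step names and fields are hypothetical.
import json
from datetime import datetime, timezone


def log_decision_point(trail: list, step: str, detail: dict) -> None:
    """Append one decision point, with a timestamp, to the audit trail."""
    trail.append({
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "step": step,
        "detail": detail,
    })


trail: list = []
log_decision_point(trail, "inputs_collected",
                   {"purchase_history_score": 70, "credit_reference_score": 55})
log_decision_point(trail, "rule_applied",
                   {"rule": "combined_score >= 60", "combined_score": 61})
log_decision_point(trail, "alternative_considered",
                   {"alternative": "payment upfront",
                    "reason_not_preferred": "score met the credit threshold"})
log_decision_point(trail, "decision_issued", {"outcome": "credit terms offered"})

# Export the trail so a reviewer can see each step that led to the decision.
print(json.dumps(trail, indent=2))
```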

You should have a process in place for individuals to challenge or appeal a decision, and should set out the grounds on which they can appeal. You should also ensure that any review is carried out by someone who is suitably qualified and authorised to change the decision.

The reviewer should take into consideration the original facts on which the decision was based as well as any additional evidence the individual can provide to support their challenge.

Example

Your request/application has been declined.

We have made this decision using an automated scoring system that takes into account the information you provided as well as...

If you wish to discuss or appeal the decision then please contact us using the attached form. You can send this electronically or by post.

Our review panel will consider your request and contact you within xxx days with our findings.

Please attach any evidence that you believe might help your appeal. Examples of the types of information that may be useful in the review process are detailed below…

Information about...

Copies of...

Documentary evidence of...

You must act upon the request without undue delay and at the latest within one month of receipt.

You can extend the time to respond by a further two months if the request is complex or you have received a number of requests from the individual. You must let the individual know, without undue delay and within one month of receiving their request, and explain why the extension is necessary.

Will we need to make any other changes to our systems?

Individuals have a right of access to the same details about automated decision-making that you must provide in your privacy information. If someone submits an access request, Article 15 says that you have to tell them about:

“the existence of automated decision-making, including profiling, referred to in Article 22(1) and (4) and, at least in those cases, meaningful information about the logic involved, as well as the significance and envisaged consequences of such processing for the data subject.”

Article 15(1)(h)

The UK GDPR says that where possible you should be able to provide remote access to a secure system that gives individuals direct access to their personal data. This is also a good way for individuals to check that the data you are using is accurate.

Example

An organisation that uses profiling in order to determine insurance premiums allows its customers to inspect and correct any inaccuracies in the personal data used in the profiles assigned to them. As well as addressing any errors, this also helps improve the precision of the system used to carry out the processing.
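As a purely illustrative sketch, the example below shows one way a ‘view and correct’ facility could work behind a secure login, using a simple in-memory store. The class, method and field names are hypothetical and not taken from any real system.

```python
# Illustrative only: a hypothetical "view and correct" service for profile data.
class ProfileAccessService:
    def __init__(self, profiles: dict):
        self._profiles = profiles  # personal data and profile details, keyed by customer ID

    def view_profile(self, customer_id: str) -> dict:
        """Return the personal data and profile details held for this customer."""
        return dict(self._profiles[customer_id])

    def correct_field(self, customer_id: str, field: str, new_value) -> dict:
        """Apply a customer-submitted correction and keep a record of the change."""
        profile = self._profiles[customer_id]
        old_value = profile.get(field)
        profile[field] = new_value
        profile.setdefault("_corrections", []).append(
            {"field": field, "old": old_value, "new": new_value}
        )
        return dict(profile)


service = ProfileAccessService({
    "cust-001": {"postcode": "AB1 2CD", "annual_mileage": 12000, "risk_band": "B"},
})
print(service.view_profile("cust-001"))
print(service.correct_field("cust-001", "annual_mileage", 9000))
```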

You need to make sure that you have mechanisms in place to diagnose any quality issues or errors and a process to document how these are resolved.  

These mechanisms should also allow you to check that your systems are working as intended and highlight any inaccuracies or bias.

Consider how you can:

  • introduce sample quality checks on the results from your systems to identify and remove any bias or discriminatory effects (see the sketch after this list);
  • ensure any special category data that may have been inferred by the profiling is deleted if it is not required;
  • identify appropriate retention policies for the information you use and keep these under review;
  • implement suitable security measures such as access controls and encryption; and
  • audit your machine-learning tools to check for decision-making rationale and consistency.
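As an illustration of the first point in the list, the sketch below shows one way a sample quality check could compare outcomes across a protected characteristic and flag possible bias for human review. The data, group labels and 20% margin are invented for the example.

```python
# Illustrative only: a sample quality check that compares approval rates across
# groups and flags possible bias. The data and the 20% margin are invented.
from collections import defaultdict

sample_decisions = [
    {"group": "A", "outcome": "approved"},
    {"group": "A", "outcome": "approved"},
    {"group": "A", "outcome": "declined"},
    {"group": "B", "outcome": "approved"},
    {"group": "B", "outcome": "declined"},
    {"group": "B", "outcome": "declined"},
]

totals, approvals = defaultdict(int), defaultdict(int)
for decision in sample_decisions:
    totals[decision["group"]] += 1
    if decision["outcome"] == "approved":
        approvals[decision["group"]] += 1

rates = {group: approvals[group] / totals[group] for group in totals}
print("Approval rates by group:", rates)

# Refer the sample for human review if approval rates diverge by more than the agreed margin.
if max(rates.values()) - min(rates.values()) > 0.2:
    print("Possible bias or discriminatory effect: refer this sample for review.")
```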

Further reading – European Data Protection Board

The European Data Protection Board (EDPB), which has replaced the Article 29 Working Party (WP29), includes representatives from the data protection authorities of each EU member state. It adopts guidelines for complying with the requirements of the GDPR. EDPB guidelines are not directly relevant to, or binding under, the UK regime, but they may still provide helpful guidance on certain issues.

WP29 published guidelines on automated individual decision-making and profiling, which have been endorsed by the EDPB.