At a glance

  • The GDPR has provisions on:
    • automated individual decision-making (making a decision solely by automated means without any human involvement); and
    • profiling (automated processing of personal data to evaluate certain things about an individual). Profiling can be part of an automated decision-making process.
  • The GDPR applies to all automated individual decision-making and profiling.
  • Article 22 of the GDPR has additional rules to protect individuals if you are carrying out solely automated decision-making that has legal or similarly significant effects on them.
  • You can only carry out this type of decision-making where the decision is:
    • necessary for the entry into or performance of a contract; or
    • authorised by Union or Member State law applicable to the controller; or
    • based on the individual’s explicit consent.
  • You must identify whether any of your processing falls under Article 22 and, if so, make sure that you:
    • give individuals information about the processing;
    • introduce simple ways for them to request human intervention or challenge a decision; and
    • carry out regular checks to make sure that your systems are working as intended.

Checklists

All automated individual decision-making and profiling

To comply with the GDPR...

☐ We have a lawful basis to carry out profiling and/or automated decision-making and document this in our data protection policy.

☐ We send individuals a link to our privacy statement when we have obtained their personal data indirectly.

☐ We explain how people can access details of the information we used to create their profile.

☐ We tell people who provide us with their personal data how they can object to profiling, including profiling for marketing purposes.

☐ We have procedures for customers to access the personal data input into the profiles so they can review and edit it for accuracy.

☐ We have additional checks in place for our profiling/automated decision-making systems to protect any vulnerable groups (including children).

☐ We only collect the minimum amount of data needed and have a clear retention policy for the profiles we create.

As a model of best practice...

☐ We carry out a DPIA to consider and address the risks before we start any new automated decision-making or profiling.

☐ We tell our customers about the profiling and automated decision-making we carry out, what information we use to create the profiles and where we get this information from.

☐ We use anonymised data in our profiling activities.

Solely automated individual decision-making, including profiling with legal or similarly significant effects (Article 22)

To comply with the GDPR...

☐ We carry out a DPIA to identify the risks to individuals, show how we are going to deal with them and what measures we have in place to meet GDPR requirements.

☐ We carry out processing under Article 22(1) for contractual purposes and we can demonstrate why it’s necessary.

OR

☐ We carry out processing under Article 22(1) because we have the individual’s explicit consent recorded. We can show when and how we obtained consent. We tell individuals how they can withdraw consent and have a simple way for them to do this.

OR

☐ We carry out processing under Article 22(1) because we are authorised or required to do so. This is the most appropriate way to achieve our aims.

☐ We don’t use special category data in our automated decision-making systems unless we have a lawful basis to do so, and we can demonstrate what that basis is. We delete any special category data accidentally created.

☐ We explain that we use automated decision-making processes, including profiling. We explain what information we use, why we use it and what the effects might be.

☐ We have a simple way for people to ask us to reconsider an automated decision.

☐ We have identified staff in our organisation who are authorised to carry out reviews and change decisions.

☐ We regularly check our systems for accuracy and bias and feed any changes back into the design process.
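
The regular accuracy and bias checks above could, for instance, include a simple disparity test on outcomes across groups. The sketch below is illustrative only: the group names, data and the "four-fifths" threshold are common heuristics and assumptions, not requirements set by the GDPR.

```python
# Illustrative bias check: compare automated approval rates across groups.
# The 0.8 ("four-fifths") threshold is a widely used heuristic, not a GDPR rule.

def approval_rates(decisions):
    """decisions: list of (group, approved) pairs -> approval rate per group."""
    totals, approved = {}, {}
    for group, ok in decisions:
        totals[group] = totals.get(group, 0) + 1
        approved[group] = approved.get(group, 0) + int(ok)
    return {g: approved[g] / totals[g] for g in totals}

def disparity_ratio(rates):
    """Lowest approval rate divided by the highest; 1.0 means parity."""
    return min(rates.values()) / max(rates.values())

decisions = [
    ("group_a", True), ("group_a", True), ("group_a", False),
    ("group_b", True), ("group_b", False), ("group_b", False),
]
rates = approval_rates(decisions)
ratio = disparity_ratio(rates)
# Flag the system for human review if disparity exceeds the heuristic threshold.
needs_review = ratio < 0.8
```

A check like this would feed into the design process described above: if `needs_review` is triggered, the model and its input data should be re-examined by staff.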

As a model of best practice...

☐ We use visuals to explain what information we collect/use and why this is relevant to the process.

☐ We have signed up to [standard], a set of ethical principles, to build trust with our customers. This is available on our website and on paper.

In brief

What’s new under the GDPR?

  • Profiling is now specifically defined in the GDPR.
  • Solely automated individual decision-making, including profiling, with legal or similarly significant effects is restricted.
  • There are three grounds for this type of processing that lift the restriction.
  • Where one of these grounds applies, you must introduce additional safeguards to protect data subjects. These work in a similar way to existing rights under the Data Protection Act 1998.
  • The GDPR requires you to give individuals specific information about automated individual decision-making, including profiling.
  • There are additional restrictions on using special category and children’s personal data.

What is automated individual decision-making and profiling?

Automated individual decision-making is a decision made by automated means without any human involvement.

Examples of this include:

  • an online decision to award a loan; and
  • a recruitment aptitude test which uses pre-programmed algorithms and criteria.

Automated individual decision-making does not have to involve profiling, although it often will do.

The GDPR says that profiling is:

“Any form of automated processing of personal data consisting of the use of personal data to evaluate certain personal aspects relating to a natural person, in particular to analyse or predict aspects concerning that natural person’s performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or movements.”

[Article 4(4)]

Organisations obtain personal information about individuals from a variety of different sources. Internet searches, buying habits, lifestyle and behaviour data gathered from mobile phones, social networks, video surveillance systems and the Internet of Things are examples of the types of data organisations might collect.                             

Information is analysed to classify people into different groups or sectors, using algorithms and machine-learning. This analysis identifies links between different behaviours and characteristics to create profiles for individuals. There is more information about algorithms and machine-learning in our paper on big data, artificial intelligence, machine learning and data protection.
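
As a simplified illustration of the grouping described above, the sketch below assigns an individual to the profile segment whose typical behaviour they most resemble. The segments, features and distance measure are invented for illustration; real profiling systems draw on far richer data and full machine-learning models.

```python
import math

# Hypothetical segment "centroids": average (purchases per month, site visits per week).
# These names and numbers are made up for illustration.
SEGMENTS = {
    "frequent_buyer": (10.0, 20.0),
    "browser": (1.0, 15.0),
    "occasional": (2.0, 2.0),
}

def assign_segment(purchases, visits):
    """Return the segment whose centroid is nearest to this individual's behaviour."""
    def dist(centroid):
        return math.hypot(purchases - centroid[0], visits - centroid[1])
    return min(SEGMENTS, key=lambda name: dist(SEGMENTS[name]))

segment = assign_segment(purchases=9, visits=18)
```

Note that even this toy example evaluates "personal aspects" of an individual from their behaviour, so it would count as profiling under Article 4(4).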

Based on the traits of others who appear similar, organisations use profiling to:

  • find something out about individuals’ preferences;
  • predict their behaviour; and/or
  • make decisions about them.

This can be very useful for organisations and individuals in many sectors, including healthcare, education, financial services and marketing.

Automated individual decision-making and profiling can lead to quicker and more consistent decisions. But if they are used irresponsibly there are significant risks for individuals. The GDPR provisions are designed to address these risks.

What does the GDPR say about automated individual decision-making and profiling?

The GDPR restricts you from making solely automated decisions, including those based on profiling, that have a legal or similarly significant effect on individuals.

“The data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her.”

[Article 22(1)]

For something to be solely automated there must be no human involvement in the decision-making process.       

The restriction only covers solely automated individual decision-making that produces legal or similarly significant effects. These types of effect are not defined in the GDPR, but the decision must have a serious negative impact on an individual to be caught by this provision. 

A legal effect is something that adversely affects someone’s legal rights. Similarly significant effects are more difficult to define but would include, for example, automatic refusal of an online credit application, and e-recruiting practices without human intervention.

When can we carry out this type of processing?

Solely automated individual decision-making - including profiling - with legal or similarly significant effects is restricted, although this restriction can be lifted in certain circumstances.

You can only carry out solely automated decision-making with legal or similarly significant effects if the decision is:

  • necessary for entering into or performance of a contract between an organisation and the individual;
  • authorised by law (for example, for the purposes of preventing fraud or tax evasion); or
  • based on the individual’s explicit consent.

If you’re using special category personal data you can only carry out processing described in Article 22(1) if:

  • you have the individual’s explicit consent; or
  • the processing is necessary for reasons of substantial public interest.

What else do we need to consider?

Because this type of processing is considered to be high risk, the GDPR requires you to carry out a Data Protection Impact Assessment (DPIA) to show that you have identified and assessed those risks and how you will address them.

As well as restricting the circumstances in which you can carry out solely automated individual decision-making (as described in Article 22(1)), the GDPR also:

  • requires you to give individuals specific information about the processing;
  • obliges you to take steps to prevent errors, bias and discrimination; and
  • gives individuals rights to challenge and request a review of the decision.

These provisions are designed to increase individuals’ understanding of how you might be using their personal data.

You must:

  • provide meaningful information about the logic involved in the decision-making process, as well as the significance and the envisaged consequences for the individual;
  • use appropriate mathematical or statistical procedures;
  • ensure that individuals can:
    • obtain human intervention;
    • express their point of view; and
    • obtain an explanation of the decision and challenge it;
  • put appropriate technical and organisational measures in place, so that you can correct inaccuracies and minimise the risk of errors;
  • secure personal data in a way that is proportionate to the risk to the interests and rights of the individual, and that prevents discriminatory effects.
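
One way to support these obligations in practice is to record, alongside each automated decision, the inputs used and a plain-language explanation, and to let the individual flag the decision for review by an authorised member of staff. The sketch below is a minimal illustration; the field names and workflow are assumptions, not anything prescribed by the GDPR.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class AutomatedDecision:
    subject_id: str
    outcome: str                     # e.g. "declined"
    inputs: dict                     # personal data used to reach the decision
    explanation: str                 # meaningful information about the logic involved
    review_requested: bool = False
    reviewer: Optional[str] = None   # staff member authorised to change the decision

    def request_review(self):
        """Individual exercises the right to obtain human intervention."""
        self.review_requested = True

    def human_override(self, reviewer, new_outcome):
        """An authorised reviewer reconsiders and may change the outcome."""
        self.reviewer = reviewer
        self.outcome = new_outcome

decision = AutomatedDecision(
    subject_id="S-123",
    outcome="declined",
    inputs={"income": 21000, "existing_credit": 3},
    explanation="Declined because income is below the product threshold.",
)
decision.request_review()
decision.human_override(reviewer="staff-42", new_outcome="approved")
```

Keeping the explanation and inputs with the decision record also makes it easier to answer access requests and to audit the system for the errors and bias mentioned above.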

What if Article 22 doesn’t apply to our processing?

Article 22 applies to solely automated individual decision-making, including profiling, with legal or similarly significant effects.

If your processing does not match this definition then you can continue to carry out profiling and automated decision-making.

But you must still comply with the GDPR principles.

You must identify and record your lawful basis for the processing.

You need to have processes in place so people can exercise their rights.

Individuals have a right to object to profiling in certain circumstances. You must bring details of this right specifically to their attention.


In more detail – European Data Protection Board

The European Data Protection Board (EDPB), which has replaced the Article 29 Working Party (WP29), includes representatives from the data protection authorities of each EU member state. It adopts guidelines for complying with the requirements of the GDPR.

WP29 has adopted guidelines on Automated individual decision-making and Profiling, which have been endorsed by the EDPB.

Other relevant guidelines published by WP29 and endorsed by the EDPB include:

WP29 guidelines on Data Protection Impact Assessment