The ICO exists to empower you through information.

You have now answered all the questions in this toolkit. Below, you will find tailored advice based on the responses you provided. You may find it helpful to download a Word copy of this report in which you can make notes on the advice given to help you track your progress and record your decision-making. If you have any problems downloading the report as a Word document, please let us know.

You should not view the completion of this toolkit or completion of all recommended actions as verification of data protection compliance. Many aspects of data protection compliance are specific to the unique circumstances of the processing. As such, this toolkit and the advice provided are intended for use as a guide only.

The report should help you start to think about your data protection obligations but it should not be used as a substitute for consulting your data protection officer (DPO). If you have not consulted your DPO, you can use this report as a guide on what to ask them.

Remember, this toolkit does not cover every factor you need to consider when implementing a data analytics system. You should ensure you consider other aspects, including equality and human rights law. Other tools, such as the ALGO-CARE framework, may assist you in this.

 

11 September 2024

Ensure you have documented clearly why you have identified the processing as ‘necessary’

If you believe that the processing is necessary for a law enforcement purpose, you should ensure that you clearly document your reasoning. This should include a consideration of whether any less intrusive methods of processing could reasonably achieve the same aim, in order to meet the ‘necessity’ test.

The DPA defines law enforcement purposes as ‘the prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, including the safeguarding against and the prevention of threats to public security.’

We have published guidance on fair and lawful processing under Part 3 of the DPA18.

Identify and document your lawful basis for sensitive processing

Processing the categories of data you have selected constitutes ‘sensitive processing’ as defined by the DPA18, and is therefore subject to additional safeguards. In order for sensitive processing to be lawful, you must meet one of the following two conditions:

  • The data subject has consented to the processing; or
  • The processing is strictly necessary for the law enforcement purpose and meets at least one of the conditions listed in Schedule 8 of the DPA18.

‘Strictly necessary’ in this context means that the processing must relate to a pressing social need, and cannot reasonably be achieved through less intrusive means. You will not meet this requirement if you can achieve the purpose for processing through other reasonable means.

In order to undertake sensitive processing, you must have an appropriate policy document in place. This document must explain:

(a) your procedures for ensuring compliance with the law enforcement data protection principles; and

(b) your policies on the retention and erasure of this data.

You must retain this policy from the time you begin sensitive processing until six months after the processing ends, review it regularly and keep it up to date. You must also make it available to the Information Commissioner upon request.

You can find more information about sensitive processing in our Guide to Law Enforcement Processing.

Conduct a DPIA

Part 3 of the DPA18 states that where a type of processing is likely to result in a high risk to the rights and freedoms of individuals, the controller must complete a DPIA prior to the processing beginning.

Processing that is likely to result in a high risk includes (but is not limited to):

  • systematic and extensive processing activities, including profiling, where decisions have legal effects, or similarly significant effects, on individuals;
  • large scale processing of special categories of data (or ‘sensitive data’ when processing for a law enforcement purpose) or personal data relating to criminal convictions or offences;
  • using new technologies (for example surveillance systems).

Therefore, if you are considering using data analytics, then you are required to carry out a DPIA.

A DPIA must contain:

  • at least a general description of your processing operations and the purposes;
  • an assessment of the risks to the rights and freedoms of individuals;
  • the measures envisaged to address those risks;
  • the safeguards, security measures and mechanisms in place to ensure you protect the personal data; and
  • a demonstration of how you are complying with Part 3 of the Act, taking into account the rights and legitimate interests of the data subjects and any other people concerned.

There is no explicit definition of ‘risk’ in the DPA18, but the various provisions on DPIAs make clear that this is about the risks to individuals’ interests. The concept of risk is linked to potential harm or damage to individuals. Examples of risks include processing that may lead to physical, material or non-material damage; in particular, where the processing may give rise to discrimination, identity theft or fraud, financial loss, damage to reputation, loss of confidentiality of personal data protected by professional secrecy, unauthorised reversal of pseudonymisation, or any other significant economic or social disadvantage.

Example

A police force is considering using a data analytics system in order to detect incidences of domestic abuse. The system will analyse police data and produce outputs that predict whether a person is a victim or perpetrator of domestic abuse. This prediction will be used to inform police decisions on intervening in situations where domestic abuse is suspected, and therefore could have a very significant impact on people’s lives.

If inaccurate data is fed into this system, a person could be erroneously marked as a perpetrator of domestic abuse, leading to unnecessary and potentially distressing intervention.

Similarly, erroneous or incomplete data could lead to the system failing to detect and safeguard victims of abuse with potentially life-threatening consequences.

Further risks can arise which are inherent to the use of an analytics system. Where human biases are present in the data used to train the system, the system may extrapolate these, resulting in discrimination against certain groups.

It is therefore clear that the risks to the individuals affected by this processing activity are high, and a DPIA is required.

You must consider the nature, scope and context of the processing when assessing the risk to individuals’ rights and freedoms. In particular, you should consider that risks you have already identified may be heightened in the context of processing children’s personal data. Children will be less aware of their information rights and how they can exercise them. Additionally, some risks arising from the processing might disproportionately impact children.

Example

The same police force is considering using their data analytics system to identify children who are at risk of domestic violence. The risks differ from those outlined in our previous example; where a child is the victim, they are less able to speak out or seek help for themselves. If the data analytics system fails to identify them as possibly requiring intervention, there could be a serious risk of harm to that child.

The opposite is also true. If a child were incorrectly identified as a potential victim, they and their family could be subject to unnecessary intervention causing harm and distress.

Part of the role of the DPO is to provide advice on carrying out DPIAs. If you are unsure whether a DPIA is needed, you should seek input from your DPO.

Completing a DPIA is a good opportunity to evidence your accountability, and is a useful tool in implementing data protection by design and default.

We have published more information about how to conduct a DPIA. Our guidance on AI and data protection contains specific guidance on DPIAs in the context of AI.

You are required to consult the ICO before you can begin processing personal data

If you have carried out a DPIA that identifies a high risk, and you cannot take any measures to reduce this risk, you need to consult the ICO. You cannot go ahead with the processing until you have done so. 

Your focus should be on ‘residual risk’ after any mitigating measures have been taken. If your DPIA identified a high risk, but you have taken measures to reduce this risk so that it is no longer high, you do not need to consult the ICO prior to processing. 

The Information Commissioner will respond within six weeks. This timescale may be increased by a further month, depending on the complexity of the processing you intend to carry out. You can find out how to contact the ICO for prior consultation on our website.

Your DPIA should always be reviewed and signed off by your organisation’s DPO. Any advice they have provided should be clearly documented.

Familiarise yourself with the responsibilities of your role as a controller, processor or joint controller

You should be mindful of the role and responsibilities of a controller, and ensure that you have a sufficient understanding of the technologies to be used in order to be able to exercise genuine control over the processing.

If you are a police force, you should be aware that your Chief Constable is the controller and ultimately has responsibility for ensuring that personal data is processed in a compliant manner.

The importance of understanding controller/processor relationships is set out in our guidance on the AI auditing framework and is relevant to processing taking place in the context of data analytics.

You have confirmed that logging is in place for your data analytics system

You have indicated that logs are in place for at least:

  • Collection
  • Alteration
  • Consultation
  • Disclosure (including transfers)
  • Combination
  • Erasure

These logs must also record the justification for, as well as the date and time of any access to or disclosure of the data. You must also log the identities of the persons who have consulted, disclosed or received the data.
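For illustration only, the Python sketch below shows one way a structured audit log entry could capture these details. The field names, the log_event helper and the file-based storage are assumptions made for this example rather than anything prescribed by the DPA18; in practice, logging will normally be built into the analytics system itself.

    # Illustrative sketch only: field names and storage are assumptions, not DPA18 requirements.
    import json
    from datetime import datetime, timezone

    def log_event(operation, record_id, user_id, justification, recipient=None,
                  logfile="analytics_audit.log"):
        """Append a structured audit entry for a logged operation
        (collection, alteration, consultation, disclosure, combination or erasure)."""
        entry = {
            "timestamp": datetime.now(timezone.utc).isoformat(),  # date and time of the operation
            "operation": operation,          # e.g. "consultation" or "disclosure"
            "record_id": record_id,          # identifier of the data accessed or disclosed
            "user_id": user_id,              # identity of the person consulting or disclosing the data
            "justification": justification,  # why the access or disclosure was needed
            "recipient": recipient,          # identity of the recipient, for disclosures and transfers
        }
        with open(logfile, "a") as f:
            f.write(json.dumps(entry) + "\n")

    # Example: recording a disclosure to a partner agency
    log_event("disclosure", record_id="case-0042", user_id="analyst-17",
              justification="Safeguarding referral under local policy",
              recipient="partner-agency")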

Review your contracts to ensure that they clearly delineate roles and responsibilities, and will enable you to meet your obligations under data protection law

Where any of these ongoing relationships exist, it is crucial that the controller maintains oversight of operations. Data breaches by third-party processors will have a direct impact on controllers. Even where a controller does not appear to be directly responsible for a particular incident, these types of breaches can erode public trust in the data analytics system – particularly in a law enforcement context.

You, not the provider, are responsible for ensuring your data analytics system complies with data protection law. If you procure an off-the-shelf system from a third-party supplier and it does not offer inherent explainability, you may need another model alongside it.

Section 59 of the DPA18 sets out a number of terms that must be included in the contract between a controller and any processor.

Review your data sharing agreement to ensure that it clearly delineates roles and responsibilities and supports you in meeting your data protection obligations

The data sharing agreement must set out each controller’s respective responsibilities (for example, for personal data breaches), and must designate which controller is to be the contact point for data subjects.

You should make clear in the agreement how any data shared will be secured, and how any requests from data subjects to exercise their rights will be considered. You should also consider how accuracy will be maintained.

Any data sharing agreement should be reviewed and signed by your organisation’s DPO.

You are legally required to integrate data protection by design and default into your project

Although data protection law does not dictate how designers should do their job, if you use data analytics to process personal data, you must show how you have integrated data protection into your project. Examples of how you can mitigate risks to information rights include privacy-enhancing technologies, such as pseudonymisation or anonymisation of personal data.
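As an illustration of pseudonymisation, the Python sketch below replaces a direct identifier with a keyed hash. The key handling shown is a simplifying assumption: in practice the key must be generated and stored securely, separately from the dataset, and pseudonymised data still counts as personal data.

    # Minimal pseudonymisation sketch using a keyed hash (HMAC).
    # Assumption: the key is generated securely and held separately from the data.
    import hmac
    import hashlib

    SECRET_KEY = b"replace-with-a-securely-managed-key"

    def pseudonymise(identifier: str) -> str:
        """Replace a direct identifier with a consistent pseudonym."""
        return hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

    record = {"name": "Jane Doe", "postcode": "AB1 2CD", "risk_score": 0.73}
    pseudonymised_record = {
        "subject_pseudonym": pseudonymise(record["name"] + record["postcode"]),
        "risk_score": record["risk_score"],
    }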

Controllers should consider that if the information of living persons is being used, even during development, then personal data is being processed. You should ensure that risks around this use of data are documented and mitigated. You may wish to consider the use of synthetic data in the development of your system.

For more information, read our guidance on data protection by design and default in our Guide to Law Enforcement Processing.

Read our guidance on AI and data protection to better understand design choices, so you can design high-performing systems while still protecting the rights and freedoms of individuals.

Ensure that any competing interests identified are documented and monitored

Competing interests in a data analytics solution should be monitored throughout the project lifecycle to guard against drift and to manage associated risk. You may wish to include this information as part of your DPIA.

Some examples of competing interests and how to manage them are presented in the guidance on the AI auditing framework.

Ensure that risks around the creation of new personal data and any mitigating measures are documented in your DPIA

Your DPIA should identify any risks associated with data creation, and outline how you will meet your data protection responsibilities in respect of new data you have created. Key points to consider include transparency, accuracy and data subjects’ rights.

Consider the following additional advice on individual impact, and document any adverse impact identified

Aside from the direct impact of acting on a system’s recommendation, you should consider other, more subtle impacts. For example:

  • What is the long-term effect of a person becoming a data point in an analytics system?
  • Are they effectively tagged with certain characteristics moving forward?
  • What impact could this have on their life?

You should be clear about the purpose of the processing and its intended outcomes, and you should monitor the deployment of the system for any unintended consequences of the processing or function creep.

Considering the impact of the processing on an individual should form part of your assessment of the necessity and proportionality of processing. You should consider impact on individuals as part of your DPIA.

Ensure that measures are in place to identify, monitor and mitigate the risk of bias and discrimination throughout the project lifecycle

Bias may present itself at any stage of the project lifecycle: from initiation and design, through testing, to deployment, sale or repurposing. Data protection law is intended to balance the right to the protection of personal data with its function in society.

Processing of personal data which leads to discrimination and bias will impact on the fairness of that processing. This poses compliance issues with the fairness principle as well as risks to individuals’ rights and freedoms – including the right to non-discrimination.

From the very beginning of any data analysis lifecycle, you should determine and document your approach to bias and discrimination mitigation. This is so that you identify and put in place the appropriate safeguards and technical measures during the design and build phase. You should satisfy yourself that the data used to train your system is representative of the population you are conducting data analysis on.
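One simple way to check this during design is to compare error rates across relevant groups in your test data. The Python sketch below is illustrative only; the column names, the pandas-based approach and the tolerance value are assumptions, and a fuller fairness assessment will usually involve several metrics.

    # Illustrative bias check: compare error rates across groups in test data.
    import pandas as pd

    def error_rate_by_group(df: pd.DataFrame, group_col: str = "group") -> pd.Series:
        """Return the proportion of incorrect predictions for each group."""
        df = df.assign(error=df["prediction"] != df["actual"])
        return df.groupby(group_col)["error"].mean()

    results = pd.DataFrame({
        "group":      ["A", "A", "B", "B", "B"],
        "prediction": [1, 0, 1, 1, 0],
        "actual":     [1, 0, 0, 1, 1],
    })
    rates = error_rate_by_group(results)
    if rates.max() - rates.min() > 0.05:   # illustrative tolerance only
        print("Error rates differ markedly between groups:")
        print(rates)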

Measuring bias and discrimination is not a one-off exercise. While you may measure on static test data during the design phase, in real life situations your solution will be applied to new and changing populations. Just because a system is not biased or discriminatory initially does not mean it will remain so if the characteristics of the population it is applied to change in future.

This phenomenon is sometimes referred to as ‘concept/model drift’, and various measures exist for detecting it. For instance, you can measure the distance between classification errors over time; increasingly frequent errors may suggest drift. You should regularly assess drift and adjust the algorithm according to new data where necessary.
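As a simple illustration, the Python sketch below compares the live error rate against the error rate measured at design time and flags possible drift. The window size and tolerance are assumptions for the example; dedicated drift-detection methods (such as monitoring the gap between successive errors) can be used instead.

    # Illustrative drift check: compare live error rate with the design-time baseline.
    from collections import deque

    class DriftMonitor:
        def __init__(self, baseline_error_rate: float, window: int = 500, tolerance: float = 0.05):
            self.baseline = baseline_error_rate   # error rate measured on static test data
            self.recent = deque(maxlen=window)    # rolling window of recent prediction outcomes
            self.tolerance = tolerance

        def record(self, prediction, actual) -> bool:
            """Record one live prediction; return True if possible drift is detected."""
            self.recent.append(prediction != actual)
            if len(self.recent) < self.recent.maxlen:
                return False  # not enough live data yet
            live_error_rate = sum(self.recent) / len(self.recent)
            return live_error_rate > self.baseline + self.tolerance

    monitor = DriftMonitor(baseline_error_rate=0.08)
    # In deployment: if monitor.record(prediction, actual) returns True,
    # trigger a review and consider retraining on more recent, representative data.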

For more information about addressing bias, read ‘How should we address risks of bias and discrimination?’ in our guidance on AI and data protection.

Review our guidance on security to help you ensure that you have reasonable assurance of the technical and organisational security measures in place

Compliance with data protection security requirements may be more challenging for data analysis systems than for older, more established technologies. For example, your use of data analytics may rely heavily on third-party code or relationships with suppliers. 

As well as the technical security of your data analytics system, you should consider which organisational security measures you will need to implement, such as system-specific staff training and internal guidance.

Read our guidance on security in the Guide to the GDPR, and the ICO/NCSC Security Outcomes, for general information about security under data protection law. Though our security guidance focuses on general processing rather than law enforcement, the principles remain relevant across both regimes.

Read our guidance on the AI auditing framework for more specific information about security of AI systems under data protection law.

Consider whether further measures can be implemented to reduce the amount of personal data collected, or to enhance privacy

Modern data analytics can uncover previously undetected patterns, which can tempt you to collect more data in order to gain additional insights. However, before you start processing any personal data you must be clear about your purposes for using data analytics and what personal data is necessary for you to achieve those purposes. 

There is a range of techniques for enhancing privacy which you can use to minimise the personal data being processed at the training phase, including the following (a brief sketch of perturbation appears after the list):

  • perturbation or adding ‘noise’;
  • synthetic data; and
  • federated learning.
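As a brief illustration of perturbation, the Python sketch below adds random noise to numeric feature values before training. The noise scale is an assumption for the example; in practice it should be chosen with reference to a formal privacy model (such as differential privacy) and the level of utility you need.

    # Minimal perturbation sketch: add zero-mean Laplace noise to numeric features.
    import numpy as np

    rng = np.random.default_rng(seed=42)

    def perturb(features: np.ndarray, scale: float = 0.1) -> np.ndarray:
        """Return a copy of the features with Laplace noise added to each value."""
        return features + rng.laplace(loc=0.0, scale=scale, size=features.shape)

    training_features = np.array([[34.0, 2.0], [27.0, 5.0], [41.0, 1.0]])
    noisy_features = perturb(training_features)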

Read our guidance on the AI auditing framework for more information. 

You have considered how you can fulfil requests from individuals to exercise their rights

We have recently produced guidance on how individual rights apply to different stages of the AI lifecycle as part of our guidance on AI and data protection.

You have considered how human intervention can be meaningful, but may find the following guidance useful

When using data analytics in a law enforcement setting, human intervention and explainability is very important. Decisions informed by systems using data analytics could have a significant impact on an individual, and may be scrutinised by a court.            

To ensure that human intervention is meaningful, you should:

  • consider the necessary system requirements to support a meaningful human review from the design phase, particularly the interpretability requirements and effective user-interface design to support human reviews and interventions;
  • design and deliver appropriate training and support for human reviewers; and
  • give staff the appropriate authority, incentives and support to address or escalate individuals’ concerns and, if necessary, override the AI system’s decision.

The above list is not exhaustive. You should read our guidance on explaining decisions made with AI on how, and to what extent, complex AI systems might affect your ability to provide meaningful explanations to individuals.
