The ICO exists to empower you through information.

This section is aimed at senior management and those in compliance-focused roles, including DPOs, who are accountable for the governance and data protection risk management of an AI system. You may require a technical specialist to explain some of the details covered in this section. 

Control measure: There has been full consideration of the controller, processor or joint controller relationship throughout the supply chain in the use of AI systems. The decision reached on the controller and processor relationship across all proposed processing activities is documented.

Risk: If no decision on the controller and processor relationship has been made, it is likely that all parties will fail to meet their obligations under the UK GDPR.

Ways to meet our expectations:

  • Identify the distinct sets of processing operations and their purposes in order to understand the relationship.
  • Consider the whole supply chain within the assessment.
  • Evidence the consideration of the relationship between all parties (eg in emails, meeting minutes, model design or specification documents).
  • Include a requirement within DPIA templates to assess the relationship.
  • Ensure considerations and conclusions are in line with ICO and sectoral or EU guidance on the role of controllers and processors.
  • Formally document and agree the relationship within contracts or agreements.
  • Communicate the relationship in privacy information.

Options to consider:

  • Use information flow mapping exercises to help assess the distinct sets of processing operations and the decision makers in each activity.
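An information flow mapping exercise can be recorded in a simple structured form that captures, for each processing operation, who decides the purposes and who decides the means. The sketch below is purely illustrative — the parties, operations and role heuristic are hypothetical assumptions, not ICO-defined rules, and any real assessment must follow the guidance referenced above:

```python
from dataclasses import dataclass

@dataclass
class ProcessingOperation:
    name: str             # e.g. "model training"
    purpose: str          # why the personal information is processed
    decides_purpose: str  # party that determines the purposes
    decides_means: str    # party that determines the means

def likely_role(op: ProcessingOperation, party: str) -> str:
    """Rough first-pass heuristic only: a party that determines the
    purposes of processing acts as a controller; a party acting solely
    on another's documented instructions acts as a processor."""
    if op.decides_purpose == party:
        return "controller"
    if op.decides_means == party:
        return "controller (means) - assess jointly with legal advice"
    return "processor (acts on instructions)"

# Hypothetical supply chain: a bank deploying a vendor's AI system.
ops = [
    ProcessingOperation("model training", "improve prediction accuracy",
                        decides_purpose="Vendor", decides_means="Vendor"),
    ProcessingOperation("inference on customer data", "credit decisions",
                        decides_purpose="Bank", decides_means="Vendor"),
]

for op in ops:
    print(f"{op.name}: Bank={likely_role(op, 'Bank')}, "
          f"Vendor={likely_role(op, 'Vendor')}")
```

A record like this can then be attached to the DPIA as evidence that the relationship was considered for each distinct set of processing operations.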

 

Control measure: Before procuring AI systems, datasets or coding, appropriate due diligence has been undertaken on accuracy, bias and the trade-offs considered in the design.

Risk: If due diligence is not undertaken, there will be no assurance on the system’s ability to meet data protection requirements or the information’s accuracy and source.

Ways to meet our expectations:

  • Decide the acceptable level of accuracy before procurement and complete due diligence to understand and confirm this.
  • Seek guarantees from the provider of the system or dataset on the source of the information, coding or models used to build the AI, to help you determine its accuracy.
  • Review the use of the AI system or information against existing systems, products or services to ensure that it does not reduce the accuracy of outputs in deployment.
  • Include accuracy-based key performance indicators (KPIs) or SLAs in written contracts with third-party suppliers.
  • Complete due diligence to understand the level of bias and discrimination that can be expected from the AI system and do not procure the services or datasets if bias or discrimination cannot be mitigated.
  • Choose AI models that offer transparency and explainability. This allows all parties to understand how the model makes decisions, making it easier to identify and address potential biases.
  • Conduct an independent evaluation of any trade-offs as part of the due diligence process.
  • Determine a reasonable timeline to re-evaluate the accuracy of the system and ask the third party to retrain the AI models if required (ie AI systems may need updating with new information to maintain their accuracy, especially when trends in the information are changing).
  • Engage independent third-party organisations or experts to conduct audits on your AI system. 
  • Request comprehensive documentation from the model developer, including information on the training process, feature selection, hyperparameter tuning, and any constraints imposed on the model. 
  • Obtain evidence of fairness assessments conducted during the model development. This may include fairness metrics, demographic parity analysis, and other techniques used to evaluate and address biases.
  • Evaluate reports or documentation on model explainability. Understand how the model's decisions can be interpreted and how the system addresses the trade-off between complexity and interpretability. You should receive clear explanations about complex or black box models from the provider.
  • Ask for evidence of user testing and feedback to understand how different user groups experienced the system, and whether trade-offs had a disparate impact on certain user categories. 
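One fairness metric mentioned above, demographic parity, compares the rate of positive outcomes across demographic groups; a large gap between groups can signal bias that needs investigating. The sketch below is a minimal illustration with made-up predictions and group labels, not a complete fairness assessment:

```python
from collections import defaultdict

def demographic_parity_gap(predictions, groups):
    """Return the gap between the highest and lowest positive-outcome
    rates across groups, plus the per-group rates themselves."""
    totals = defaultdict(int)
    positives = defaultdict(int)
    for pred, group in zip(predictions, groups):
        totals[group] += 1
        positives[group] += int(pred)
    rates = {g: positives[g] / totals[g] for g in totals}
    return max(rates.values()) - min(rates.values()), rates

# Hypothetical model outputs: 1 = favourable outcome, 0 = unfavourable.
gap, rates = demographic_parity_gap(
    predictions=[1, 1, 0, 1, 0, 0, 1, 0],
    groups=["A", "A", "A", "A", "B", "B", "B", "B"],
)
# Group A receives favourable outcomes at 0.75, group B at 0.25,
# a gap of 0.5 that would warrant further investigation.
```

Evidence of this kind of analysis, alongside other fairness metrics, is the sort of documentation you may wish to request from a model developer.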

Options to consider:

  • Regularly review any outsourced services and modify them or switch to another provider, if their use is no longer compliant in your circumstances.
  • Assess any new risks and compliance considerations that may arise during the full course of the deployment.
  • Provide relevant documentation to support the due diligence process, such as a privacy policy, record management policy and information security policy.
  • Request that the model developer provides algorithmic impact assessment reports that detail the decisions made during the system's design, including trade-offs between different objectives (eg accuracy vs fairness). These reports should include an evaluation of potential biases and their impact.
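An accuracy KPI agreed before procurement can be checked routinely against the deployed system's outputs. The sketch below is illustrative only — the labels, predictions and 0.9 threshold are hypothetical assumptions standing in for whatever level of accuracy you decided was acceptable:

```python
def check_accuracy_kpi(y_true, y_pred, agreed_threshold):
    """Compare observed accuracy against the contractually agreed KPI.
    Returns the measured accuracy and whether the KPI is met."""
    correct = sum(int(t == p) for t, p in zip(y_true, y_pred))
    accuracy = correct / len(y_true)
    return accuracy, accuracy >= agreed_threshold

# Hypothetical sample of ground-truth labels vs system predictions.
accuracy, meets_kpi = check_accuracy_kpi(
    y_true=[1, 0, 1, 1, 0, 1, 0, 0, 1, 1],
    y_pred=[1, 0, 1, 0, 0, 1, 0, 1, 1, 1],
    agreed_threshold=0.9,
)
# 8 of 10 predictions correct: accuracy 0.8, below a 0.9 KPI,
# which would trigger the remediation steps agreed in the contract.
```

A check like this can feed the re-evaluation timeline and retraining requests described above.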

 

Control measure: There are written contracts in place with third parties that clearly identify the controller and processor roles and responsibilities of each party, and include details of information processing.

Risk: Without appropriate contracts in place, there is a risk that breaches of controller and processor requirements cannot be assessed or attributed. There may be a lack of understanding of how personal information is being processed by third parties. With only verbal agreements, there is a lack of recourse if there is a breach of UK GDPR requirements. This may breach UK GDPR articles 28 and 5(2).

Ways to meet our expectations:

  • Identify the distinct sets of processing operations and their purposes in order to understand the relationship.
  • Ensure written contracts clearly identify the controller(s) and processor(s) relationships and their specific responsibilities, including who decides the purposes and means of the processing in practice.
  • Include clear, specific, and detailed written instructions within contracts that state that the processor must only act on the controller’s documented instructions, unless required by law to act without them.
  • Obtain senior management approval and ensure contracts are signed and dated by both parties.
  • Ensure there is written authorisation and a contract in place with all sub-processors.
  • Draft contracts to include comprehensive details of the processing:
    • the subject matter of the processing;
    • the duration of the processing;
    • the nature and purpose of the processing;
    • the type of personal information involved;
    • the categories of people;
    • the controller’s obligations and rights;
    • the decision-making boundaries of each party; and
    • the agreed technical and organisational controls and settings for the AI system.
  • Include terms or clauses stating that:
    • the processor must only act on the controller’s documented instructions, unless required by law to act without them;
    • the processor must ensure that people processing the information are subject to a duty of confidence;
    • the processor must only engage a sub-processor with the controller’s prior authorisation and under a written contract; and
    • the processor must take appropriate measures to help the controller respond to requests from people to exercise their rights.
  • Include the technical and organisational security measures the processor will adopt, including: 
    • encryption; 
    • minimisation or pseudonymisation; 
    • resilience of processing systems; and 
    • backing up personal information in order to be able to reinstate the system.
  • Include clauses to instruct processors to delete or return all personal information at the end of the contract, unless the law requires its storage.
  • Include clauses to ensure that the processor assists in meeting UK GDPR obligations for the security of processing, the notification of personal data breaches and DPIAs.

Options to consider:

  • Create standard template contracts that include all the relevant clauses under UK GDPR.

 

Control measure: There are in-life contract monitoring or one-off arrangement reviews to ensure partners abide by agreements.

Risk: If agreed roles and responsibilities between controllers, processors and joint controllers are not being undertaken in practice, there is a risk that documented agreements, terms and conditions or contracts are in breach and there is a lack of control over who does what in the management of the AI system. This may breach UK GDPR articles 5(2), and 28(1) and (3).

Ways to meet our expectations:

  • Have contracts that are timebound and reviewed periodically, and conduct routine in-life compliance checks to ensure they remain up-to-date.
  • Document a process for managing the ongoing relationship between all parties.
  • Review the practical day-to-day management of the AI system to provide assurance that the agreed roles and responsibilities are fulfilled and there are no discrepancies or role creep.
  • If role or responsibility creep has occurred, undertake an assessment of existing agreements and implement changes to contracts, as appropriate.
  • Include clauses within contracts to allow audits or checks to confirm all parties are complying with all contract terms and conditions.

Options to consider:

  • Maintain all contracts in a central log or system so you can readily monitor and review them.
  • Review any contracts put in place before the current data protection regime to update the relevant details, terms and clauses now required.
  • Have an appropriate approval process for contracts.
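A central contract log can flag agreements whose periodic review date has passed. The sketch below is a hypothetical illustration — the parties, roles and dates are invented, and a real register would sit in whatever contract management system you already use:

```python
from datetime import date

# Illustrative central contract log; all entries are made-up examples.
contracts = [
    {"party": "Vendor A", "role": "processor",
     "review_due": date(2024, 1, 1)},
    {"party": "Vendor B", "role": "joint controller",
     "review_due": date(2030, 6, 1)},
]

def due_for_review(log, today):
    """Return the contracts whose periodic review date has passed,
    so in-life compliance checks can be scheduled."""
    return [c for c in log if c["review_due"] <= today]

overdue = due_for_review(contracts, today=date(2025, 1, 1))
# Only Vendor A's contract is overdue for review on this date.
```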