The ICO exists to empower you through information.

We have shared key questions organisations should ask when procuring AI tools to help with their employee recruitment.

Many recruiters may be looking to procure these tools to improve the efficiency of their hiring process, helping to source potential candidates, summarise CVs and score applicants. If not used lawfully, however, AI tools may negatively impact jobseekers who could be unfairly excluded from roles or have their privacy compromised.  

We recently audited several providers and developers of AI tools for the recruitment industry. The audits uncovered considerable areas for improvement, such as ensuring personal information is processed fairly and kept to a minimum, and clearly explaining to candidates how the AI tool will use their information.

We made almost 300 clear recommendations for providers and developers to improve their compliance with data protection law, all of which have been accepted or partially accepted. Published today, the audit outcomes report summarises the key findings from the audits, as well as practical recommendations for recruiters wishing to use these tools.

Ian Hulme, ICO Director of Assurance, said:

“AI can bring real benefits to the hiring process, but it also introduces new risks that may cause harm to jobseekers if it is not used lawfully and fairly. Organisations considering buying AI tools to help with their recruitment process must ask key data protection questions to providers and seek clear assurances of their compliance with the law.” 

Key questions to ask before procuring an AI tool for recruitment

1) Have you completed a DPIA? 

The procurement process is an important stage where recruiters can understand, address and mitigate any potential privacy risks or harms to people. A Data Protection Impact Assessment (DPIA) helps you to ask the right questions of your provider and should be carried out prior to using an AI tool, ideally at the procurement stage. Your DPIA should be kept up to date as the processing and its impacts evolve. Taking this approach will also help meet your accountability obligations under data protection law.

2) What is your lawful basis for processing personal information? 

If you are processing personal information, you must identify an appropriate lawful basis to rely on, such as consent or legitimate interests. If you plan to process sensitive special category data, such as information about racial or ethnic origin or health, you must also meet a specific condition under the law.

3) Have you documented responsibilities and set clear processing instructions?  

Both recruiters and AI providers have a responsibility for data protection compliance. You must identify which party is the controller and which is the processor of personal information, and this must be recorded clearly in a contract with the provider and scrutinised carefully. If the AI provider is a processor, you must set explicit and comprehensive written instructions for them to follow. You should establish how you will check that the provider is complying with these instructions, and you could also set out performance measures such as statistical accuracy and bias targets.

4) Have you checked the provider has mitigated bias?  

The audits revealed that some AI tools were not processing personal information fairly – for example, by allowing recruiters to filter out candidates with certain protected characteristics. You must ensure that personal information is processed fairly by monitoring the AI tool and its outputs for potential or actual fairness, accuracy or bias issues, and raising any issues with the provider so they can be addressed appropriately. You should also seek clear assurances from the AI provider that they have mitigated bias and ask to see any relevant documentation.
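To make this monitoring concrete, here is a minimal sketch of one check a recruiter could run on a tool's logged outputs: it computes per-group selection rates and flags a large gap between them. The group labels, sample data and the 0.8 "four-fifths" threshold are illustrative assumptions, not ICO requirements; real monitoring should be designed around your own DPIA and the assurances agreed with your provider.

```python
# Illustrative sketch only: checking an AI tool's outputs for large gaps in
# selection rates between groups. Group labels, sample data and the 0.8
# "four-fifths" threshold are assumptions for demonstration, not ICO rules.
from collections import defaultdict

def selection_rates(outcomes):
    """outcomes: iterable of (group, selected) pairs, e.g. ("group_a", True)."""
    counts = defaultdict(lambda: [0, 0])  # group -> [selected, total]
    for group, selected in outcomes:
        counts[group][0] += int(selected)
        counts[group][1] += 1
    return {group: sel / total for group, (sel, total) in counts.items()}

def disparate_impact_ratio(rates):
    """Ratio of the lowest group selection rate to the highest."""
    return min(rates.values()) / max(rates.values())

# Hypothetical scoring outcomes logged from the tool.
outcomes = [("group_a", True), ("group_a", True), ("group_a", False),
            ("group_b", True), ("group_b", False), ("group_b", False)]
rates = selection_rates(outcomes)
ratio = disparate_impact_ratio(rates)
if ratio < 0.8:  # common rule of thumb, used here purely as an illustration
    print(f"Potential bias to raise with the provider: ratio={ratio:.2f}, rates={rates}")
```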

5) Is the AI tool being used transparently? 

You must inform candidates how an AI tool will process their personal information. You should do this with clear privacy information that explains how and why the tool is being used, and the logic involved in making predictions or producing outputs that may affect people. Candidates must also be told how they can challenge any automated decisions made by the tool.

6) How will you limit unnecessary processing? 

The audits revealed that some AI tools collected far more personal information than necessary and retained it indefinitely to build large databases of potential candidates without their knowledge. You must ensure that the tool collects only the minimum personal information required to achieve its purpose, and consider how you will prevent it from being used for other, incompatible purposes.
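As a minimal sketch of what limiting processing could look like in practice, the example below passes only an agreed allowlist of fields to the tool and flags records held beyond a retention period. The field names and the 180-day period are hypothetical assumptions for illustration; the right values depend on your purpose, your DPIA and your contract with the provider.

```python
# Illustrative sketch only: passing an AI tool just the fields it needs and
# flagging records held past a retention cutoff. Field names and the 180-day
# period are hypothetical; agree real values with your provider and DPIA.
from datetime import datetime, timedelta, timezone

ALLOWED_FIELDS = {"skills", "experience_years", "qualifications"}
RETENTION_PERIOD = timedelta(days=180)

def minimise(application: dict) -> dict:
    """Keep only the fields the tool needs for its stated purpose."""
    return {k: v for k, v in application.items() if k in ALLOWED_FIELDS}

def past_retention(collected_at: datetime) -> bool:
    """True if a record has been held longer than the agreed period."""
    return datetime.now(timezone.utc) - collected_at > RETENTION_PERIOD

application = {
    "skills": ["python", "sql"],
    "experience_years": 4,
    "date_of_birth": "1990-01-01",  # not needed to score the application
}
print(minimise(application))  # date_of_birth is stripped before the tool sees it
print(past_retention(datetime(2024, 1, 1, tzinfo=timezone.utc)))
```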

We will be delivering a webinar on Wednesday 22 January 2025 for AI developers and recruiters so they can learn more about the findings and how they can be applied. Registration is open on our website.

For more information, read our AI tools for recruitment outcomes report on our website.

Notes for editors

When the ICO audits an organisation, it assesses whether the organisation is complying with data protection law and provides a report with recommendations on how to improve. Find out more about ICO audits on our website.

  1. The Information Commissioner’s Office (ICO) is the UK’s independent regulator for data protection and information rights law, upholding information rights in the public interest, promoting openness by public bodies and data privacy for individuals.
  2. The ICO has specific responsibilities set out in the Data Protection Act 2018 (DPA2018), the United Kingdom General Data Protection Regulation (UK GDPR), the Freedom of Information Act 2000 (FOIA), Environmental Information Regulations 2004 (EIR), Privacy and Electronic Communications Regulations 2003 (PECR) and a further five acts and regulations.
  3. The ICO can take action to address and change the behaviour of organisations and individuals that collect, use, and keep personal information. This includes criminal prosecution, civil enforcement and audit.
  4. To report a concern to the ICO, call our helpline on 0303 123 1113, or go to ico.org.uk/concerns.