A blog by Stephen Bonner, Deputy Commissioner – Regulatory Supervision
19 January 2023
So many of our interactions with government, both local and central, involve handing over data about ourselves. This could be as simple as our name or date of birth, or as personal as our financial history or health information.
People should feel confident that this data is handled appropriately, lawfully, and fairly. This should especially be the case when accessing welfare or social support, where an individual may be at their most vulnerable. They should also be confident that none of their personal data is being used to discriminate against them, either consciously or unconsciously.
When concerns were raised about the use of algorithms in decision-making around benefit entitlement and in the welfare system more broadly, we conducted an inquiry to understand the development, purpose and functions of algorithms and similar systems being used by local authorities. We wanted to make sure people could feel confident in how their data was being handled.
As part of this inquiry, we consulted a range of technical suppliers, a representative sample of local authorities across the country and the Department for Work and Pensions. Eleven local authorities were identified through a risk assessment process designed to give a representative sample, covering a range of geographical locations and including those with the largest benefits workloads. This inquiry has greatly increased our understanding of the development, practical application and use of this technology in the sector, and the findings will feed into the ICO's wider work in this area.
In this instance, we found no evidence to suggest that claimants suffer any harm or financial detriment as a result of the use of algorithms or similar technologies in the welfare and social care sector. It is our understanding that there is meaningful human involvement before any final decision is made on benefit entitlement. Many of the providers we spoke with confirmed that the processing is carried out not with AI or machine learning but with what they describe as a simple algorithm, used to reduce administrative workload rather than to make any decisions of consequence.
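To make that distinction concrete, the sketch below shows the kind of rule-based check the suppliers describe: a handful of fixed rules that order a caseworker's queue, with every final decision left to a human. It is a minimal illustration only; the rules, field names and threshold are hypothetical assumptions, not details taken from the inquiry or from any supplier's system.

```python
# Hypothetical sketch of a "simple algorithm" of the kind described above:
# fixed rules flag a claim for human review, and a caseworker makes every
# final decision. Illustrative only; not any supplier's actual system.

from dataclasses import dataclass


@dataclass
class Claim:
    claimant_id: str
    declared_income: float    # monthly income declared by the claimant
    reported_income: float    # monthly income reported by the employer
    documents_complete: bool


def flag_for_review(claim: Claim, tolerance: float = 50.0) -> list[str]:
    """Return reasons a caseworker should look at this claim first.

    The function never grants or refuses a benefit; it only reduces
    administrative workload by prioritising the caseworker's queue.
    """
    reasons = []
    if not claim.documents_complete:
        reasons.append("missing documents")
    if abs(claim.declared_income - claim.reported_income) > tolerance:
        reasons.append("declared and reported income differ")
    return reasons


claim = Claim("C-1042", declared_income=1200.0,
              reported_income=1425.0, documents_complete=True)
reasons = flag_for_review(claim)
if reasons:
    print(f"Route {claim.claimant_id} to a caseworker: {', '.join(reasons)}")
else:
    print(f"{claim.claimant_id}: routine processing, still human-checked")
```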
It is not the role of the ICO to endorse or ban a technology, but as the use of AI in everyday life increases we have an opportunity to ensure it does not expand without due regard for data protection, fairness and the rights of individuals.
While we did not find evidence of discrimination or unlawful use in this case, we understand that these concerns exist. To alleviate concerns about the fairness of these technologies, and to remain compliant with data protection legislation, there are a number of practical steps that local authorities and central government can take when using algorithms or AI.
- Take a data protection by design and default approach
As data controllers, local authorities are responsible for ensuring that their processing complies with the UK GDPR. That means having a clear understanding of what personal data is being held and why it is needed, how long it is kept for, and erasing it when it is no longer required (see the sketch after this list). Data processed using algorithms, data analytics or similar systems should be reviewed both proactively and reactively to ensure it is accurate and up to date. This includes any processing carried out on the authority's behalf by another organisation or company. If a local authority decides to engage a third party to process personal data using algorithms, data analytics or AI, it is responsible for assessing that the third party is competent to process personal data in line with the UK GDPR.
- Be transparent with people about how you are using their data
Local authorities should regularly review their privacy policies and identify areas for improvement. There are some types of information that organisations must always provide, while the provision of other types depends on the particular circumstances of the organisation and on how and why people's personal data is used. Local authorities should also bring any new uses of personal data to the attention of the people affected.
- Identify the potential risks to people's privacy
Local authorities should consider conducting a Data Protection Impact Assessment (DPIA) to help identify and minimise the data protection risks of using algorithms, AI or data analytics. A DPIA should consider compliance risks, but also broader risks to the rights and freedoms of people, including the potential for any significant social or economic disadvantage. Our DPIA checklist can help when carrying out this screening exercise.
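As an illustration of the "data protection by design" point above, the sketch below checks records against a documented retention schedule so that data can be erased when it is no longer required. It is a minimal sketch under stated assumptions: the record structure, categories and retention periods are hypothetical, and the UK GDPR itself sets no fixed retention periods.

```python
# A minimal sketch of a retention check, assuming a hypothetical record
# store. The field names and retention periods are illustrative
# assumptions, not guidance from this blog or from the UK GDPR.

from datetime import date, timedelta

# Assumed local retention policy: category -> maximum holding period.
RETENTION = {
    "benefit_claim": timedelta(days=6 * 365),
    "contact_details": timedelta(days=2 * 365),
}


def records_due_for_erasure(records: list[dict], today: date) -> list[dict]:
    """Return records held longer than their documented retention period."""
    due = []
    for record in records:
        limit = RETENTION.get(record["category"])
        if limit and today - record["collected_on"] > limit:
            due.append(record)
    return due


records = [
    {"id": "R1", "category": "benefit_claim", "collected_on": date(2015, 3, 1)},
    {"id": "R2", "category": "contact_details", "collected_on": date(2024, 6, 1)},
]
for record in records_due_for_erasure(records, date.today()):
    print(f"{record['id']}: past retention period, schedule erasure")
```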
The potential benefits of AI are plain to see. It can streamline processes, reduce costs, improve services and free up staff capacity. Yet the economic and societal benefits of these innovations are only possible if the trust of the public is maintained. It is important that where local authorities use AI, it is employed in a way that is fair, in accordance with the law, and repays the trust that the public place in them when they hand over their data.
We will continue to work with and support the public sector to ensure that its use of AI is lawful, and that a fair balance is struck between organisations' own purposes and the interests and rights of the public.
Stephen Bonner – Executive Director (Regulatory Futures and Innovation). Stephen joined the ICO to lead our work developing our capacity and capability to regulate new and emerging technologies and innovations. He leads programmes of work to develop strategic ICO positions, based on horizon scanning and research, on technology issues such as data, supervision of the large technology platforms now in our remit, online harms, the Digital Markets Unit and delivery of the DRCF (Digital Regulation Cooperation Forum) workplan. He is also leading on the implementation of the Children's code.