Abigail Hackston, ICO Senior Policy Officer for Innovation, looks back on the ICO’s work with organisations that use personal data and artificial intelligence (AI) to support, or to make decisions about, individuals.

30 July 2021

It is estimated that AI has the potential to help UK GDP grow by between 10 and 22 per cent by 2030. But with AI so often relying on people’s personal data to fuel innovation, those benefits are only possible if people trust that their information will be used fairly.

To help organisations achieve this, the ICO worked with The Alan Turing Institute to produce our ‘Explaining decisions made with AI’ guidance, published in May 2020. This aimed to “develop a framework for explaining processes, services and decisions delivered by AI, to improve transparency and accountability.”

The guidance was produced as a best practice document to help organisations of various sizes from different sectors.

We were clear from the outset that we wanted to make this guidance as practical and useful as possible. This reflects a key ICO aim to enable innovation and economic growth in the AI sector. We can only do this if the guidance we produce is practically applicable to real life situations.

To understand the impact that the guidance has had during the last year, we consulted 56 organisations that make decisions about their customers using personal data and AI. This group included SMEs, public sector organisations and established technology organisations.

We asked these groups to tell us what worked well, what could be improved, and whether they had any further comments on the guidance.

The feedback was positive, and we are pleased that participants found the guidance useful and of high quality. We heard that the guidance provides a good foundation for improving awareness and understanding of the need for explanations relating to AI systems, and how to construct those explanations.

Respondents also said the guidance clearly defined the key elements needed to build explainable AI systems, and that where further detail was provided, it was easy to understand.

One area for improvement identified by the consultation was the length of the guidance. To address this point and ensure the key parts of the guidance are quickly accessible, we have published the “at a glance” sections separately alongside the guidance as a summary document. This pulls the fundamental elements of the guidance into one place and makes them easier to find quickly.

Concerns were raised that some parts of the guidance were not particularly useful for SMEs. Respondents felt that the level of detail given was more suited to larger organisations, and that additional support may be needed for organisations that do not have the technical capability to build and support an AI system in-house. If you run an SME that processes personal data using AI, remember that you can get additional support from the ICO’s SME web hub.

We will add some case studies to the guidance, so that organisations can reference some practical examples of good practice in action. Our consultees indicated that this would be a valuable addition. If you feel you have an example of a good case study we might use, we would love to hear from you. Please get in contact with us via AI@ico.org.uk.

We are pleased that the feedback we received indicates the guidance is having a material impact on the organisations that participated in this consultation. These comments give the ICO confidence that the guidance will be an effective tool for those who choose to use it.

The ICO will take the research findings into account in our future work on AI and in our production of further guidance on this and other related topics.

Abigail Hackston is a Senior Policy Officer for Innovation at the ICO.