The UK’s independent authority set up to uphold information rights in the public interest, promoting openness by public bodies and data privacy for individuals.

Elizabeth Denham’s speech at the Oxford Internet Institute on 3 March 2021.

Original script may differ from delivered version.

As you have heard, I was due to deliver this lecture in person, twelve months ago.

March 2020 felt like a perfect time to talk with you about my office’s work. It felt like data privacy had become mainstream, and that my work was more relevant than ever.

Little did I know what the next year would bring.

Contact tracing apps. Health data stats headlining the news. Temperature checking at airports. Businesses recording the details of every customer. Employers wanting to know the health status of employees. Immunity passports and certificates.

My office has been at the centre of so many of the key issues that have had a very real impact on individuals over the past year.

And that’s before you factor in all the business as usual work, which became even more important alongside an unprecedented acceleration in people using data-driven services.

It feels like March 2021 is the perfect time to talk with you about my office’s work!

I couldn’t possibly cover all that work in a short speech, so today I want to talk to you about just a handful of cases, which I have chosen because they illustrate two things:

  • Firstly, they illustrate the challenges of modern data protection. The knotty questions and sometimes subjective judgments that privacy regulation can entail.
  • And secondly, these cases illustrate the value of data protection. Not as a tick box exercise, not as legal compliance, but as a means of securing public and consumer trust to maximise the benefits of innovation.

Before I start, a brief introduction to my role.

The ICO regulates data protection, as well as freedom of information law, and a handful of other statutes around information rights.

That’s the dry introduction, but the more interesting detail is in how we do it, and how we encourage trust in the way people’s personal information is used to support innovation, economic growth and societal benefits.

For instance, last year our helpline and written advice services handled nearly half a million enquiries.

That’s a LOT of people who were aware of their information rights and who cared enough about those rights to speak to the regulator.

And a substantial number of businesses who were aware of the rights their customers have, and who cared enough about supporting those rights to ask for our help.

While my work often gets headlines for enforcement action, like the £20m fine we handed to British Airways, most of my resources go into supporting organisations to comply with the law. That’s not just the guidance we produce, but also projects like our Sandbox, where we sit down with organisations looking at innovative uses of technology, and work on those knotty questions side-by-side.

Essentially, my role is around supporting trust in innovative data use.

Data protection law was born in the 1970s out of a concern that the potential of emerging technology would be lost without public trust in how data is used. That concern is just as relevant today: trust remains central to successful innovation.

The past year has been a prime example of the importance of trust.

Take the NHSX contact tracing app. My view is that the ICO was invited to be part of conversations around this from an early stage, not just because NHSX had a burning desire to be legally compliant, but because NHSX appreciated the importance of building an app that respected people’s data and kept their trust.

A lack of trust in the app would have meant a lack of engagement with the app. And the benefits the service offered society would have been lost.

As an interesting aside, I find it fascinating to see how society’s views on privacy shift and adjust to reflect the broader context. Who listening today, if asked last February, would have said they would download an app, operated by the state, that tracked who they had been in proximity with?

But faced with the challenges of a global pandemic – and given the reassurances around privacy – the app was downloaded more than 21 million times.

The project worked because it had people’s trust.

Trust will be central as the government turns to the next big challenge: will people engage with vaccination passports? This is work the ICO is again engaged in, both around the potential for a domestic scheme that could be used by employers, and around a wider scheme for international travel.

Privacy questions must be considered at the earliest stage of development, but that does not mean there are easy answers.

What does a fair scheme look like? How much personal data is an individual being asked to share if, for instance, they have not had a vaccine because they are pregnant, or because they are taking medication for an existing condition? And what prevents a scheme aimed at, say, those working in care homes, being adopted by someone running a restaurant?

These are data protection questions, but they are also societal questions. And societal questions, by their very nature, cannot be answered by government, or scientists, or regulators, in isolation. The only way to find answers – the only way to reach a solution that can encourage public trust – is to engage the public. To speak to the individuals whose data is being used. Consultation feels central here.

Consultation was certainly central in the ICO’s work in a different area across the last two years: the use of children’s personal information by online services.

This has been a crucial area of focus for my office. And a complex area too.

Let’s start with a quick question – if we were in the same room, I’d ask for a quick show of hands. How old should children be before they can use social media?

US law around children’s personal information has the effect of setting that age at 13.

Does that seem sensible?

But then the question gets more complex. What if someone’s news feed is increasingly pushing them in the direction of content about self-harm or extreme dieting? Does that make us want to move the age up?

How is that minimum age affected by apps that track the user’s location? Or use nudge techniques to encourage people to log on more, or to share more information about themselves?

It’s a wickedly complex problem. The internet was not designed for children, but we know the benefits of children going online. We have protections and rules for children in the offline world – but they haven’t been translated to the online world.

In the UK, our answer in part is the Age Appropriate Design Code. This code has been a big part of my office’s work over the past two years, and it is built on consultation with individuals: mothers and fathers and carers, teachers, developers, tech leaders, and of course children themselves.

The result is a code that requires businesses selling into the UK to take account of standards around issues like behavioural advertising, geolocation and high privacy settings by default.

This week marks six months until the Code is fully in place, and there is plenty still to be done, both by my office in supporting organisations, and in businesses stepping up to make the necessary changes.

But that work will be worthwhile. The Code is an important piece of work in protecting children. In the coming decade, I believe children’s codes will be adopted by a great number of jurisdictions and we will look back and find it astonishing that there was ever a time that children did not have these mandated protections.

There is a more fundamental point here too: if we have a generation who grow up seeing digital services misuse their personal data, what does that do to their trust in innovation in the future?

Trust in innovation is crucial. I’ve seen first-hand over the past couple of years the loss of innovation when trust is lost.

Early in 2020, I was in San Francisco to meet with Silicon Valley tech leaders. The city is synonymous with innovation, and yet your own city likely has a technology that San Francisco doesn’t: police forces using facial recognition cameras.

Facial recognition technology (FRT) allows police forces to overlay CCTV with software that can identify people from the shape of their brow or nose, and check whether they’re wanted for a crime.

Police forces in the UK have been quick to explore and trial facial recognition. It gives them the ability to identify a dangerous criminal in a crowd of thousands.

That’s great, right? We all want criminals caught. But how much are you willing to give up for that increased safety? Because the technology also captures the images of thousands of other people going about their lawful daily business.

This is a societal step change, that asks questions not only about data protection, but about society’s relationship with the police, and how far we’re willing to trust technological innovation.

In the UK, courts have declared police use can be lawful and reasonable. My office has also given our formal view. We have been clear that FRT processes sensitive personal data, and so the law places a high bar for its usage.

But in San Francisco, the city decided the best approach was a blanket ban on police use of the technology. People did not trust FRT, and so any benefits it might have offered were lost.

That is an instructive example.

The UK economy, in common with much of the world, enters 2021 in recovery mode. Our experience of the past twelve months suggests that digital and data-driven innovation will be central to that recovery.

But we can only unlock the full benefits of innovation if innovation benefits from the full support of the public.

From FRT to NHS contact tracing apps, I’ve seen that demonstrating good data protection compliance – like transparency, like fair use of data – is crucial in building and maintaining public trust that maximises the ongoing impact of innovation.

If the UK economy is to make the most of this moment of opportunity, then we need to think about how we ensure as many people as possible understand how their data contributes in a positive way.

We need to be building in privacy protections from the start. We need to be taking the time to explain to people how their data is being used, even when that is difficult. We need to be asking ourselves not ‘what can we do with people’s data?’, but ‘what is fair?’.

We need to be thinking about individuals.

I hope that has given you a good overview of the challenges and value of modern data protection regulation, and I’d be happy to take questions.