John Edwards speaks at IAPP's Data Protection Intensive UK 2025

Type: Speech

The Information Commissioner's keynote speech looks at how data protection can affect people's lives. 

Check against delivery.

Hello and thank you for having me at IAPP London. I’ve seen that Sir Chris Bryant is also speaking here today. I was grateful to Chris for speaking at our much smaller event last November – what came through to me there, as it does now, is his understanding of how privacy and data protection affect people. The positives and the harms.

I know this audience, which I’ve spoken to three times now, also understands that privacy and data protection are about the real-world impact of the way we care for and protect the personal information that is entrusted to us. You understand how data protection can affect people’s lives.

This understanding underpins my approach to the role of Information Commissioner. 

Since I arrived in the UK and took up this role in 2022, my aim has been to regulate for outcomes, not outputs. I’m interested in real impact, real change, rather than products: things that might be easy to count, but that ultimately don’t improve the lives or experiences of people in the UK.

This philosophy informed arguably one of our most influential pieces of work last year – the ripple effect. We looked into how data breaches, which may seem like a small admin mistake to those in control, can have far-reaching consequences for people in vulnerable situations.

Imagine a person fleeing a violent domestic relationship, only to have their new address accidentally shared with their abuser.

Or someone who has their HIV status and confidential medical information disclosed without their consent.

These are real cases. They happen. And we wanted to change the conversation around these mistakes being seen as “admin errors”, to help organisations see that when they mishandle personal information, it can have a ripple effect of damage and distress far beyond their understanding.

We spoke directly to people affected by these data breaches: 55% had had their data stolen or lost, and almost 70% had experienced negative impacts as a result.

We created resources – flyers, email signatures, social media content – for organisations to raise awareness of the importance of ensuring people’s information was protected. And I went to the Global Privacy Assembly with my fellow DPAs and regulators, and gave a presentation on this campaign to showcase our work.

As data protection professionals, you don’t need me to tell you how important it is to keep people’s personal information safe. It’s in the law, it’s in your job descriptions. But sometimes it’s helpful to remember that there’s a person behind all this. Data protection is not just about computers, numbers and legislation – it's about people.

And that message, that key philosophy, underpins all of the work my office does.

Because data protection is not just one person’s job. It’s not just the job of data protection officers. Your role is integral, of course, and I’m grateful to you for coming to this conference to hear from experts in the field and improve your practices.

The privacy legal community and their advisers have a role to play here. The consumer can also play a role, but there is a limit. It’s the responsibility of the whole organisation, from the C-suite down, to keep things simple and accessible for their users.

And of course my office plays a role too.

The public should not be expected to have to read reams of legalese in a privacy notice to understand what’s happening to their information. 

They should not be kept in the dark about where their information’s going and what it’s being used for. 

And they should not be coerced or forced into handing over their information without being fully informed about the consequences.

Last week we announced the next phase of our children’s privacy work, along with our ongoing investigations into TikTok, Reddit and Imgur.

Our role is to ask questions on behalf of young people, so they don’t have to make complex decisions about how they transact with their personal information in an asymmetric information relationship. We must act on behalf of, and as a proxy for, the parents and guardians of young people, who themselves have only a limited ability to make informed choices on their children’s behalf.

How do these platforms protect children’s personal information? How do their recommender systems work? How are they using a 13-year-old’s preferences, viewing habits, shares and likes to serve them content and keep them on the platform? And, in the case of Reddit and Imgur, how do they assess the age of their users and tailor their content accordingly?

In our new research, we found almost half of British parents felt they had little to no control over the information social media and video sharing platforms are collecting about their children – and the same number again feel unable to explain it to their kids. 

Children shouldn’t have to figure this out by themselves. It’s not their job. The platforms they use should make this information easily available and accessible for them.

So we’ve stepped in, because it’s our job as the UK’s data protection regulator to hold these platforms to account. If social media and video sharing platforms want to benefit from operating in the UK, they must comply with data protection law.

You may be sitting there thinking: so what? The ICO are investigating three of the biggest social media and video sharing platforms, but smaller companies will still be able to operate with impunity.

But this would be a mistake. 

I want to make it very clear. By focusing our efforts on some of the largest, most well known platforms, we are not giving smaller companies a free pass to adopt or continue unlawful practices. Instead, last week’s announcement should serve as a warning shot. All organisations using children’s data, or who offer products or services aimed at young people, need to comply with the law and conform with our Children’s code.

We want organisations to see our investigations and take them as a sign to get their own house in order. You shouldn’t wait for the regulator to come knocking on your door before checking your processes.

Before opening up for questions from the floor, here are a few other things coming across my desk at the moment.

Our focus on AI and biometrics continues. We’re working to make it easier for you to unlock the potential of AI to transform your business, while still protecting the rights of your users or customers. I’ve asked my team to look at foundation models, at automated decision-making in recruitment and by government, and at how police forces use facial recognition technology. We’ll be closely scrutinising any proposed deployment of predictive profiling that could affect people’s rights. Keep an eye out for our upcoming thoughts on these.

We’re also looking ahead at the technologies and innovations that are likely to burst onto the scene in the next two to seven years. By setting out our stall early, highlighting and mitigating some of the potential data protection considerations before the technologies are rolled out, we can provide timely regulatory advice and support to innovators.

We don’t have a crystal ball, but it’s worth reading our report to consider whether you’ll be working with any of these emerging technologies.

It’s our job to help organisations choose the UK to develop their technologies, to view it as an innovation-friendly, privacy-preserving regulatory landscape.

We’ve also made strides in our online tracking work. My office set out a clear vision for what we want to see: a fair and transparent online world, where people are given meaningful control over how they are tracked online. People need to trust the digital services they use and make informed choices about who they share their information with, and how it’s used.

A bit of audience participation to end. Hands up ...

How many agree that people should be able to make informed choices about who they share their info with, and how it’s used?

An easy one now: How many of you work for companies with websites? 

And, last one: How many of you can say now, with complete confidence, that your websites are clear, and give people complete transparency around how their data is used?

That’ll be one part of our work programme over the next year, and it’s why our eyes will be on the top 1,000 websites: those that rely on digital advertising and the data that involves. But the principle of that commitment to transparency, from the first moment of interaction with your customers, should be relevant to everyone in this room.

It’s our job to work with organisations in the online space to ensure they have support to change their practices and provide the best, most privacy-preserving experience for their users.

My key message will always be this. Remember the person behind the data. Don't try and push onto them the responsibility for your compliance. It’s not their job.

We must work together – the ICO, you, the privacy profession and wider organisations – to respect people’s personal information and let them get on with enjoying the fruits of a trust-based digital economy.

Thank you.