Elizabeth Denham's speech on the impact of people being more engaged in their digital privacy, and the importance of international engagement. Delivered at the Commonwealth Club, San Francisco

Good afternoon, and thanks for that kind introduction.

It’s great to be here in San Francisco.

It’s a privilege of my choice of career that I get to travel and spend time in places like San Francisco.

It’s a city that was familiar to one of my relatives, my second cousin Stanley Edward Mockfort. His choice of career brought him to San Francisco in the 1950s, where he was based a few miles north of here. Unfortunately, his chosen career was car theft – running rings of stolen cars between states – and so the time he spent here was in Alcatraz. Inmate Number 1067.

My son’s career also brought him here. Not as a car thief, I’m glad to report, but as a health app developer. Though I remember seeing his rented flat in the Castro district and thinking that it probably wasn’t all that much bigger than the cell my cousin Stanley had enjoyed.

And I’ve visited here professionally a few times too.

In fact, I was here for an international regulators conference in 2009, as Assistant Privacy Commissioner of Canada, where the topic of conversation was apps. I remember a colleague querying what an app was. It’s incredible to think that was only a decade ago.

But a lot has changed in privacy regulation since then.

We are in a transformative time in our digital history.

When I look at the cases on my desk today, they are unrecognisable from those of only five years ago.

The cases are more complex, for a start. My office is finalising a report on police extracting information from the mobile phones of victims of sexual assault. We’re investigating real time bidding across the adtech ecosystem, and the live use of facial recognition technology by police and in the commercial sector.

The answer to ‘is this legal?’ is rarely binary in such complicated cases.

The financial, reputational and human stakes are higher today too. My office’s intended enforcement action against Marriott and British Airways for breaches linked to poor cyber security runs to multi-million-pound fines.

Our regulation has an impact on trade deals, civil liberties and confidence in fair elections. And it has an impact too on innovation, a part of the equation I’m required to take into account under UK law.

And of course the cases are international. Last month I contributed to a data privacy day conference in Mexico. I was talking about my role as chair of the Global Privacy Assembly, a crucial group that brings together privacy regulators from every corner of the world, and gives me an incredible insight into the digital rights issues that unite us.

Last week I was in Canada, and now I’m here, meeting with tech companies, at Berkeley with academics on Wednesday, and then with the Attorney General of California.

That’s what privacy regulation looks like today.

It’s international.

It’s high profile.

And it’s dealing with wickedly complex issues with real societal impact.

Contents

  • Today I want to talk about the impact of people being so much more engaged in their digital privacy.
  • I want to show how complex data privacy is today, and discuss a couple of important cases that my office is working on, which will have an international impact.
  • And I want to finish by talking a little about why I’m in San Francisco this week.

1. Privacy is mainstream

I think what most defines the world we in this room operate in today is that privacy isn’t going mainstream, it is mainstream.

If you watched the ads last Sunday you’ll have seen Facebook and Google and Apple keen to present their privacy credentials. Big tech companies understand that even if business is good, they need to keep consumers with them. Privacy is part of that. And companies should get credit when they do the right thing.

Research backs this changed picture of privacy. The Pew Research Center in Washington DC found that four in five Americans say that the potential risks they face because of data collection by companies outweigh the benefits. Time and time again we see that people want to wrest back control and agency over their personal information.

That shouldn’t come as a surprise to anyone in this room. In fact, it’s probably one of the reasons you’re in this room.

And it isn’t just US citizens who are concerned about data privacy.

Last year at the G20 in Japan, government officials, business and regulators discussed concerns that different privacy regimes could pose a barrier to data flows and trade. It is the same concern that underpinned the creation of the OECD’s fair information principles 40 years ago.

High stakes then. I hope you’ve all had plenty of coffee this morning!

So let’s talk about you in this room. What’s the impact on you of people caring about digital privacy?

I have heard the argument from some that, despite these concerns, there has not been an impact on businesses’ bottom lines.

A school of thought that says: yes, people care about their digital privacy, but no, this concern isn’t reflected in their daily actions.

People know tech companies hold a huge amount of data about them, but they continue to use these platforms anyway. People want free digital services and free content online, and so they continue to hand over their data.

I’m not blind to that argument.

But a growing social movement resists the monetization of personal data. Many object to their personal information being used in a way that attempts to manipulate their actions.

Today we see an increase in the number of class actions and group litigations – and here I’m thinking not so much of those linked to cyber security breaches, but of actions launched in response to how people’s data is being used.

And so growing cynicism from consumers about how their information gets used and monetised does not necessarily result in individual customers walking away from products, but in individuals supporting coordinated strategies to gain more meaningful control over how their data is used.

The GDPR supports this effort toward international coalescence. The legislation’s design specifically includes provisions for extra-territorial reach. Consumers in Berlin might not have decided to withdraw their consent from apps delivered from Silicon Valley, but they did want a law that would allow German regulators to have their backs when they use those apps or launch complaints.

The CCPA

I think we’ve seen a similar social movement for better rights here in the US too.

California’s own Consumer Privacy Act is front and centre here.

When I talk to business leaders and data protection professionals in the US, it feels like the CCPA stirs both excitement and nervousness.

We’re seeing similar movements supporting digital privacy laws in other states. Washington looks to be next in the queue, with Nebraska, New Hampshire and Virginia all potentially heading in this direction too. And that’s before we consider potential extensions of COPPA, or even a development of antitrust laws into something more recognisable internationally as a federal data protection law. And all of this feels consumer led.

The key question that will be asked of all of this legislation in due course is the question being asked of the GDPR now. Has the law made a difference? Because the compulsory data protection officers, and the commissioner’s audit powers, and the stop processing notices… they’re all just tools, there ultimately to give people greater control over their data.

2. Data privacy today is complex

Laws like the GDPR and the CCPA still have some way to go before they make the difference they were drafted to achieve. But they are modern laws for a modern privacy age. And that’s crucial.

Because as I touched on earlier, today’s cases are more complex than they were even three years ago. That’s reflected in the files we’re looking at right now.

a. AADC

Take, for example, my office’s work around children and their digital experience.

Everyone agrees that we need to better protect kids from online harm.

That’s why the most important piece of work on my desk over the past year has been the ICO’s Age Appropriate Design Code.

Parents and guardians demanded greater protection for their children’s digital privacy. The GDPR requires special consideration for children, especially in regard to targeting and tracking. In developing the UK’s application of the GDPR, Parliament passed a provision requiring the ICO to draft an age appropriate design code. After a year of consultation, my office has published the code, and the Secretary of State will soon lay it before Parliament.

Businesses selling into the UK will have to take account of the 15 standards in the Code, including standards on behavioural advertising, geolocation and high privacy settings by default. It doesn’t matter where in the world a company is located: the code applies if it is targeting and monitoring UK children.

In the coming decade, I believe children’s codes will be adopted by a great number of jurisdictions and we will look back and find it astonishing that there was ever a time that children did not have these mandated protections.

b. Adtech

Another example of the complex, high profile and international nature of privacy regulation is my office’s investigation into adtech and real time bidding.

Imagine I’m browsing a webpage. An advertisement for a tour of Alcatraz appears in the sidebar. That might raise an eyebrow – is it a coincidence that it appears just as I’m planning a trip to San Francisco?

Perhaps the big tech companies are conspiring to get me locked away. Perhaps they’ve heard about my cousin Stanley!

Of course it isn’t a coincidence. The ad is selected through a complex, almost instantaneous bidding process between advertisers.

That process relies on enormous amounts of personal data.

And it’s processing of data that people are not aware of.
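
To make that a little more concrete, here is a minimal, purely illustrative sketch in Python of what such an auction can look like. Every name, field and price in it is hypothetical, and real exchanges use far richer protocols and involve many more parties and far more data. But it shows how personal information is the raw material for each bid.

    # A purely illustrative sketch of a real-time bidding auction.
    # All names, fields and prices are hypothetical.
    from dataclasses import dataclass

    @dataclass
    class BidRequest:
        # The kind of personal data a bid request can carry about the person
        # viewing the page: an identifier, location, device and inferred interests.
        user_id: str
        location: str
        device: str
        interests: list

    def run_auction(request, bidders):
        """Send the request to every bidder and return the highest bidder and its bid."""
        bids = {name: bid(request) for name, bid in bidders.items()}
        winner = max(bids, key=bids.get)
        return winner, bids[winner]

    # Hypothetical bidders, each pricing the ad slot using the user's data.
    bidders = {
        "travel_ads": lambda r: 2.50 if "travel" in r.interests else 0.10,
        "generic_ads": lambda r: 0.75,
    }

    request = BidRequest(user_id="abc-123", location="London",
                         device="mobile", interests=["travel", "history"])

    winner, price = run_auction(request, bidders)
    print(f"Ad slot sold to {winner} at ${price:.2f}")
    # All of this happens in milliseconds, before the page finishes loading.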

It’s a complex process, and making the necessary changes will not be easy.

But it’s an example too of how the ICO has to approach these wickedly complex cases. We are in an era of digital innovation and ambition. It’s crucial that regulators show similar curiosity and innovation in their approach.

c. UK withdrawal from EU

I’ll briefly address the UK’s withdrawal from the EU here. The UK has exited the EU, and we are now in a transition period due to end on 31 December.

As a member of the EU, the UK had already adopted the GDPR into its own law. That law remains in place.

The UK government has stated its commitment to high data protection standards equal to those of the EU, as part of an independent policy on data protection, and an intention to pursue mutual adequacy with the EU.

So yes, I still have the same powers and responsibilities. Nothing has changed my remit.

3. Why engage with California

My office has to apply careful consideration to every case. Finding the balance between enforcing change and encouraging change. Understanding the implications of privacy regulation for other areas such as competition law. And making sure to appreciate the value of innovation.

As a regulator, I cannot forget that last point: the value of innovation.

Data protection law in the UK was born in the 1970s out of a concern that the potential of emerging technology would be lost if society didn’t embrace innovation. Regulation has a crucial role in reassuring people that they can support innovation, safe in the knowledge someone has their back.

That’s why my office has an open and constructive dialogue with the organisations we regulate.

Whether it’s our sandbox programme, our grants programme or our AI tools. Whether it’s the guidance documents on our website or the expertise of our recently hired specialists in data ethics, machine learning and economics. All are there to help individuals and organisations understand and voluntarily comply with the law.

We are a regulator that wants to listen.

I think that’s crucial. Because if we – businesses, regulators, citizens and legislators – cannot listen to each other, then the societal and commercial benefits of digital innovation will be in peril.

The reverse is true too. The number of people in this room today, the invitations I have this week to meet with businesses, the interest in academia and in Sacramento, are all testament to an understanding of the importance of data privacy.

Privacy must be a team sport.

So I want to know about the work you’re doing that will impact UK citizens. I want to know the developments that are happening in air conditioned rooms in sunny San Fran, that will go on to affect the personal data of a woman walking down the street in rainy London.

I need to know more about what’s going on here. Good regulation needs the consent of the industries that are regulated.

So for businesses here today, don’t be a stranger. Take the opportunity of having that constructive, open relationship with us, and with other regulators. Watch our work. And we’re here to listen: ask us questions, let us hear from you.

Thank you for listening.