The UK’s independent authority set up to uphold information rights in the public interest, promoting openness by public bodies and data privacy for individuals.


A blog by Stephen Bonner, ICO’s Executive Director of Regulatory Futures and Innovation

25 August 2021

The transition year is up and the Children’s code comes fully into force on 2 September. It’s a groundbreaking code that creates a better internet for children by ensuring that online services likely to be accessed by children respect a child’s rights and freedoms when using their personal data.

As you’d expect it’s already having an impact on these services. Facebook, Google, Instagram, TikTok and others have all made significant changes to their child privacy and safety measures recently.

As the first of its kind, it’s also having an influence globally. Members of the US Senate and Congress have called on major US tech and gaming companies to voluntarily adopt the standards in the ICO’s code for children in America. The Data Protection Commission in Ireland is preparing to introduce the Children’s Fundamentals to protect children online, which links closely to the code and follows similar core principles.

After 2 September the risks to children will not disappear overnight, and our work doesn’t stop.

We have identified that, currently, some of the biggest risks come from social media platforms, video and music streaming sites and video gaming platforms. In these sectors, children’s personal data is being used and shared to bombard them with content and personalised service features. This may include inappropriate adverts; unsolicited messages and friend requests; and privacy-eroding nudges urging children to stay online. We’re concerned about a number of harms that could result from this data use: physical, emotional, psychological and financial.

Children’s rights must be respected and we expect organisations to prove that children’s best interests are a primary concern. The code gives clarity on how organisations can use children’s data in line with the law, and we want to see organisations committed to protecting children through the development of designs and services in accordance with the code.

We will be proactive in requiring social media platforms, video and music streaming sites and the gaming industry to tell us how their services are designed in line with the code. We will identify areas where we may need to provide support or, should circumstances require, use our powers to investigate or audit organisations.

Separately, we are considering how organisations in scope of the Children’s code can tackle age assurance, whether through age verification or age estimation. The ICO will formally set out its position on age assurance in the autumn.

Our commitment to working with other regulators through the Digital Regulation Cooperation Forum (DRCF) will help ensure consistency between the code and the incoming online safety laws that will jointly protect children online.

Ultimately the Children’s code will help industry innovate to ensure that the best interests of the child are a primary concern online and built into the design from the beginning. This will grow the trust between online services, children, parents and society.

Stephen Bonner

Stephen Bonner – Executive Director (Regulatory Futures and Innovation). Stephen joined the ICO to lead our work developing our capacity and capability to regulate new and emerging technologies and innovations. He leads programmes of work to develop strategic ICO positions, based on horizon scanning and research, on technology issues such as data, supervision of the large technology platforms now in our remit, online harms, the Digital Markets Unit and delivery of the DRCF workplan. He is also leading on the implementation of the Children’s code.