
“There are laws to protect children in the real world. We need our laws to protect children in the digital world too.” – UK Information Commissioner

Today the Information Commissioner’s Office has published its final Age Appropriate Design Code – a set of 15 standards that online services should meet to protect children’s privacy.

The code sets out the standards expected of those responsible for designing, developing or providing online services such as apps, connected toys, social media platforms, online games, educational websites and streaming services. It covers services that are likely to be accessed by children and that process their data.

The code will require digital services to automatically provide children with a built-in baseline of data protection whenever they download a new app or game, or visit a website.

That means privacy settings should be set to high by default, and nudge techniques should not be used to encourage children to weaken their settings. Location settings that allow the world to see where a child is should also be switched off by default. Data collection and sharing should be minimised, and profiling that can allow children to be served targeted content should be switched off by default too.
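
As an illustration only, and not part of the ICO's code itself, the sketch below shows how a service might represent these high-privacy defaults for a user identified as a child. All field and variable names are invented for this example.

```typescript
// Illustrative sketch only: hypothetical default settings for a child user.
// Field names are invented for this example and are not taken from the code.
interface PrivacySettings {
  privacyLevel: "high" | "standard";          // privacy set to high by default
  locationVisibleToOthers: boolean;            // geolocation sharing off by default
  dataSharingWithThirdParties: boolean;        // data sharing minimised
  profilingForTargetedContent: boolean;        // profiling off by default
}

// Defaults a service might apply when the user is a child,
// reflecting the expectations described above.
const childDefaults: PrivacySettings = {
  privacyLevel: "high",
  locationVisibleToOthers: false,
  dataSharingWithThirdParties: false,
  profilingForTargetedContent: false,
};

console.log(childDefaults);
```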

Elizabeth Denham, Information Commissioner, said:

“Personal data often drives the content that our children are exposed to – what they like, what they search for, when they log on and off and even how they are feeling.

“In an age when children learn how to use an iPad before they ride a bike, it is right that organisations designing and developing online services do so with the best interests of children in mind. Children’s privacy must not be traded in the chase for profit.”

The code says that the best interests of the child should be a primary consideration when designing and developing online services. And it gives practical guidance on data protection safeguards that ensure online services are appropriate for use by children.

Ms Denham said:

“One in five internet users in the UK is a child, but they are using an internet that was not designed for them.

“There are laws to protect children in the real world – film ratings, car seats, age restrictions on drinking and smoking. We need our laws to protect children in the digital world too.

“In a generation from now, we will look back and find it astonishing that online services weren’t always designed with children in mind.”

The standards of the code are rooted in the General Data Protection Regulation (GDPR) and the code was introduced by the Data Protection Act 2018. The ICO submitted the code to the Secretary of State in November and it must complete a statutory process before it is laid in Parliament for approval. After that, organisations will have 12 months to update their practices before the code comes into full effect. The ICO expects this to be by autumn 2021.

This version of the code is the result of wide-ranging consultation and engagement.

The ICO received 450 responses to its initial consultation in April 2019 and followed up with dozens of meetings with individual organisations, trade bodies, industry and sector representatives, and campaigners.

As a result, and in addition to the code itself, the ICO is preparing a significant package of support for organisations.

The code is the first of its kind, but it reflects the global direction of travel, with similar reforms under consideration in the USA, across Europe and, at a global level, by the Organisation for Economic Co-operation and Development (OECD).

Ms Denham said:

“The ICO’s Code of Practice is the first concrete step towards protecting children online. But it’s just part of the solution. We will continue to work with others here in the UK and around the world to ensure that our code complements other measures being developed to address online harms.”

Notes to Editors

Next steps

The Secretary of State will now need to lay the code before Parliament for its approval as soon as is reasonably practicable.

Before that, the Government intends to notify the European Commission of the code under the requirements of the Technical Standards and Regulations Directive 2015/1535/EU and observe the resulting three-month standstill period. Because the obligation to notify the European Commission falls on the UK as a Member State of the European Union, rather than on the ICO, the notification is a matter for Government; any queries about this referral process should be directed to the DCMS press office at [email protected].

Once the code has been laid it will remain before Parliament for 40 sitting days. If there are no objections, it will come into force 21 days after that. The code then provides a transition period of 12 months, to give online services time to conform.

The next phase of the ICO’s work will include significant engagement with organisations to help them understand the code and prepare for its implementation.

Changes to the code as a result of consultation

Key changes include:

  • Clarification of the need to adopt a risk-based and proportionate approach to age verification
  • Clarification of which services fall within the code because they are “likely to be accessed by children”
  • Clarification of the ICO’s approach to enforcement as risk-based and proportionate
  • The addition of FAQs specific to the media industry
  • The introduction of a 12-month transition period – the maximum allowed.

A full summary of the ICO’s response to consultation feedback is attached to this media pack.

Background to the code

The Government included provisions in the Data Protection Act 2018 to create world-leading standards that provide proper safeguards for children when they are online.

As part of that, the ICO is required to produce an age-appropriate design code of practice to give guidance to organisations about the privacy standards they should adopt when offering online services and apps that children are likely to access and which will process their personal data. (A link to the Parliamentary debate, led by Baroness Kidron, is here.)

The standards in the code are backed by existing data protection laws, which are legally enforceable and regulated by the ICO. The regulator has powers to take action against organisations that break the law, including tough sanctions such as orders to stop processing data and fines of up to £17 million or 4% of global turnover.

The first draft of the code went out to consultation in April 2019. It was informed by initial views and evidence gathered from designers, app developers, academics and civil society. You can read the responses here.

The ICO also sought views from parents and children by working with research company Revealing Reality. The findings from that work are here.

General information about the ICO

The Information Commissioner’s Office (ICO) is the UK’s independent regulator for data protection and information rights law, upholding information rights in the public interest, promoting openness by public bodies and data privacy for individuals.

The ICO has specific responsibilities set out in the Data Protection Act 2018 (DPA2018), the General Data Protection Regulation (GDPR), the Freedom of Information Act 2000 (FOIA), Environmental Information Regulations 2004 (EIR) and Privacy and Electronic Communications Regulations 2003 (PECR).

Since 25 May 2018, the ICO has had the power to impose a civil monetary penalty (CMP) on a data controller of up to £17 million (€20 million) or 4% of global turnover.

The GDPR and the DPA2018 gave the ICO new, strengthened powers.

The data protection principles in the GDPR evolved from those in the original Data Protection Act and set out the main responsibilities for organisations.

To report a concern to the ICO, go to ico.org.uk/concerns.