Protecting children’s privacy online: Our Children’s code strategy
Introduction
Safeguarding children’s personal information is a key priority
Children are growing up in a digital world; being online and using digital services is an integral part of their lives where they learn, play and connect with others. However, the online world was not designed with children’s needs in mind and, as a result, the potential for harm is ever-present.
Children may be less aware of the risks and safeguards, and of their rights, concerning the use of their information. When organisations fail to use children’s information properly, it can leave children vulnerable to:
- being inappropriately identified or targeted by strangers;
- having their location tracked; or
- being sent harmful communications.
We want to see an internet that is privacy-friendly and safe for children and young people.
Our Children’s code explains how organisations can make sure their digital services safeguard children’s personal information. It enables organisations to develop services and products that comply with UK data protection law. Since its introduction, we have been working to ensure organisations design their online services with children’s privacy in mind. And that is something children want too. During our evaluation of the code in 2023, around seven out of 10 children told us that they trust the code to make the internet better and safer for them.
We have prompted significant changes since the code was introduced, but we know we still have more to do.
This strategy summarises our priorities for the next phase of our work, including how we will continue to enforce the law and drive conformance with our Children’s code. It supports our ICO25 strategic enduring objectives of safeguarding and empowering people, and promoting responsible innovation and sustainable economic growth.
What we have achieved since the Children’s code came into force
The Children’s code came fully into force in September 2021, requiring online services, including websites, apps and games, to provide better privacy protections for children and ensure their personal information is protected in the digital world.
In the run-up to the code coming into force, both big tech and smaller online services had already started to make significant changes, including:
- Facebook and Instagram limited the information used in targeted advertising for under 18s to age and location. They committed not to personalise adverts for under 18s based on their on- or off-app activity. Both Facebook and Instagram started asking for people’s date of birth at sign-up, preventing people from signing up if they repeatedly enter different dates, and disabling accounts where people can’t prove they’re over 13. Instagram also launched parental supervision tools, along with new features like Take A Break to help teens manage their time on the app.
- YouTube turned off autoplay by default, and turned on take a break and bedtime reminders by default, for Google Accounts belonging to under 18s.
- Google enabled anyone under 18 (or their parent or guardian) to request the removal of their images from Google image search results; prevented location history from being enabled on Google Accounts belonging to under 18s; and expanded safeguards to prohibit age-sensitive ad categories from being shown to these users.
It was a strong start, but protecting children’s personal information isn’t about one company or activity, and neither is our approach. In the two years since the Children’s code was introduced, we have empowered many organisations through advice and guidance. At the same time, we’ve delivered impactful interventions to drive compliance with the law, using our regulatory powers where necessary. We have listed some of what we have achieved below.
We have also gained insight through research and discussions with children, parents and carers, and stakeholders. To help empower children, we produced a suite of school resources for teachers to use across all age groups when discussing privacy issues and the value of personal information. We engaged with apps, websites, games, children’s rights advocates, academics and students through our transparency champion work. Participants shared their vision to meet the code’s transparency standard and help children understand how their personal information is used.
Providing regulatory certainty through advice, guidance and certification
- Between September 2020 and December 2022, we held 56 stakeholder events to familiarise participants with the Children’s code, including workshops, webinars, forums, conference appearances, panels and one-to-one meetings. We adapted events to online delivery to enable organisations to continue to access advice and guidance during the Covid-19 pandemic.
- We have provided clarity to organisations through a range of written guidance and advice:
  - Our likely to be accessed guidance helps organisations to determine whether children are likely to access their online service, even if they run an adult-only service.
  - Our recently updated opinion on age assurance provides up-to-date information on technological, policy and legal developments to ensure children receive age-appropriate services.
  - Our education technologies (edtech) guidance, supported by the Digital Futures Commission and 5Rights, helps those procuring and providing the tech that children access in schools.
  - Our testing tool for UX designers and our updated risk assessment toolkit help organisations check whether they are conforming with UK data protection law and the code. They give practical steps to enable a proportionate and risk-based approach to children’s privacy.
- We are also leading the way globally with our certification schemes. These provide a framework to support organisations to conform with the Children’s code and offer assurances that they are meeting specific standards. We approved the ACCS Age Check Verification Scheme, which has gone on to certify seven organisations, confirming that they meet the data protection standards for age assurance.
Applying regulatory scrutiny and enforcement to uphold compliance
- We engaged with 55 organisations, including some of the world’s largest games and social media platforms. Of these, we:
  - audited 11, including online games companies that produce some of the world’s most popular games. What we found prompted us to create our top tips for games designers, to help ensure they protect children playing games online and comply with data protection laws. We also audited age assurance providers and organisations, which helped us to better understand how industry estimates or verifies a child’s age; and
  - assessed 44 for how they conformed with the Children’s code standards. We have seen organisations make positive changes to help protect children’s information online. We closed a number of cases with advice on what organisations should improve, and we are still investigating some organisations.
- In addition, we have used our enforcement powers to protect children’s personal information, including issuing a £12.7m fine to TikTok for a number of breaches of data protection law, such as failing to use children’s personal data lawfully. TikTok has appealed the findings of infringement and the fine, and the case is currently in litigation.
- We have investigated Snap's risk assessment process in relation to its 'My AI' generative AI chatbot, with a particular focus on ensuring risks to children are appropriately identified and mitigated.
Working with other regulators and partners
- We have worked with experts from children and family services, education, academia, non-governmental organisations (NGOs) and industry, including through our Children’s Advisory Panel, drawing on their experience of working with and representing children and families to inform our work.
- We continue to work closely with Ofcom and other regulators, including through the Digital Regulation Cooperation Forum. We work with Ofcom to maximise coherence between the data protection and online safety regimes and to promote compliance with them. Developing an aligned approach to age assurance has been a particular priority for our joint work to protect children online. Earlier this year, we consulted Ofcom in developing our updated opinion on age assurance (referenced above) to reflect developments in policy, technology and legislation.
- As part of our collaborative work, we responded to Ofcom’s consultations on protecting people from illegal harms online and on draft guidance for service providers publishing pornographic content.
- We also consulted with Ofcom ahead of publishing our guidance on content moderation. This guidance helps organisations understand their data protection obligations and is the first in a series of guidance products we are developing on online safety technologies, including user profiling and behaviour identification.
Influencing change internationally to raise standards for UK children
- Our Children’s code is having an impact across the world. It is inspiring more companies outside the UK to make changes, and more countries to protect children online by using the code as a blueprint for their own laws and regulations. Similar codes of practice have been introduced in many US states and in Australia, Chile, Sweden and Ireland. This means even stronger digital protections for children, wherever they are based.
- We meet regularly with our international counterparts to:
- maintain the international focus on children’s privacy;
- promote our Children’s code; and
- influence the developing international approaches to protect children’s information online.
- Protecting children’s privacy spans multiple regulatory remits, which is why we collaborate and share expertise with other regulators in areas of mutual interest. For example, we established and chair the International age assurance working group. This enables data protection authorities to share information on age assurance and learn from research, policy development and enforcement action.
Next steps
Our focus in the coming months will be on driving social media and video sharing platforms to make further progress.
The majority of children use social media and video sharing platforms [1]. Research shows that 96% of children aged 3-17 watch videos on video sharing platforms. The proportion of children using social media platforms increases significantly with age: 30% of children aged 5-7 use social media, rising to 63% of those aged 8-11, 93% of those aged 12-15, and 97% of those aged 16-17 [2].
While we have seen progress made by such platforms, their processing of children’s personal information may not always be in children’s best interests. More needs to be done to keep children’s personal information safe. Research and engagement, evidence from academics and civil society, and learning from other regulators and civil court cases all suggest that using these platforms can increase the potential for significant harm to children, and that this is attributable to how platforms process children’s personal information.
In light of this, in 2024-2025 we will focus on social media and video sharing platforms, taking a closer look at the following areas in particular:
- Default privacy and geolocation settings. Many children’s profiles use the default privacy and geolocation settings provided by platforms. However, the ability to track a child’s location risks that information being misused to compromise their physical safety or mental wellbeing. This is why children’s profiles should be private by default and geolocation settings should be turned off by default.
- Profiling children for targeted advertisements. Children may not be aware that their information is being gathered or that it can be used to tailor the adverts they see. This may undermine children’s autonomy and control over their personal information. It could also lead to financial harm, where adverts encourage in-service purchases or additional app access without adequate protections in place. The Children’s code states that profiling should be turned off by default unless there is a compelling reason to use it.
- Using children’s information in recommender systems. Algorithmically generated content feeds may draw on behavioural profiling and analysis of children’s search results. These feeds may create pathways to less suitable content, potentially including self-harm, suicide ideation, misogyny or eating disorders. The design of recommender systems may also encourage children to spend longer on a platform than they otherwise would, leading them to provide the platform with yet more personal information.
- Using information of children under 13 years old. Children under 13 cannot consent to an online service processing their personal information; parental consent is required. How services gain that consent, and how they use age assurance technologies to assess a user’s age and apply appropriate protections, are important because they can mitigate the potential harms children face from the use of their personal information.
Our Children’s code strategy seeks to ensure that social media and video sharing platforms comply with data protection law and conform with the standards of the Children’s code. They can do this by designing data protection safeguards into their online services so that the services are appropriate for children’s use and development needs.
We will focus on identifying the most serious risks to children’s privacy in these areas and work to reduce or eliminate them.
Our work will include:
- Evidence gathering:
  - We will continue to gather evidence and develop our insight and expertise in the areas outlined and the associated data protection harms. As part of this, we plan to publish a call for evidence in summer 2024 to invite input from a range of stakeholders.
  - We will identify the key social media and video sharing platforms that we should engage with further in our supervision and engagement programme.
- Engagement:
  - We will engage with parents, carers and children to help them understand what actions they can take to safeguard children’s privacy.
  - We will engage with organisations to drive improvements in how social media and video sharing platforms protect children’s privacy in our priority areas.
  - We will work with partners and stakeholders with relevant expertise to inform our approach.
  - We will identify areas where organisations need greater certainty and provide additional guidance and advice where needed.
- Supervision and enforcement:
  - We will focus on the most serious risks to children’s privacy rights on social media and video sharing platforms and work to reduce them.
  - We will use our regulatory enforcement powers to bring about compliance where necessary, such as by using fines or enforcement notices to require organisations to stop processing people’s information.
Many organisations must collaborate to regulate the internet effectively. Alongside this work on the Children’s code strategy, we will continue to work closely with Ofcom in its capacity as the UK’s online safety regulator and with international regulators.
We will also undertake audits of the development, provision and use of edtech solutions in schools, to understand the privacy risks and potential non-compliance with data protection legislation. We will publish our findings once we have completed this work.
We will provide further updates on progress and how we are measuring success.
[1] Page 11 - Children and Parents: Media Use and Attitudes 2023, Ofcom
[2] Slide 4 - Children and parents: media use and attitudes report 2023 – interactive data, Ofcom