Improving online privacy for up to 11.7 million children
Our Children’s Code is making the internet a privacy-friendly safe space for young people.
Around 98% of young people (aged three to 17) have access to the internet at home. We’re making sure companies respect their privacy rights. As a direct result of our regulatory intervention:
- X stopped serving adverts to users under 18 and removed the ability for under-18s to opt in to geolocation sharing;
- Viber turned off personalised advertising for children, ensuring that children’s default advertising experience is not based on their behavioural data or profiles;
- BeReal, Sendit, Soda and X changed their geolocation processing practices, including removing location information from profiles by default and restricting the use of precise location information on posts; and
- Dailymotion implemented new privacy and transparency measures that remind children not to share personal information.
We work with organisations to explain the law and show them how to improve their approaches. When companies don’t follow the law, we use our powers to persuade them to do so.
In 2023, we fined TikTok £12.7 million for misusing children’s data. We’ve also investigated Imgur’s and Reddit’s approaches to processing children’s personal information in the UK. In particular, we focused on each platform’s approach to age assurance and issued notices of intent to impose a fine on both of them.
We are monitoring platforms that have not yet made improvements, to ensure they implement their planned changes. We are also pushing for further improvements in any practices that do not comply with the law or conform to the Children’s Code. We are ready to open further investigations and progress to formal enforcement action if needed.
Our work to drive further improvements in children’s privacy will continue in 2026. This includes:
- reviewing and improving services’ approaches to age assurance to identify any under-13s on their platforms;
- using our regulatory sandbox to assess the development of a social media platform targeted at children, which will use age-assurance measures such as facial age estimation; and
- monitoring 10 popular mobile game platforms used by children in the UK and reviewing game design and user experience to assess:
- compliance with default privacy settings,
- default geolocation settings, and
- targeted advertising.