Investigations announced into how social media and video sharing platforms use UK children’s personal information
- Date 3 March 2025
- Type News
- We are investigating how TikTok uses 13–17-year-olds' personal information to make recommendations to them
- We also announce we are investigating how Reddit and Imgur assess the age of their child UK users
- Investigations are part of our wider interventions into how social media and video sharing platforms use children's data
We are today announcing three investigations looking into how TikTok, Reddit and Imgur protect the privacy of their child users in the UK.
Our investigation into TikTok is considering how the platform uses personal information of 13–17-year-olds in the UK to make recommendations to them and deliver suggested content to their feeds.
This is in light of growing concerns about social media and video sharing platforms using data generated by children's online activity in their recommender systems, which could lead to young people being served inappropriate or harmful content.
Our investigations into Imgur and Reddit are considering how the platforms use UK children’s personal information and their use of age assurance measures. Age assurance plays an important role in keeping children, and their personal information, safe online. There are tools or approaches that can help estimate or verify a child’s age, which then allow services to be tailored to their needs or access to be restricted.
The investigations are part of our efforts to ensure companies are designing digital services that protect children.
At this stage, we are investigating whether there have been any infringements of data protection legislation. If we find there is sufficient evidence that any of these companies have broken the law, we will put this to them and obtain their representations before reaching a final conclusion.
John Edwards, UK Information Commissioner, said:
“We welcome the technology and innovation that companies like social media bring to the UK and want them to thrive in our economy. But this cannot be at the expense of children’s privacy.
“My message is simple. If social media and video sharing platforms want to benefit from operating in the UK they must comply with data protection law.
“The responsibility to keep children safe online lies firmly at the door of the companies offering these services and my office is steadfast in its commitment to hold them to account.
“I also want to take this opportunity to assure children, parents and carers in the UK that we are working on their behalf to make the online world a safer place.
“In announcing these investigations, we are making it clear to the public what action we are currently taking to ensure children’s information rights are upheld. This is a priority area, and we will provide updates about any further action we decide to take.”
Our role in protecting children online
We have driven significant change in the way companies approach children’s online privacy since our Children’s code came into force in 2021.
In the past year, we have focused on driving further improvements in how social media and video sharing platforms protect children’s personal information online. As a direct result of our regulatory intervention:
- X has stopped serving adverts to users under 18; removed the ability for under 18s to opt in to geolocation sharing; improved the public transparency materials available for under 18s; and created a dedicated help centre for child users and parents.
- Sendit has stopped automatically including geolocation information in children’s profiles, while BeReal has stopped allowing children to post their precise location online. These changes can help keep children safer in the physical world.
- Dailymotion has implemented new privacy and transparency measures encouraging children not to share personal information.
- Viber has committed to turn off personalised advertising for children, ensuring that children’s default advertising experience is not based on their behavioural data or profiles.
We have also published a progress update on our Children’s code strategy, outlining our key interventions in this space and including a comparison table of some of the privacy practices of 29 social media and video sharing platforms.
We will continue to push for further changes where platforms do not comply with the law or conform to the Children’s code.
We will be working closely with Ofcom, which has responsibility for enforcing the Online Safety Act, to ensure our efforts are coordinated.
Notes to editors
- The Information Commissioner’s Office (ICO) is the UK’s independent regulator for data protection and information rights law, upholding information rights in the public interest, promoting openness by public bodies and data privacy for individuals.
- The ICO has specific responsibilities set out in the Data Protection Act 2018 (DPA2018), the United Kingdom General Data Protection Regulation (UK GDPR), the Freedom of Information Act 2000 (FOIA), Environmental Information Regulations 2004 (EIR), Privacy and Electronic Communications Regulations 2003 (PECR) and a further five acts and regulations.
- The ICO can take action to address and change the behaviour of organisations and individuals that collect, use and keep personal information. This includes criminal prosecution, non-criminal enforcement and audit.
- The ICO’s Age Appropriate Design Code – or ‘Children’s code’ – is a data protection code of practice for online services, such as apps, online games, and web and social media sites, likely to be accessed by children. It translates the UK GDPR requirements into design standards for online services, helping businesses understand what is expected of them. The code aims not to protect children from the digital world but instead protect them within it by ensuring online services are better designed with children in mind.
- Recommender systems are algorithmic processes that use personal information and profiling to learn the preferences and interests of users to suggest or deliver content to them.
- The Children’s code requires profiling to be switched off by default, unless there is a valid reason for a different setting. Profiling can only be used if there are measures in place to protect the child from any harmful effects – such as being fed content that is detrimental to their health and wellbeing.
- Due to the ongoing nature of these investigations, the ICO will not be commenting further on them. The ICO has not reached a view on whether there is sufficient evidence of an infringement of data protection law by TikTok, Reddit or Imgur for us to take action using our enforcement powers in line with our Regulatory Action Policy, for example a reprimand, enforcement notice or penalty notice. If we find there is sufficient evidence to support taking such action, we will consider any representations we receive before taking a final decision on whether data protection law has been infringed and what action, if any, is appropriate.
- To report a concern to the ICO, telephone our helpline on 0303 123 1113 or go to ico.org.uk/concerns.