Latest update: 2 August 2024 - this progress update was published.

Introduction

At the ICO, we are committed to ensuring that the internet is a privacy-friendly and safe space for children, where they are protected within, rather than from, the digital world.

Our Children's code explains how organisations can make sure their digital services safeguard children’s personal information and comply with the UK General Data Protection Regulation (UK GDPR).

It sets out 15 standards of age-appropriate design. These provide organisations with flexibility to develop services in their own way, taking a proportionate and risk-based approach. The overarching expectation is that, when processing children’s personal information, organisations consider the best interests of the child in all aspects of the design of their service. This includes how they can protect and support children’s development, health and wellbeing.

In April 2024, we published our Children’s code strategy. This outlined what we have achieved since the Children’s code came into force, including providing further advice and guidance and applying regulatory scrutiny to 55 organisations. We also set out that our priority in 2024-25 is to drive further improvements in how social media platforms (SMPs) and video sharing platforms (VSPs) protect children’s personal information online. Our areas of focus are:

  • default privacy and geolocation settings;
  • profiling of children for targeted advertising;
  • use of children’s personal information in recommender systems; and
  • use of personal information of children under 13 years old.

We recognise that this is a rapidly evolving market and that, since the initial introduction of the Children’s code, there will have been changes in the platforms that children use and in the services that platforms offer.

Therefore, the first phase of our work under the strategy has included a review of a sample of 34 SMPs and VSPs. 1, 2 We created accounts using proxies for children of different ages, to replicate the sign-up processes that children would follow. 3 We observed key account settings and privacy information presented to child users, but did not interact with other users.

Alongside this, we reviewed a range of academic, regulatory, governmental and civil society sources.

The purpose of this update is to outline our progress with the strategy so far. We set out the high-level findings from our review of SMPs and VSPs and the initial action we are taking. 4 We will publish a further update in due course.

We have also published a call for evidence to inform our work in relation to recommender systems and the use of personal information of children under 13 years old. This update will help you respond to the call for evidence, and we advise you to read it before responding.

Children’s profiles set to private by default

SMPs and VSPs often provide users with a public profile. Profiles can include personal information such as a user’s name, gender, pronouns, interests, and identifiable images.

While SMPs and VSPs, and the profiles they allow users to create, can play a positive role in providing spaces for children to learn and socialise online, there are potential risks when these profiles, and the personal information within them, are made public. For example, public profiles can:

  • facilitate bullying, leading to harms including emotional distress and stigmatisation;
  • result in children receiving unwanted attention or contact from strangers. 5 Children’s information could be seen by, or shared with, malicious actors, increasing the risk of abuse, grooming and radicalisation; and
  • result in a loss of control over personal data, particularly if it is not possible, or not straightforward, to change the privacy settings.

Relevant standards in the Children’s code

A range of standards in the Children’s code are relevant to the areas we are considering. Of particular relevance here is standard 7, which outlines that settings must be 'high privacy' by default unless organisations can demonstrate a compelling reason for a different default setting, taking into account the best interests of the child. 6

High privacy by default means that children's profiles should be private by default, so their information is only visible to other service users if the child amends their settings to allow this.
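
As an illustration only, the minimal sketch below (in Python, with hypothetical setting names that are not drawn from any platform we reviewed) shows what 'high privacy by default' could look like in an account-settings model: a child's profile visibility, discoverability and contact options all start in their most private state, and only change if the child actively amends them.

```python
from dataclasses import dataclass

@dataclass
class AccountSettings:
    """Hypothetical per-account privacy settings, for illustration only."""
    profile_public: bool
    discoverable_in_search: bool
    follow_requests_from_strangers: bool
    direct_messages_from_strangers: bool

def default_settings_for(age: int) -> AccountSettings:
    """Return 'high privacy' defaults for children (under 18).

    Children should only become more visible to other users if they
    actively change a setting themselves.
    """
    if age < 18:
        return AccountSettings(
            profile_public=False,
            discoverable_in_search=False,
            follow_requests_from_strangers=False,
            direct_messages_from_strangers=False,
        )
    # Adult defaults are shown only for contrast with the child defaults.
    return AccountSettings(
        profile_public=True,
        discoverable_in_search=True,
        follow_requests_from_strangers=True,
        direct_messages_from_strangers=True,
    )

print(default_settings_for(14))
```

The point of the sketch is that a child’s information only becomes visible to other users after an active choice by the child, rather than by default.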

Our initial findings and actions

Most of the SMPs and VSPs included in our review set children’s profiles to private by default, in conformance with the Children’s code.

However, the review found that some platforms make children’s profiles public by default. In a small number of cases, making a profile private only appeared possible if users agreed to pay a fee or to opt into a subscription service.

In addition, a small number of platforms appeared to enable children to receive friend or follow requests from strangers and to receive direct messages from strangers by default.

We are concerned that some platforms do not appear to set children’s profiles to private by default, particularly where they also allow contact from strangers by default. We have written to five platforms outlining our concerns and calling on them to change their practices to comply with the law and better protect children. We have put them on notice that, if they do not bring themselves into compliance or demonstrate a compelling reason for their current approach within eight weeks, we will conclude investigations with a view to formal enforcement action, where appropriate. We plan to name the organisations we have written to: both the platforms that improve their practices and those against which we conclude investigations with a view to formal enforcement action, where appropriate.

We are also engaging with a range of platforms that have taken steps to comply with UK GDPR and the Children's code but could be going further to protect children’s privacy online. We have outlined where we believe these platforms could be doing more to secure further improvements for children’s privacy.

Default geolocation settings

Geolocation data is data taken from a user’s device that indicates its geographical location, including GPS data or data about connection with local Wi-Fi equipment.

Sharing geolocation data with other users can be beneficial. For example, children may let their families and carers track their device, and may use location sharing to connect with friends nearby, organise meeting points and get involved with local networks and activities.

However, it also brings risks of physical, financial and psychological harm. For example:

  • Recent research has highlighted examples of children receiving unwanted communications from other users after posting images with geolocation data tagged, which caused them distress. 7
  • The display of geolocation data could also encourage the escalation of cyberbullying to offline bullying, especially if children forget or are not aware that their location is visible to others. 8

The level of granularity of geolocation data is an important factor when considering the risks posed to children: the more precise the location displayed, the higher the potential risks.
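
As a purely illustrative sketch (a hypothetical Python helper, not drawn from any platform we reviewed), rounding coordinates shows how granularity changes the risk picture: the fewer decimal places shared, the harder it is to infer a precise location such as a home or school.

```python
def coarsen_location(lat: float, lon: float, decimals: int) -> tuple:
    """Round coordinates to reduce the precision of a shared location.

    Roughly: 0 decimal places is about 100 km of precision, 1 is about
    10 km and 2 is about 1 km.
    """
    return round(lat, decimals), round(lon, decimals)

precise = (51.5250, -0.0866)             # example coordinates
print(coarsen_location(*precise, 2))     # roughly 1 km granularity
print(coarsen_location(*precise, 0))     # roughly 100 km granularity
```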

Relevant standards in the Children’s code

Of particular relevance here is standard 10 of the Children’s code, which makes it clear that any geolocation privacy setting should be switched off by default unless there is a compelling reason to do otherwise, taking into account the best interests of the child. 9

In addition, standard 13 sets out that providers should not use nudge techniques to lead or encourage children to provide unnecessary personal data or to turn off privacy protections. 10

Our initial findings and actions

Our review of SMPs and VSPs found that, while they typically switch precise geolocation settings off by default for children and do not share children’s locations automatically with other users, some appear to nudge children to switch geolocation settings on or encourage them to share their location with others by tagging or including their location when posting content.

In addition, when making geolocation information public, some platforms appear to share more granular information than others, and it was not always clear from our review whether children can switch geolocation data sharing off if they decide they no longer wish to share such data.

We are writing to four platforms to clarify their practices and to challenge them as to how their approach conforms with the Children’s code. We are ready to escalate where necessary to ensure that practices are in line with the best interests of the child, including by opening an investigation with a view to potential enforcement action where appropriate. 11

Profiling children for targeted advertisements

SMPs and VSPs often gather large quantities of data about their users, including children, drawing on the posts and content they view, people they interact with, accounts they follow, and groups they join. 12 This data can be combined with personal information collected at account set-up stage or from third parties, and used for a wide range of purposes, including targeted advertising. 13

While the use of targeted advertising can give users a more personalised and relevant advertising experience, children may not be aware that their information is being gathered or understand how it is being used for advertising purposes.

Where platforms do not take sufficient measures to protect children’s personal information in relation to targeted advertisements, the potential harms may include:

  • a loss of autonomy and control by children over their personal information, which may result in unwarranted intrusion from third parties through push notifications and nudge techniques used to promote products;
  • harm to children’s wellbeing from seeing targeted advertising that promotes negative health behaviours based on their personal information, for example content relating to anorexia, body dysmorphia, incorrect nutritional information or lifestyle choices that are inappropriate for children; and
  • financial harm where targeted advertisements encourage in-service purchases and additional app access without adequate protections for children.

Relevant standards in the Children’s code

Of particular relevance here is standard 12 of the Children’s code, which states that profiling should be switched off by default for children, unless the service can demonstrate a compelling reason for profiling to be on by default, taking into account the best interests of the child. 14

Our initial findings and actions

Our review of SMPs and VSPs found that some platforms appear to use very limited data points to tailor advertising for children, such as age and high-level location data (for example, country), to help ensure that advertising is age-appropriate and jurisdiction-specific. A small number appear not to show advertisements to children.
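
The kind of data minimisation described above can be pictured with the following sketch (hypothetical field names in Python, not any reviewed platform's code): the advertising context built for a child contains only an age band and a country, rather than a fuller behavioural profile.

```python
def ad_targeting_context(user: dict) -> dict:
    """Build the inputs passed to ad selection for a given user.

    For children, restrict targeting inputs to an age band and country,
    so advertising can be age-appropriate and jurisdiction-specific
    without drawing on interests, contacts or browsing behaviour.
    """
    if user["age"] < 18:
        return {
            "age_band": "13-17" if user["age"] >= 13 else "under-13",
            "country": user["country"],
        }
    # Adults: a wider set of signals might be used; not shown here.
    return {"age": user["age"], "country": user["country"]}

print(ad_targeting_context({"age": 15, "country": "GB", "interests": ["football"]}))
# {'age_band': '13-17', 'country': 'GB'} - the interests field is never used
```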

However, on other platforms it is not always clear what personal information is being collected from children or how it is being used for targeted advertising. In addition, our review found indications that some SMPs and VSPs may be profiling children for targeted advertising in a way that is not in the best interests of the child, for example, through potentially excessive data collection and not giving children options to control advertising preferences. 15

We have started a programme of active engagement with SMPs and VSPs where we have identified potential concerns. We are verifying their approach to the use of children’s personal information for targeted advertising and setting out our expectations for changes to their data processing practices to bring them in line with the Children’s code and UK GDPR. We will closely scrutinise any compelling reasons given for use of data points beyond age and high-level location data to deliver tailored advertising to children. We will follow up to secure improvements and, where necessary, consider all regulatory options, including opening investigations, where services fail to comply with the law.

Use of children’s information in recommender systems

Recommender systems are algorithmic processes that use personal information and profiling to learn the preferences and interests of the user to suggest or deliver content to them. 16

They can take many forms and can use a range of data, including personal information provided directly by the user (or from third parties) and information from users’ interaction with content on the service. Recommender systems may also make recommendations based on content that a user’s online friends, followers or groups engage with.
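
The sketch below (a deliberately simplified Python example with hypothetical names, not a description of any real platform's system) illustrates the general mechanism: an interest profile is built from declared interests and past interactions, and candidate content is ranked against it. Real recommender systems are far more complex, but the sketch shows why the personal information fed into the profile largely determines what is recommended.

```python
from collections import Counter

def build_interest_profile(declared_interests, engaged_content_tags):
    """Combine declared interests with tags from content the user engaged with."""
    profile = Counter(declared_interests)
    profile.update(engaged_content_tags)
    return profile

def recommend(profile, candidates, top_n=3):
    """Rank candidate items by how strongly their tags match the profile."""
    scored = [
        (sum(profile.get(tag, 0) for tag in item["tags"]), item["id"])
        for item in candidates
    ]
    return [item_id for _, item_id in sorted(scored, reverse=True)[:top_n]]

profile = build_interest_profile(["fitness"], ["fitness", "fitness", "gaming"])
candidates = [
    {"id": "video_a", "tags": ["fitness", "nutrition"]},
    {"id": "video_b", "tags": ["music"]},
    {"id": "video_c", "tags": ["gaming"]},
]
print(recommend(profile, candidates))  # items matching the profile rank first
```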

Recommender systems can help users, including children, navigate the vast amount of information online to find content they wish to view or engage with. They can also remove or moderate inappropriate content that is served to children and help promote the visibility of positive content. 17

However, there are a number of potential harms that could occur, including:

  • physical or psychological harm from seeing harmful content because platforms have not built sufficient protections for children when recommending content based on personal information (such as interests and preferences). This could, for example, include circumstances in which pathways to extreme material, such as misogynistic accounts, are recommended to boys interested in fitness or self-improvement; 18
  • physical or psychological harm, for example, sleep disruption, poor attention or addiction, resulting from platforms using children’s personal information to feed children recommendations in a way that promotes sustained viewing; 19 and
  • a loss of autonomy where children are unable to make informed decisions about what information to share with a service because they do not understand how it would be used or are not given sufficient tools to control how it is used by recommender systems.

There are clear links here with online safety issues. As the data protection regulator, we focus on what children’s personal information is gathered, how it is processed and the potential harms that could arise as a result. Ofcom is the regulator responsible for online safety. We will continue to partner closely with them to protect children online.

Relevant standards in the Children’s code

Of particular relevance here is standard 12 of the Children’s code, which sets out that profiling should only be used if there are measures in place to protect the child from any harmful effects, in particular being fed content that is detrimental to their health and wellbeing. 20

Standard 15 of the code also requires services to provide prominent and accessible tools to help children exercise their data protection rights and report concerns. Effective use of these tools helps children take control over the data they share with services and the content parameters used to deliver feeds. 21

Our initial findings and actions

Our review of SMPs and VSPs suggests that a range of personal information is used by recommender systems, including name, age, sex, location, interests and online activity. 22 In their privacy notices, companies often refer to this process as how they ‘personalise’ services.

Our review provided limited evidence on the ways in which children’s personal information is processed. Privacy notices are often unclear about the specifics of how these platforms use personal information to make recommendations, or what measures they take to protect children’s privacy when doing so. While our review did not seek to assess the content shown to children, research has found that platforms using recommender systems may show children inappropriate and harmful content. 23

To further build our understanding of the data processing elements of this issue, we are inviting further evidence from stakeholders on the use of children’s personal information in SMPs’ and VSPs’ recommender systems. The call for evidence will be open until 11 October 2024.

In addition, we will engage directly with SMPs and VSPs to better understand:

  • the specifics of how their recommender systems use children’s personal information; and
  • what steps platforms take to mitigate the risk of harm to children from using that personal information in their recommender systems.

We will use this information and evidence to consider next steps.

Use of information of children under 13 years old

Data protection law and the Children’s code seek to protect the personal information of children of all ages within the digital world and to ensure an age-appropriate experience.

Younger children are likely to need greater protection than older children because they are at an earlier stage of development. This is reflected in the UK GDPR, which states that verified parental consent must be obtained for children under the age of 13 where consent is relied on as the lawful basis for processing personal information. 24
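
As a hedged illustration of that rule (hypothetical function and parameter names, not a statement of how any platform implements it), a service relying on consent as its lawful basis would need to branch on age: below 13, the child's own consent is not sufficient and verified parental consent is required instead.

```python
def consent_basis_available(declared_age: int,
                            child_consented: bool,
                            verified_parental_consent: bool) -> bool:
    """Can processing rely on consent as its lawful basis for this user?

    Where consent is the lawful basis and the user is under 13, the
    consent of the holder of parental responsibility is required; the
    child's own consent is not sufficient on its own.
    """
    if declared_age < 13:
        return verified_parental_consent
    return child_consented

print(consent_basis_available(12, child_consented=True, verified_parental_consent=False))  # False
print(consent_basis_available(14, child_consented=True, verified_parental_consent=False))  # True
```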

If platforms are unclear about or do not know the age of their users, they risk processing children’s information inappropriately and unlawfully as if they were an older child or an adult. 25 This could lead to a range of data protection harms, including:

  • loss of control of personal information resulting from children under 13 creating accounts on services designed for older children, with limited awareness of the risks of how their information may be used or of their information rights;
  • psychological harms from children accessing content they are too young to view, as a result of the platform processing inaccurate personal information; and
  • financial harms from engaging with in-app purchases or subscriptions without adequate parental oversight.

Relevant standards in the Children’s code

Standard 3 of the Children’s code explains that online services should establish the age of their users with a level of certainty that is appropriate to the risks to the rights and freedoms of children arising from their data processing, or otherwise apply the protections appropriate to children to all their users. 26

We have published an opinion on age assurance (the Information Commissioner’s Opinion on Age Assurance) 27 that sets out how we expect businesses to meet the code’s age-appropriate application standard, and our expectations for age assurance data protection compliance.

Our initial findings and actions

Our review of SMPs and VSPs found that almost all platforms specified a minimum age of 13 in their Terms of Service. Many also appeared to rely on consent for at least some of their data processing activities. These platforms should have effective age assurance processes in place to ensure that they are not processing the personal information of under 13s.

While the majority of the platforms that we reviewed did use some form of age assurance at the account set-up stage, most relied on users’ self-declaration of their age. 28 As set out in the Information Commissioner’s Opinion on Age Assurance, self-declaration is unlikely to be appropriate when using children’s information in a way that raises high data processing risks, for example, where platforms carry out:

  • large scale profiling;
  • invisible processing;
  • tracking; or
  • targeting of children for marketing purposes or offering services directly to them. 29

These platforms should adopt an age assurance method that has an appropriate level of technical accuracy and operates in a way that is fair to users. Examples of alternatives to self-declaration are explained in the Information Commissioner’s Opinion on Age Assurance. 30

We also found that a small number of SMPs and VSPs in our sample do not appear to use age assurance at account set-up stage (it is possible that some subsequently use a form of profiling for age assurance purposes, but this was not clear from our review). Our initial priority in this area is to address this issue. We are writing to four platforms to clarify the lawful bases they rely on to process children’s personal information and their approach to age assurance. We will call on them to demonstrate a compelling reason for their current approach, taking into account the best interests of the child, and decide on our next steps in light of their responses.

In addition, we are considering the practices of platforms that rely on consent as their lawful basis for processing the information of users who are not logged in, particularly those under 13 years old who would require verified parental consent.

We are also inviting evidence from stakeholders on developments in the use of age assurance by SMPs and VSPs to identify children under 13 years old. This will inform our further work to support compliance with data protection law and conformance with the Children’s code. The call for evidence will be open until 11 October 2024.

We will continue to work closely with Ofcom on age assurance to help ensure an aligned approach across the data protection and online safety regimes.

Annex

The 34 SMPs and VSPs included in our review were selected primarily based on:

  • UK app download figures;
  • Ofcom research findings on the platforms that children use; and
  • whether their terms of service allow under 18s to view content and set up accounts. 31

Account creation requirements meant that we were only able to complete all our observations for 30 of these platforms. The review took place between March and May 2024, with follow-up testing on platforms undertaken as necessary.

The platforms included were as follows:

  • BeFriend (formerly Swipr)
  • BeReal
  • Bitchute
  • Clubhouse
  • Dailymotion
  • Discord
  • Facebook
  • Flickr
  • Frog chat (Novum)
  • Fruitlab
  • Hive Social
  • Hoop
  • Imgur
  • Instagram
  • Odysee
  • Pinterest
  • Reddit
  • Sendit
  • Snapchat
  • Soda
  • Threads
  • TikTok
  • Triller
  • Twitch
  • Vero
  • Viber/Rakuten
  • Vimeo
  • WeAre8
  • WeChat
  • Whisper
  • X (Twitter)
  • YouTube
  • YouTube Kids
  • Yubo

 

 1 See the annex for the platforms included in our review.

 2 The platforms included in our sample span a range of sizes and include platforms that started up more recently. Newer services may have lower awareness of the Children’s code.

3 This process was limited to initial sign-up and account set up and did not seek to recreate the exact behaviours, habits and interests of real children online.

 4 There is a range of actions we can take to promote conformance with the code and compliance with the law, from providing advice and guidance, to engagement, to opening investigations and taking enforcement action where appropriate. Going forwards, our work on children’s privacy will be increasingly focused on driving improvement via focused engagement alongside opening investigations into non-compliance where necessary.

 5 Recent research has, for example, found an increase in the proportion of girls reporting a stranger attempting to contact them, which in many cases made them feel uncomfortable. See: Teen girls see harassment as standard online.

 6 This standard is closely linked to Article 25(2) of the UK GDPR, which outlines that by default organisations should not collect any more personal data than needed for “each specific purpose of the processing” and should not make users’ personal data visible to an indefinite number of other users without the individual’s intervention.

 7 5Rights | Pathways: How digital design puts children at risk 

 8 What can be done about cyberbullying in the UK? | National Centre for Social Research 

 9 This standard is linked to Article 25(2) of the UK GDPR, which outlines that by default organisations should not collect any more personal data than needed for “each specific purpose of the processing” and should not make users’ personal data visible to an indefinite number of other users without the individual’s intervention.

 10 This is based on Article 5(1)(a) of the GDPR which says that personal data shall be “processed lawfully, fairly, and in a transparent manner in relation to the data subject.”

 11 Two of these platforms also feature as platforms where we have concerns about default privacy settings. For these two platforms we have sent one letter in relation to both default privacy and geolocation settings. 

 12 See the Competition and Market Authority’s Digital Advertising Market Study for more information on the data gathered for targeting online advertising, including data volunteered by users, inferred data and observed data; Appendix F: the role of data in digital advertising.

 13 Profiling essentially refers to any form of automated processing of personal data that analyses aspects of an individual’s personality, behaviour, interests and habits to make predictions or decisions about them. See GDPR Article 4(4).

 14 There are specific rules outlined in Article 22(1) of the UK GDPR which relate to decisions (including profiling) which are based solely on the automated processing of personal data and Recital 71 to the GDPR states that such decisions should not concern a child. The lawfulness, fairness and transparency principle at article 5(1) is also relevant, as is the requirement to only process the minimum amount of data necessary for a given task.

 15 The ICO already has a separate programme of work to ensure that websites give users fair choices over whether or not to be tracked for personalised advertising.

 16 Our work on recommender systems is informed by a range of sources. These include the Coroner’s Service Prevention of Future Deaths Report for Molly Russell, the work of civil society groups such as 5Rights and the Molly Rose Foundation, academic research, court papers in the public domain and the work of other regulators, nationally and internationally.

 17 Such as identity-affirming content or educational content. For the latter, see, for example,

 18 Social media algorithms amplify misogynistic content to teens

 19 See, for example,

 20 See footnote 14 regarding the data protection law underpinning standard 12 of the Children’s code.

 21 UK GDPR sets data rights for individuals in articles 15 – 22.

 22 This is based on a review of SMPs’ and VSPs’ privacy notices.

 23 See, for example,

 24 Article 6 of the UK GDPR explains that data processing is lawful where one of six conditions is met, one of which is that an individual has consented to the processing. Article 8 of the UK GDPR provides that, where online services are offered to a child and consent is the lawful basis, the consent of the holder of parental responsibility over the child is required. For the purposes of this particular article, this refers to children under the age of 13.

 25 The ICO has issued a £12.7 million fine to TikTok for a number of breaches of data protection law, including that personal data belonging to children under 13 was used without parental consent. This decision is currently under appeal.

 26 The data protection law which underpins the Children’s code standards relevant to the processing of data of children under the age of 13 online includes Article 8 of the UK GDPR.

 27 Age assurance for the Children’s code 

 28 Self-declaration is where a user states their age but does not provide any evidence to confirm it; as such, it can be circumvented by children, sometimes with the support of their parents (see the Ofcom and ICO age assurance report).

 29 See When do we need to do a DPIA? for a full list of processing that the ICO considers likely to result in high risk.

 30 See ‘3. Age assurance methods’ in the Information Commissioner’s Opinion on Age Assurance.

 31 We are looking at services which allow children on their platforms. Those SMPs and VSPs that provide adult-only services, but may be accessed by children, should focus on age assurance systems to prevent children accessing the service.