The ICO exists to empower you through information.

At a glance

  • The Children’s code applies to “information society services (ISS) likely to be accessed by children”.
  • You should decide whether children are likely to access your service, even if you run an adult-only service.
  • The code applies to services that are intended for use by children, and to services that are not aimed at children, but are accessed by a “significant number of children”.
  • If your service is not aimed at children, this guidance will help you assess whether children are likely to access your service in reality.
  • A “significant number of children” means that the number of children accessing or likely to access your service is material, so you should apply the code.
  • The non-exhaustive list of factors could help you decide whether children are likely to access your service.
  • If a significant number of children are likely to access your service, you should conform with the standards of the code in a risk-based and proportionate way. If it would not be appropriate for children to access your service, your focus should be on preventing access.
  • Considering the non-exhaustive list of factors set out in this guidance when you make your assessment will help you comply with the accountability requirement.

Checklist

□ We have considered the list of factors to assess whether children are likely to access our service.

□ We have recorded our decision-making about whether children are likely to access our service, and will keep this decision under review.

□ Where children are likely to access our service, we either comply with the standards of the code, or apply age assurance measures appropriate to the data processing risks on our service.

□ We consider whether children are likely to access our service as part of the risk assessment when we conduct our DPIA.

Likely to be accessed by children - FAQs

As a provider of an Information Society Service (ISS), you should decide whether all or part of your service falls within scope of our Children’s code. The code applies to ISSs that are “likely to be accessed by children” in the UK.

This means that the code applies to services that are:

  • intended for use by children; and
  • not specifically aimed or targeted at children, but are nonetheless likely to be used by under 18s.

Therefore, even if you state in your terms of service that under 18s should not access your service, you still fall within scope of the code if children access your service in reality.

If you run an adult-only service, under 18s may still find your service appealing and access it, even if you are not directly targeting them.

The aim of this guidance is to help you assess whether children are accessing your service in reality and, if so, how to apply appropriate safeguards to protect them in a proportionate way. It is not an extension of the code to all services that children could theoretically access.

If you need to check whether your service is an ISS, please read our guidance on what services are covered by the code for further information.

What does the code say about when services are likely to be accessed by children?

Our code says:

“If the nature, content or presentation of your service makes you think that children will want to use it, then you should conform to the standards in this code. If you have an existing service and children form a substantive and identifiable user group, the ‘likely to be accessed by’ definition will apply. Given the breadth of application, the ICO recognises that it will be possible to conform to this code in a risk-based and proportionate manner.

If you decide that your service is not likely to be accessed by children and that you are therefore not going to implement the code then you should document and support your reasons for your decision. You may wish to refer to market research, current evidence on user behaviour, the user base of similar or existing services and service types and testing of access restriction measures.

If you initially judge that the service is not likely to be accessed by children, but evidence later emerges that a significant number of children are in fact accessing your service, you will need to conform to the standards in this code or review your access restrictions if you do not think it is appropriate for children to use your service.”

What is an adult-only service?

An adult-only service is any ISS that is targeted at those over the age of 18.

We are not sure whether children access our service. How do we decide if children are likely to access it?

You should assess whether there is evidence that the existing users of your service include children.

Our code refers to children forming a “substantive and identifiable user group”. You should therefore consider whether it is reasonable to conclude that under 18s form a material group of people that use the service (you do not need to find out the actual identity of people who are likely to be under 18). If so, then the code applies.

Our code refers to a “significant number of children”. This means that the number of children accessing or likely to access your service is material, so you should apply the code. “Significant” in this context does not mean that a large number of children must be using the service or that children form a substantial proportion of your users. It means that there are more than a de minimis or insignificant number of children using the service. Whether this low threshold is met depends on a variety of factors relating to the type of service, how it has been designed, and the personal data processing risks that it presents to children.

You therefore need to assess whether children are likely to access your service, and if so to take appropriate steps. We have set out below a non-exhaustive list of factors that you could take into account when carrying out your assessment of whether children are likely to access your service. You should consider the different features of your service, and whether children are likely to access all or parts of it. The list will help you make an informed decision by suggesting the evidence you could consider. How you approach the list will depend on your specific circumstances, but you must be able to justify your decision-making. You should conduct assessments about whether children are likely to access your service in a risk-based and proportionate way.

Examples of factors to consider, with notes and any limitations

Actual evidence or information you have

Whether children can access your service.
Do you have systems or processes in place to prevent children from accessing your service, for example robust age assurance measures? Self-declaration will not be sufficient to demonstrate that children cannot access your service.

The number of child users of your service, and the proportion of total UK users or total UK children that this represents.
The number of UK child users may be considered significant in absolute terms, or in relation to the proportion it represents of total UK users of the service or of the number of children in the UK. You should use current UK population information to assess the latter. Sources of evidence may include any age information you have available, such as information gathered from any age profiling tools you may be using.

Any research evidence available, such as:

  • your own research about your users; or
  • any existing evidence of user behaviour.

Existing evidence of user behaviour may include:

  • peaks in access, for example after school and during the school holidays;
  • internal analytics;
  • patterns of communication or digital interaction;
  • business intelligence and market research, including information about a user or groups of users that you use to estimate or infer their age or age range, or as a proxy for age.

Information on advertising targeted at children.
This includes whether advertisements on your service, including third party advertisements, are directed at or are likely to appeal to children. You may have information, including some provided to or by advertisers, such as the number of clicks on ads, that shows an interest in child-focused advertising.

Information on complaints received about children accessing or using your service.
This includes complaints you receive from parents, children or third parties about the age of people accessing your service.

Other evidence to consider

The types of content, design features and activities which are appealing to children.
Consider the subject matter or nature of the content on your service, including any information that estimates, identifies or classifies the content as likely to appeal to children. This includes whether children are the intended audience, or likely to be part of the intended audience, for the content. For example: cartoons, animation, music or audio content, incentives for children’s participation, digital functionalities such as gamification, or the presence of children, influencers or celebrities popular with children.

Any other research evidence, such as:

  • academic, independent and market research;
  • research relating to similar providers of ISS; or
  • news stories and information in the media.

This includes research you may have commissioned yourselves, as well as publicly available research. You should determine the reliability of the research and justify your decision-making.

Whether children are known to like and access similar services.
Evidence of children accessing services with similar content.

Your operating or business model.
Information about your revenue streams and sources of turnover, as well as other information captured in management accounts or annual reports, that suggests that children are an audience for your service and that the service is likely to be used by a significant number of children.

How you market, describe and promote your service.
For example, is any advertising targeted at children? Are there toys or other products associated with your service targeted at children?

Why has the ICO not set a numerical threshold for assessing what a ‘significant number’ of children is?

Numerical thresholds can be inflexible and do not take into account that all services are different. What is ‘significant’ may vary based on the ISS’s specific circumstances including:

  • the number of people using the service;
  • the number of users who are likely to be children; and
  • the data processing risks the service poses to children.

The non-exhaustive list of factors will help you to make an assessment in the context of your own processing.

We operate an existing adult-only service. Should we still carry out an assessment of whether children are likely to access it? 

The code applies to all ISS that are likely to be accessed by children, regardless of whether they are new or existing services.

You should preferably assess this at the outset when designing your service. It will help you decide what age assurance measure(s) to implement or whether you should apply the standards of the code to all users in a risk-based and proportionate way. The approach you take will depend on the data processing risks that your service presents to younger users. You could consider the different functionalities of your service, and their individual data processing risks. This will enable you to apply a different approach to areas of the service which present the greatest data processing risks, if that is more suitable for your service.

However, you should also assess existing services, and you should keep your assessment under review so that you are aware of any changes to the age demographic of your user base. You should conform to the standards of the code in a risk-based and proportionate way, or review your access restrictions if it later emerges that a significant number of children are in fact accessing your service and it would not be appropriate for them to do so.

For further information about age assurance, please read our Commissioner’s Opinion on Age Assurance for the Children’s Code.

Our website has an age-gating page that undertakes age assurance and only allows those over the age of 18 to enter. If a child only accesses our age-gating page, are we within scope of the code?

If you have an age-gating page to prevent access by under 18s, it will not be within scope of the code if:

  • you use it to ensure that children are not accessing your adult site;
  • the measures are robust and effective and therefore prevent under 18s accessing the service; and
  • it is not an extension of your adult site (e.g. the age-gating page doesn’t allow access to parts of your adult site before age assurance occurs).

You must ensure that your age-gating page is compliant with data protection legislation. In particular, you must process this personal data transparently and fairly.

It is unlikely that a self-declaration age assurance method is an effective way to fully restrict access to underage users.

For further information about Age Assurance, please read our Commissioner’s Opinion on Age Assurance for the Children’s Code.

How do we demonstrate our decision-making about whether children are likely to access our service?

Carrying out an assessment using the list of factors set out above will help you meet your accountability requirements. Accountability obligations require you to take “into account the nature, scope, context and purposes of processing as well as the risks of varying likelihood and severity for the rights and freedoms of natural persons”.

The list of factors will help you to demonstrate how you decided whether children are accessing your service, and therefore any measures you need to take.

You must carry out a data protection impact assessment (DPIA) if you are offering online services to children. We have developed guidance to assist ISSs in carrying out a DPIA, as well as tools for completing one. In carrying out a DPIA, you must take into account factors such as:

  • the types of information you collect;
  • the volume of information;
  • the intrusiveness of any profiling;
  • whether decision-making or other actions follow from profiling; and
  • whether you are sharing the information with third parties.

You should assess whether children are likely to access your service in a way that is proportionate to the risk that your service presents to children. If you do not adequately assess whether children are in fact likely to access your service, you risk non-compliance with your accountability requirements. The non-exhaustive list of factors suggests evidence to support you in this decision-making.

If you decide that children are not likely to access your service, you should record this decision and keep it under review. You must be able to justify your decision-making. Demonstrating the steps you have taken to reach your decision is an important part of your accountability requirements.

The case studies below give examples from a range of ISS whose services are not aimed at children, using the non-exhaustive list of factors to decide whether children are likely to access their service in reality.

We run an adult-only service and having considered the list of factors, we have assessed that children are likely to access our service. What should we do now?

You have two options. You should either:

  • apply the principles of the code to all users in a risk-based and proportionate way; or
  • apply appropriate age assurance measures to restrict access by under 18s so that they are no longer likely to access the service, if it would not be appropriate for them to do so. If you do this, the code will not apply to you.

Read our Age appropriate design: a code of practice for online services for more information. For further information about Age Assurance, please read our Commissioner’s Opinion on Age Assurance for the Children’s Code.

Would you take regulatory action against me if I have interpreted the list of factors differently to the ICO when assessing whether children are likely to access my service?

There are a number of considerations when we decide whether to take regulatory action, and we have set out how we enforce the code. We take into account the efforts made to conform to the provisions of the code when deciding what regulatory action is appropriate. We consider whether an organisation has applied our guidance and can evidence how they have interpreted it for their organisation and processing activities. This regulatory approach also applies to an organisation’s assessment of whether children are likely to access their service. The list of factors is suggested and non-exhaustive; it is there to help you assess whether children are likely to access your service.

Read our guidance on the Enforcement of this code and our Regulatory Action Policy for further information.

Further Reading

You should conduct assessments about whether children are likely to access your service in a risk-based and proportionate way. We have resources to help you assess the data processing risks that your site presents to children, for example:

Case studies

The case studies below are hypothetical examples of Information Society Services (ISS) not aimed at children assessing whether children access their service in reality. The figures shown are examples only and do not provide a benchmark for when ISS providers are within scope of the Children’s code. Because all services are different, you must make your own assessment based on the context and specific circumstances of your processing.

 

Online dating

Case study 1

An online dating site indicates in its terms of service that people must be 18 or over to use the site and asks them to declare this during registration. However, the provider is considering whether the service is likely to be accessed by children in reality. They will determine this by considering the ICO’s non-exhaustive list of factors and reviewing all available evidence and information they have on whether children use their site or similar ones. They decide that they need to consider carefully whether children are accessing their site, because it was not designed with children in mind and it presents inherent data processing risks to children.

Firstly, they consider whether the content on their site would be appealing to children. The service is most popular with young adults; research and media reports indicate that this is part of what makes a dating site appealing to children. They are also aware of news reports of underage people who have managed to access similar sites.

Next, they consider user numbers, which are around two million active UK users per month. They estimate the number of underage people within this figure using, amongst other internal sources, statistics for people who have had their accounts deleted for being underage and the number of children they prevent from joining at the registration stage. In addition, market research identifies over 50,000 active users per month who are aged 15 to 17.

Having considered multiple factors, they conclude that a significant number of children are likely to be accessing the service. They record this as part of their decision-making to meet UK GDPR accountability requirements, and as part of the risk assessment in their data protection impact assessment (DPIA). They decide that in this case they will concentrate on ensuring that children are prevented from accessing their service, rather than seeking to apply the standards of the code. This is because the potentially harmful nature of the personal data processing means it cannot conform to the code’s standards for underage users. The online dating site therefore applies robust age assurance methods that they have assessed are effective at preventing children from accessing the service.

Case study 2

The provider of an online dating site that markets itself to a mature audience is considering whether the site is likely to be accessed by children. The dating site will determine this by considering the ICO’s non-exhaustive list of factors and reviewing available evidence and information they have on whether children use their site or similar ones.

The service is aimed at people over 50 years old and has an overall user base of approximately 200,000 UK active monthly users. The provider reviews their user figures, including statistics for those who have had their accounts deleted for being under 18. These indicate that they have a very low number of underage visitors to the site, likely totalling under 20 per month. They immediately remove these underage users from the site and delete their accounts.

In addition, research and media reports about children accessing dating sites show that children are most attracted to dating sites with a young adult user base of those aged 18-25. Where advertising is used on the site, it is for services such as life insurance and pension providers that are targeted at an older demographic. Their design features are also aimed at a mature adult audience. The provider concludes that it is unlikely that their service will appeal to children.

They record their decision that it is unlikely that children form a significant user group on their dating site, and they do not fall within scope of the code in this case. However, they will keep this under review.

Pornography

Case study 1

A pornography site states in its terms of service that people must be 18 or over to use the site; it asks that people under this age do not access it. However, the provider does not currently have any measures in place to prevent underage people from accessing the service. As the site is adult in nature, they had thought that they were out of scope of the code. They are now re-evaluating this decision and considering whether it is likely that children will access their site in reality.

The ISS reviews the processing that occurs on the site. If children access the service, their personal data may be processed through third party cookie data sharing and profiling that is not off by default, which is not in the best interests of the child. When conducting their DPIA, they determine that their data processing is likely to result in a high risk to the rights and freedoms of children if they access the site.

They therefore consider the ICO’s non-exhaustive list of factors thoroughly. They are aware that they are one of the most popular pornography sites in the UK, with over 12 million monthly users. Civil society research and news outlets indicate that pornography is often accessed either intentionally or inadvertently by children. They also do not have any measures which would effectively restrict children from accessing their service. They conclude that it is likely that a significant number of children access the site.

Having conducted this exercise, the provider concludes that the service falls within scope of the code if they do not prevent children accessing their site. Due to the potentially harmful nature of the data processing, which does not conform to the code standards for underage users, the site will implement an age-gating page. The age-gating page will warn of adult content and apply robust age assurance measures to prevent underage people from accessing the site. They are aware that research indicates that a significant percentage of young people lie when using self-declaration-based age assurance systems, and they therefore opt for more robust measures.

They will also ensure the age-gating page complies with data protection and e-privacy legislation.

Case study 2

An online pornography video sharing service has a minimum age requirement of 18. They are complying with their UK GDPR accountability requirements by considering whether they fall within scope of the code.

They are aware that the service is most popular with a young adult audience, which research shows is likely to make the service appealing to children. Amongst a range of measures, they have implemented robust age assurance methods through several third-party technological solutions. This is because there would be data processing risks to children if they accessed the site.

As a result of the age assurance measures they deploy, they are satisfied that they do not fall within scope of the code. Although their site may appeal to underage users, they have implemented robust measures that mean it is unlikely children will be able to access their service. Market surveys suggest that 8,000 children access the age-gating page, but are unable to access the site any further. They also confirm that they are complying with their data protection obligations, in particular transparency and cookie consents. They record this as their decision-making for being out of scope of the code and will keep this decision under review.

Games

Case study 1

A video game publisher is considering whether they are within scope of the code. Their terms of service state that the video game should not be accessed by those under the age of 18. The game uses self-declaration to prevent underage users from accessing their service. However, as part of their compliance with the accountability principle and their DPIA risk assessment, they assess whether it is likely that children will access the game in reality. To do this, they review the ICO’s non-exhaustive list of factors and available evidence and information they have on whether children access their service or similar services.

They are aware that the game is one of the top played games in the UK. Although the terms of service state that users must be 18 or over, the game includes design features which would appeal to children, such as cartoon animations. They assess that children are playing similar games, and they are aware that clips of teenagers playing the game are popular on video streaming sites. An analysis of their internal business information indicates that a significant number of children are accessing the game. For example, they have received a large number of complaints from parents about children making in-app purchases. They conclude that children are likely to access their service. They record this as part of their decision-making to meet accountability requirements and as part of the risk assessment in their DPIA.

To conform with the code, and to mitigate risks associated with personal data processing, they need to take action. They should:

  • improve the age assurance systems at the point of account entry;
  • improve their design to move the higher risk processing behind a logged-in age assured gate; or
  • apply the principles of the code to all users.

The publisher decides to restrict high risk personal data processing from users who have not been effectively age assured. They redesign functions of the game which children are not prevented from accessing to ensure that the data processing meets the standards of the code. They will monitor whether these changes result in a reduction in the number of complaints they receive, and keep this decision under review.

Case study 2

An online game provider is considering whether children are likely to access their service, and if so whether they need to make any changes to their data processing. They are aware that the game is most popular with those aged between 25 and 40.

They consider the risks of the service and determine that their processing activities would not present high risks to any child users who access the game. For example, they do not use nudge techniques, and their geolocation tracking is already compliant with the code.

Whilst news stories have indicated that the game is being played by children in secondary schools, the provider determines that they do not need to take any further action. This is because they are already compliant with the code in a way that is appropriate to the data processing risks that their service presents. There is no requirement to design the service for primary school children, as they are not likely to access it, and the provider’s privacy notices are already accessible for teenagers. They therefore decide not to offer transparency information for younger audiences, as there is no indication that primary school children are accessing the service. They record this decision, including the evidence it is based on, and will keep it under review should their processing activities change.

Social media

Case study 1

A social media service with over 300,000 monthly UK users indicates in its terms of service that users must be 18 or over to access their site. The service includes both a logged-in and a non-logged-in environment. Content can be viewed by anyone on the non-logged-in environment and users select their age to enter the logged-in environment. Self-declaration figures suggest that there are no under 18s accessing the logged-in service. However, the provider is considering whether it is likely that children access their service in reality. They do this by considering the ICO’s non-exhaustive list of factors and reviewing all available evidence and information they have on whether children use the site or similar ones.

They review their personal data processing and conclude that they would not be complying with the code if children are accessing the service. Furthermore, their data processing activities are likely to result in a high risk to the rights and freedoms of children.

They consider whether the content on the service would appeal to children. The service includes cartoon videos, images, bright colours, emojis and the live streaming of users playing video games known to be popular with children. Civil society and academic research suggest that this kind of content appeals to children.

The site deploys artificial intelligence to detect children who may have inappropriately accessed the service. These statistics show that 1-3% of logged-in users are flagged as being underage. The provider therefore concludes that their service is likely to be accessed by a significant number of children.

They decide to change their business model to make certain aspects of their service suitable for users aged 13 or over. They therefore apply the standards of the code to all users in these parts of the service. The functions of their service which pose high data processing risks to children are moved behind an age assurance page. This robustly assesses whether the user is 18 or over and restricts access to those aged under 18. They record this as a risk mitigation in their DPIA.

SME and hobby sites   

Case study 1

A small business which provides services for parents to help them return to work after a career break is assessing whether their service is likely to be accessed by children. They consider the list of factors and cannot find any evidence to suggest that their service would be appealing to children. This is because the content, design features and activities on the site are not appealing to children. News reports show no evidence of children trying to access similar services, and they have no business information which indicates that children are trying to access their service. They therefore document their decision that they do not need to take any action at present. They will keep this decision under review and revisit it if any of their processing activities change.

Case study 2

A sewing site which offers advice and design services, and is not aimed at any particular age group, considers whether the site is likely to be accessed by children. The provider considers the non-exhaustive list of factors and decides that, whilst the site is most popular with adults, it may also be accessed by children, for example children who are studying sewing at school. They determine that their processing does not present a high risk to children should they access the service. This is because they already apply the standards of the code in a risk-based and proportionate way, such as setting privacy settings to high by default. They decide that they do not need to take any further action, and they refer to the ICO guidance on the code for small businesses to ensure that their data processing remains suitable for children. They will keep this decision under review and revisit it if any of their processing activities change.