
What does the ICO mean by ‘best interests of the child’?

The concept of the best interests of the child comes from Article 3 of the United Nations Convention on the Rights of the Child (UNCRC) which says that:

“In all actions concerning children, whether undertaken by public or private social welfare institutions, courts of law, administrative authorities or legislative bodies, the best interests of the child shall be a primary consideration.”

This is important because it provides a framework to help you understand the needs of children and the rights that you should consider when designing online services.

Article 5(1)(a) of the GDPR says personal data shall be:

“processed lawfully, fairly and in a transparent manner in relation to the data subject (‘lawfulness, fairness and transparency’)”

And recital 38 to the GDPR says:

“Children merit specific protection with regard to their personal data, as they may be less aware of the risks, consequences and safeguards concerned and their rights in relation to the processing…”.

How can we make sure that we meet the ‘best interests of the child’ standard?

To implement this standard, you need to consider the needs of child users and work out how best to support those needs in the design of your online service and in your processing of their personal data. In doing this, you should take the age of the user into account.

You should consider how, in your use of personal data, you can:

  • keep them safe from exploitation risks, including the risks of commercial or sexual exploitation and sexual abuse;
  • protect and support their health and wellbeing;
  • protect and support their physical, psychological and emotional development;
  • protect and support their need to develop their own views and identity;
  • protect and support their right to freedom of association and play;
  • support the needs of children with disabilities in line with your obligations under the relevant equality legislation for England, Scotland, Wales and Northern Ireland;
  • recognise the role of parents in protecting and promoting the best interests of the child and support them in this task; and
  • recognise the evolving capacity of the child to form their own view, and give due weight to that view.

Is age-gating a requirement of the code?

No. We do not want to see an age-gated internet, where visiting any digital service requires people to prove how old they are. What we want to see is a fundamental shift, where the internet and online services take a child-centred approach, recognising the needs of their users and building in appropriate privacy protection rather than bolting it on.

You need to think about the risks your data processing poses to children and either establish the age of your users with a level of certainty appropriate to those risks, or provide a high privacy service to all users by default. It isn’t a ‘one size fits all’ approach and establishing age with an appropriate level of certainty won’t always mean ‘proving’ age.

How will my company know how old our users are?

If you decide you would rather establish age than provide a high privacy service to all users by default, then you will need to decide how to go about establishing age. How far you have to go in this respect will depend on what you are doing with children’s data and what the impact might be.

The code suggests the following methods for you to consider, depending on the risks associated with your data processing (higher risks will require a greater level of assurance); a sketch of this risk-based selection follows the list:

  • Self-declaration.
  • Artificial intelligence.
  • Third-party age verification services.
  • Account holder confirmation.
  • Technical measures.
  • Hard identifiers.
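
To illustrate how the level of assurance might scale with risk, here is a minimal TypeScript sketch. The risk tiers and the mapping of methods to tiers are purely illustrative assumptions for the purpose of the example; in practice your DPIA should drive the assessment.

    // Illustrative sketch only: the risk tiers and the mapping below are
    // assumptions for demonstration, not categories prescribed by the code.

    type AgeAssuranceMethod =
      | "self-declaration"
      | "artificial-intelligence"
      | "third-party-age-verification"
      | "account-holder-confirmation"
      | "technical-measures"
      | "hard-identifiers";

    type ProcessingRisk = "low" | "medium" | "high";

    // Higher-risk processing demands a greater level of assurance. In a
    // real service the risk rating would come from your DPIA, not a constant.
    function acceptableMethods(risk: ProcessingRisk): AgeAssuranceMethod[] {
      switch (risk) {
        case "low":
          return ["self-declaration", "technical-measures"];
        case "medium":
          return ["account-holder-confirmation", "artificial-intelligence", "technical-measures"];
        case "high":
          return ["third-party-age-verification", "hard-identifiers"];
      }
    }

    console.log(acceptableMethods("high"));
    // -> ["third-party-age-verification", "hard-identifiers"]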

But isn’t age assurance incompatible with the data minimisation principle?

No. The data minimisation principle doesn’t stop you collecting personal data if you need to do so to establish the age of your users with a level of certainty appropriate to the risks arising from your processing.

The GDPR requires you to collect only the data you actually need to carry out age assurance (data minimisation), to tell users that you are using their data for this purpose, and not to ‘re-purpose’ information collected for age assurance by using it for something else.

You won’t need user consent to collect personal data for the purposes of age assurance carried out in order to conform to the code (legitimate interests is likely to be the most appropriate lawful basis for this processing). Nor will you need consent for any cookies needed to facilitate such processing, as these will be considered ‘essential’ cookies.

If parents help their children to circumvent age assurance measures, will my company be penalised?

It’s unlikely. We appreciate that no age assurance technique is 100% reliable and that sometimes parents will help their children to circumvent any measures that are in place. What we will look at is whether the age assurance measures in place were sufficient, given the risks inherent in the data processing. In other words, whether you, as an information society service (ISS) provider, are doing enough to establish the age of your users and ensure the fair processing of children’s personal data.

Does the code mean that I’ll need to have lots of different versions of privacy policies and other information on my website?

Not necessarily. There may be some scenarios in which providing a single, simplified set of information that is accessible to all will work. The privacy information you provide to users, and other published terms, policies and community standards, must be concise, prominent and in clear language suited to the age of the child.

In many cases a one-size-fits-all approach does not recognise that children have different needs at different stages of their development. For example, a pre-literate or primary school child might need to be actively deterred from changing privacy settings without parental input, whereas a teenager might be better supported by clear and neutral information which helps them make their own informed decision.
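
As a concrete illustration, the sketch below selects a different notice variant by age band. The bands and the wording are hypothetical assumptions for the example, not bands set by the code.

    // Hypothetical age bands and wording, for illustration only.
    interface PrivacyNotice {
      ageBand: string;
      text: string;
    }

    function noticeForAge(age: number): PrivacyNotice {
      if (age < 6) {
        // Pre-literate children: simple cues plus parental involvement.
        return { ageBand: "0-5", text: "Ask a grown-up before you change anything." };
      }
      if (age < 13) {
        return { ageBand: "6-12", text: "We use your name and scores to run the game. Ask a parent before changing your settings." };
      }
      // Teenagers: clear, neutral information supporting their own decisions.
      return { ageBand: "13-17", text: "We process your profile data to provide the service. You can review each setting below." };
    }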

Does the standard on ‘detrimental use of data’ mean the ICO is now going to police what content is recommended to young users of social media platforms?

This standard says that organisations can’t use personal data in ways that have been shown to be detrimental to children, or in ways that go against industry codes of practice. You will therefore need to keep up to date with relevant advice and recommendations on children’s welfare in the digital context. These are likely to include marketing, broadcasting and gaming codes and regulations.

Personal data often drives the content that children see and the ICO is responsible for regulating the use of personal data.

If you are using children’s personal data to automatically recommend content to them based on their past usage or browsing history, then you have a responsibility for the recommendations you make. This applies even if the content itself is user generated. In data protection terms, you have a greater responsibility in this situation than if the child were to proactively search out such content themselves, because it is your processing of the personal data that serves the content to the child.

Data protection law doesn’t make you responsible for third-party content, but it does make you responsible for the content you serve to children who use your service, based on your use of their personal data. This use of personal data is what the ICO will regulate.

This code is just part of the solution. We will continue to work with others in the UK and around the world to ensure that our code complements other measures being developed to address online harms.

Does this mean we can’t use features such as rewards, notifications and ‘likes’ within our service?

No. Not all such features rely on the use of personal data. You may have designed your feature to take into account the needs of children and in a way that makes it easy for them to disengage without feeling pressurised or disadvantaged if they do so. However, it does mean that you need to carefully consider the impact on children if you use their personal data to support such features. You should consider both intended and unintended consequences of the data use as part of your DPIA.

Designing in data-driven features which make it difficult for children to disengage from your service is likely to breach the Article 5(1)(a) fairness principle of the GDPR. Examples include features which use personal data to exploit human susceptibility to reward, anticipatory and pleasure-seeking behaviours, or peer pressure.

You should (see the sketch after this list):

  • avoid using personal data in a way that incentivises children to stay engaged, such as offering children personalised in-game advantages (based upon your use of the individual user’s personal data) in return for extended play;
  • present options to continue playing or otherwise engaging with your service neutrally without suggesting that children will lose out if they don’t;
  • avoid features which use personal data to automatically extend use instead of requiring children to make an active choice about whether they want to spend their time in this way (data-driven autoplay features); and
  • introduce mechanisms such as pause buttons which allow children to take a break at any time without losing their progress in a game, or provide age appropriate content to support conscious choices about taking breaks. 
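
Here is a minimal sketch of the last two points, assuming a hypothetical game session model: continuing is an active, neutrally worded choice, and pausing never costs the child their progress. The names are illustrative, not a prescribed design.

    // Illustrative sketch with hypothetical names; not a prescribed design.
    interface GameSession {
      progress: number;  // whatever the service needs in order to resume
      paused: boolean;
    }

    // An active, neutral choice: no personalised inducement to continue,
    // and no suggestion that the child loses out by stopping.
    function continuePrompt(): string {
      return "Do you want to keep playing? [Continue] [Stop]";
    }

    // Pausing preserves progress, so taking a break costs the child nothing.
    function pause(session: GameSession): GameSession {
      return { ...session, paused: true };
    }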

The code requires operators to apply ‘privacy by default’ and ‘best interests of the child’ concepts to all users under 18. Up until now, operators thought about kids as under 13 (in the UK), or under 16 (in other EU countries), as per Article 8 of the GDPR. What does this seemingly new requirement mean for companies engaging with teens online?

It is a common misconception that Article 8 of the GDPR defines children as being those under the age of 13 (UK) or 16 (elsewhere in the EU). This is not, and never has been, the case. Article 8 of the GDPR just sets the age at which ‘a child’ can provide their own consent to the processing of their personal data. Just because a child is old enough to provide their own consent, doesn’t mean they are no longer a child.

Our definition of a child being an individual under 18 comes from the United Nations Convention on the Rights of the Child. We had to have regard to the UK’s obligations under the UNCRC in drafting the code. It was also made clear in the parliamentary debates to the Data Protection Bill that, for the purposes of the code, children should include anyone under the age of 18. 

Organisations will need to ensure that they are properly considering the specific needs and vulnerabilities of children, including those aged 13 to 17, if they are using their services.

What do we need to do to meet the ‘privacy by default’ standard?

Your default position for each individual privacy setting should be privacy-enhancing or ‘high privacy’.

This will mean that children’s personal data is only visible or accessible to other users of the service to the extent that the child amends their settings to allow this.

This will also mean that, unless the setting is changed, your own use of the children’s personal data is limited to that which is essential to the provision of the service.

If a user does change their settings, you should generally give them the option to do so permanently or to return to the high privacy defaults when they end the current session. You should not ‘nudge’ them towards taking a lower privacy option.
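
A minimal sketch of what this could look like in code, assuming hypothetical setting names and a simple session model; it is illustrative only, not a prescribed implementation.

    // Hypothetical setting names, for illustration only.
    interface PrivacySettings {
      profileVisibleToOthers: boolean;
      personalisedContent: boolean;
      dataSharedWithThirdParties: boolean;
    }

    // High privacy by default: nothing visible and nothing non-essential
    // unless the child actively changes a setting.
    const HIGH_PRIVACY_DEFAULTS: PrivacySettings = {
      profileVisibleToOthers: false,
      personalisedContent: false,
      dataSharedWithThirdParties: false,
    };

    // At the end of a session, only the changes the user chose to keep
    // permanently survive; everything else reverts to high privacy.
    function endSession(permanentChanges: Partial<PrivacySettings>): PrivacySettings {
      return { ...HIGH_PRIVACY_DEFAULTS, ...permanentChanges };
    }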

My app relies on geolocation to provide its service. Will the code require me to turn it off?

If you need to process geolocation data in order to provide your core service, it is not appropriate to have a privacy setting, though you should still offer children control over whether and how their personal data is used whenever you can. Any geolocation services that go over and above your core service should, however, be subject to a privacy setting. For example, enhanced mapping services that make recommendations for places to visit based on location would need a privacy setting.

What changes relating to geolocation are included in the code?

There are no changes to underlying GDPR requirements. However, the code makes it clear what the ICO will expect you to do in order to process children’s geolocation data fairly, and therefore comply with these underlying GDPR requirements. You should make sure that geolocation options are off by default (unless you can demonstrate that it is needed for the core purposes of your service).

You should make it obvious to the child that their location is being tracked.

You should revert settings which make the child’s location visible to others to ‘off’ after each use.
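
Putting those three expectations together, here is a minimal sketch, again with hypothetical names: geolocation is off by default, tracking is made obvious, and visibility to others reverts to off after each use.

    // Hypothetical names; illustrative only.
    interface GeoSettings {
      locationEnabled: boolean;         // off by default unless core to the service
      locationVisibleToOthers: boolean; // reverts to off after each use
    }

    const GEO_DEFAULTS: GeoSettings = {
      locationEnabled: false,
      locationVisibleToOthers: false,
    };

    // An obvious, persistent indicator whenever location is being tracked.
    function trackingIndicator(s: GeoSettings): string | null {
      return s.locationEnabled ? "Your location is switched on" : null;
    }

    // Called at the end of each use: visibility to others reverts to off.
    function endUse(s: GeoSettings): GeoSettings {
      return { ...s, locationVisibleToOthers: false };
    }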

How is the code relevant to the toy industry? What do toy producers need to do to be ready?

The code applies to all businesses that provide online services and products that are either aimed directly at children or likely to be accessed by children in the UK, including teens up to 18.

It is also relevant to providers of ‘connected toys’ (toys which are supported by functionality provided through an internet connection). Connected toy producers should start by carrying out a DPIA, or updating an existing one, to assess the potential risks to children from processing their personal data. Particular areas of focus might include avoiding passive collection of personal data (where the child is unaware that the toy is ‘on’ and data is being collected) and communicating clearly with children without a screen-based interface.