Age assurance encompasses a range of techniques for estimating or verifying the ages of child users, including:

  • self-declaration;
  • AI and biometric-based systems;
  • technical design measures;
  • tokenised age checking using third parties; and
  • hard identifiers like passports.

The Children’s code standard on age appropriate application outlines expectations for how online services likely to be accessed by children should use age assurance.

The sections below give examples and information on how age assurance affects children’s rights under the United Nations Convention on the Rights of the Child (UNCRC). We also offer code recommendations on how to support these rights and mitigate risks to them in this context.

Article 2: Non-discrimination

Children's rights should be protected and respected without discrimination of any kind. Children have rights to equal and effective access to the digital environment.

Risks to this right include age assurance tools that unfairly restrict certain groups of child users. Organisations can support this right by having due regard for their obligations under the Equality Act 2010, by creating clear, accessible methods for users to have inaccurate age assessments rectified, and by implementing measures to reduce bias in AI-based systems, including a route of appeal to a human reviewer. For further reference, see the ICO AI guidance.

Article 12: Respect for the views of the child

Children who are capable of forming their own views have rights to express them, in all matters that affect them.

Services can support this right by enabling children to have their views heard on an age appropriate platform. This right is at risk if a service incorrectly assesses a child’s age and, as a result, wrongly denies them access to a place where they can express their views.

Article 16: Protection of privacy

Children have a right to be protected from arbitrary or unlawful interference with their privacy.

Age assurance tools can put this right at risk if data gathered for age assurance is repurposed for other aims; for example, if the service uses the child’s date of birth to target them with birthday promotions. The use of parental controls, such as account holder confirmation for the purposes of age assurance, also poses risks to a child’s privacy. These should only be used where the child has authorised the parent to act on their behalf, where the child doesn’t have sufficient understanding to exercise the rights themselves, or where it is evident that this is in the best interests of the child. A further risk is a service gathering more data than is necessary to establish the child’s age.

Article 31: Access to leisure, play and culture

Children have a right to engage in play and recreational activities appropriate to their age. They should be free to participate in cultural life online and offline.

Automated age assurance tools can put this right at risk if they restrict a child’s access to online communities, games and services because of inaccurate data. Tools should give children and parents a route to human intervention so that incorrect automated decisions can be challenged.

Article 33: Protection from drug abuse 

Children have a right to be protected from the illicit use of drugs and age-restricted substances.

Age assurance measures can support this right by ensuring children cannot access age-restricted services. This is particularly relevant where services use children’s data to promote age-restricted products to them, or to enable them to purchase such products (for example, alcohol promotion websites).

Article 34: Protection from sexual exploitation

Children have a right to be protected from all forms of sexual exploitation and abuse. This includes coercion into unlawful sexual activity or creation of sexual content.

Age assurance measures can pose risks to this right where a lack of appropriate checks allows children to reach services on which they can unlawfully view or create sexual content.

Children’s code recommendations on age assurance:

  • Establish the age of child users with a level of certainty that is appropriate for the risks that your data processing creates. If you can’t, apply the code’s standards to all users.
  • Consider the following to estimate or verify the age of your child users:
    • age self-declaration;
    • artificial intelligence;
    • third-party age verification services;
    • account holder confirmation;
    • technical measures; or
    • hard identifiers.
  • Introduce measures to ensure accuracy, avoid bias and explain the use of AI-based age assurance.
  • Clearly tell users how their data will be processed for age assurance and how they can challenge an inaccurate assessment of their age.
  • Do not repurpose data and user profiles developed for age assurance for other purposes. Only collect the minimum amount of data needed.
  • Apply standards of the code in a way that recognises the age of your child users. For example, provide privacy information that is appropriate to the self-declared age of the user, but give them the option to access versions written for different age groups as well.
  • If you rely on consent for any aspects of your online service, you need to get parental authorisation for children under 13.
  • If you are using a third-party age assurance provider, obtain assurances and carry out due diligence on the third parties you share data with, to ensure they use data in ways that are in children’s best interests.
  • Ensure your use of age assurance does not discriminate against users and has due regard to the Equality Act 2010.