This section explains how some data processing activities commonly practised by online services can impact children’s rights:
- Age assurance
- Data sharing between users
- Data sharing with a third party organisation
- Geolocation tracking
- Online complaint and request tools
- Connected toys and devices
- Parental controls
- Profiling for service personalisation
- Profiling for automated decision-making
- Privacy information, policies and community standards
- Privacy and data use settings
Age assurance
What is it?
Age assurance encompasses a range of techniques for estimating or verifying the ages of child and adult users, including:
- self-declaration;
- AI and biometric-based systems;
- technical design measures;
- tokenised age checking using third parties; and
- hard identifiers like passports.
The Children’s code Age appropriate application standard and the Commissioner’s Opinion on the use of age assurance outline expectations for online services likely to be accessed by children that use age assurance.
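To make the tokenised approach concrete, the sketch below shows a service verifying a signed token from a third-party age assurance provider. Everything here is an illustrative assumption rather than a real provider API: the token format, the over_13 claim and the shared-secret HMAC scheme. The point is data minimisation, as discussed under Article 16 below: the service learns only whether the user meets the threshold, never their date of birth.

```python
import base64
import binascii
import hashlib
import hmac
import json

# Hypothetical shared secret agreed with the age assurance provider. In
# practice keys would come from secure key management, and the token format
# would follow an established standard (for example a signed JWT).
PROVIDER_SECRET = b"example-shared-secret"

def verify_age_token(token: str) -> bool:
    """Verify the provider's signature and return only an over/under result.

    The token carries a claim like {"over_13": true} - no date of birth -
    so the service learns nothing beyond the threshold check.
    """
    try:
        payload_b64, signature_hex = token.rsplit(".", 1)
        payload = base64.urlsafe_b64decode(payload_b64)
    except (ValueError, binascii.Error):
        return False
    expected = hmac.new(PROVIDER_SECRET, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, signature_hex):
        return False  # not issued by the trusted provider
    return bool(json.loads(payload).get("over_13", False))

# Example token as the hypothetical provider might issue it:
claims = json.dumps({"over_13": True}).encode()
payload_b64 = base64.urlsafe_b64encode(claims).decode()
signature = hmac.new(PROVIDER_SECRET, claims, hashlib.sha256).hexdigest()
print(verify_age_token(payload_b64 + "." + signature))  # True
```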
How could it impact children’s rights?
Article 2: Non-discrimination: Risks to this right include developing age assurance tools that unfairly restrict certain groups of child users. Organisations can support this right by ensuring due regard for their obligations under the Equality Act 2010, by creating clear, accessible methods for users to have inaccurate age assessments rectified, and by implementing measures to reduce bias in AI alongside routes for human appeal. For further reference, see the ICO AI guidance.
Article 16: Protection of privacy: Age assurance tools can risk this right if data gathered for age assurance is re-purposed for other aims. An example of this is if the service uses the child’s date of birth to target them with birthday promotions. Another risk is if the service gathers more data than is necessary to understand the child’s age.
Article 31: Access to leisure, play and culture: Automated age assurance tools can risk this right if they restrict a child’s access to online communities, games and services due to inaccurate data. Tools should offer children and parents a route to human intervention to avoid this.
Article 33: Protection from drug abuse: Age assurance measures can support this right by ensuring children cannot access age-restricted services. This is particularly relevant where services use children’s data to promote age-restricted products, or allow children to purchase them (for example alcohol promotion websites).
Article 34: Protection from sexual exploitation: Age assurance measures can pose risks to this right where a lack of appropriate measures allows children to access services where they can unlawfully access or create sexual content.
Data sharing between users
What is it?
Data sharing between users concerns how other people, within and outside of a service, can see and download user profile information, service activity (such as social media posts or gameplay stats) and user interactions. The Children’s code Data sharing standard outlines expectations on data sharing for online services likely to be accessed by children.
How could it impact children’s rights?
Article 6: Life, survival and development: Data sharing with other users can risk this right where it exposes children to risks of physical or emotional harm (for example stalking, bullying and harassment). This could be through on-by-default settings or by not having adequate transparency and safeguards.
Article 8: Development and preservation of identity: This right is at risk when services share children’s data relating to identity with other service users. This could be through on-by-default settings, or by not having adequate transparency and safeguards. Services can support this right by giving children profile options that can protect identity characteristics (for example avatars in gaming or auto-generated user names).
Article 13: Freedom of expression: Services can support this right by providing off-by-default settings for children to share data that allows them to express themselves (for example participating in online debate or sharing self-made content). This right is at risk where exposure to abuse from other service users has a chilling effect on children's speech. This could be through on-by-default data sharing or by not having adequate transparency and safeguards.
Article 16: Protection of privacy: Services can support this right by using pro-privacy nudges when children are about to share data with other users. For example, explaining the privacy implications of sharing data, or highlighting associated privacy settings. This right is at risk when services share children’s data with other users with on-by-default settings, or by not having adequate transparency.
Article 19: Protection from violence, abuse and neglect: Data sharing with other users can risk this right where it exposes children to risks of violence or abuse (for example stalking and harassment). This could be through on-by-default settings, or by not having adequate transparency and safeguards.
Data sharing with a third-party organisation
What is it?
Data sharing with a third-party organisation concerns how service owners share users’ data with other organisations, including their:
- personal characteristics;
- profiles;
- service behaviours (such as social media posts, searches or gameplay stats); and
- other personal data.
This data could, for example, be shared:
- for commercial purposes;
- to fulfil a legal requirement;
- for research; or
- to safeguard children.
It may be done on an ad-hoc basis, or as part of an ongoing exchange. The Children’s code Data sharing standard outlines expectations for online services likely to be accessed by children that share children’s data with third parties.
How could it impact children’s rights?
Article 6: Life, survival and development: Data sharing with third parties can support this right where it is for safeguarding purposes. It could also be to fulfil a regulatory requirement relating to children’s wellbeing and safety.
Article 8: Development and preservation of identity: This right is at risk where services share children’s identity data with third parties without a compelling reason to do so that is in the best interests of the child.
Article 12: Respect for the views of the child: Services can support this right where privacy settings give children (and parents where appropriate) informed choices about how services share their data. This right is at risk where services do not provide privacy settings, or don’t meet the principles of the Transparency or Data Sharing standards.
Article 16: Protection of privacy: Services can support this right by using privacy-preserving technical measures (for example pseudonymisation or encryption) when sharing children’s personal data; a pseudonymisation sketch follows after Article 34 below.
Article 19: Protection from violence, abuse and neglect: Data sharing with third parties can support this right where it is for safeguarding purposes. It could also be to fulfil a regulatory requirement relating to protecting children from abuse, neglect or violence (for example with social care bodies).
Article 32: Protection from economic exploitation: Data sharing with third parties can support this right where it is for safeguarding against economic exploitation (for example economic fraud and identity theft). Services can risk this right where it is for commercial gain and against the best interests of the child, or done without adequate transparency.
Article 34: Protection from sexual exploitation: Data sharing with third parties can support this right where it is for safeguarding purposes. It could also be to fulfil a regulatory requirement relating to prevention of sexual exploitation (for example with police).
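As flagged under Article 16 above, one privacy-preserving measure is pseudonymisation before data leaves the service. The sketch below is a minimal illustration under stated assumptions: the field names and HMAC-based pseudonym scheme are hypothetical, and a real service would keep the key separate from the shared dataset and assess re-identification risk.

```python
import hashlib
import hmac

# Illustrative secret key, held by the service only and never shared with
# the third party, so the recipient cannot reverse the pseudonyms.
PSEUDONYM_KEY = b"example-key-kept-separately"

def pseudonymise(user_id: str) -> str:
    """Replace a direct identifier with a stable keyed pseudonym."""
    return hmac.new(PSEUDONYM_KEY, user_id.encode(), hashlib.sha256).hexdigest()[:16]

def prepare_research_share(records: list[dict]) -> list[dict]:
    """Strip direct identifiers and pseudonymise user IDs before sharing.

    Only fields needed for the stated research purpose are kept,
    reflecting data minimisation (names are dropped entirely).
    """
    return [
        {
            "user": pseudonymise(record["user_id"]),  # stable, not reversible by recipient
            "age_band": record["age_band"],           # coarse value, not a date of birth
            "minutes_played": record["minutes_played"],
        }
        for record in records
    ]

records = [{"user_id": "child-42", "name": "Alex", "age_band": "13-15", "minutes_played": 37}]
print(prepare_research_share(records))
```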
Geolocation tracking
What is it?
Geolocation tracking concerns how online services monitor and use children’s geolocation data. This is typically data taken from a user's device which indicates the geographical location of that device. It includes GPS data and data about connection with local Wi-Fi equipment or QR codes.
The Children’s code Geolocation standard outlines expectations for online services likely to be accessed by children that monitor and process children’s geolocation data.
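As a minimal sketch of the behaviour that standard expects (sharing off by default, an obvious sign to the child while it is active, and reverting to off after each session), consider the following; the class and method names are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class LocationSharing:
    """Session-scoped location sharing with other users."""

    visible_to_others: bool = False  # off by default

    def enable_for_session(self) -> None:
        # Switching on shows an obvious, persistent indicator to the child.
        self.visible_to_others = True
        print("Indicator shown: 'Your location is visible to others'")

    def end_session(self) -> None:
        # Revert to off so sharing never silently persists between sessions.
        self.visible_to_others = False
        print("Location sharing switched back off")

sharing = LocationSharing()
assert sharing.visible_to_others is False  # default state
sharing.enable_for_session()
sharing.end_session()
assert sharing.visible_to_others is False  # reverted after the session
```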
How could it impact children’s rights?
Article 6: Life, survival and development: Geolocation tracking poses risks to this right where sharing with other users exposes children to risks of physical or emotional harm (for example through stalking, bullying or harassment). This could be because sharing is on by default, is not obvious to the child when in use, or does not revert to off after use. Geolocation tracking can support this right where services use it for safeguarding and parental controls (note that services must still follow the Children’s code Geolocation and Parental controls standards).
Article 15: Freedom of association: Geolocation tracking poses risks to this right where parental controls for tracking children’s movements are used without adequate transparency for the child.
Article 19: Protection from violence, abuse and neglect: Geolocation tracking poses risks to this right where sharing with other users exposes children to risks of violence or abuse (for example through stalking, bullying or harassment). This could be because sharing is on by default, is not obvious to the child when in use, or does not revert to off after use. Geolocation tracking can support this right where services use it for safeguarding and parental controls (note that services must still follow the Children’s code Geolocation and Parental controls standards).
Online complaint and request tools
What is it?
Online tools, in the context of the Children’s code, are mechanisms to help children exercise their data rights simply and easily when they are online. The UK GDPR gives all people, including children, rights to determine if and how services use their personal data. Children have rights to access, correct, erase, restrict and object to the processing of their data. They also hold further rights relating to data portability (ie transferring data between similar services) and automated decision-making and profiling.
The Children’s code Online tools standard outlines expectations for online services likely to be accessed by children to provide transparent tools for children to exercise their full range of data rights.
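A minimal sketch of how a service might expose these rights as distinct online tools; the handler names, the in-memory store and the export format are illustrative assumptions, not requirements of the standard:

```python
import json

# Illustrative in-memory store; a real service would query its database and
# verify the identity of the child (or a parent acting on their behalf).
USER_DATA = {"child-42": {"display_name": "Alex", "posts": ["hello"], "marketing_optin": True}}

def handle_access(user_id: str) -> str:
    # Right of access: return a copy of the data held about the child.
    return json.dumps(USER_DATA[user_id], indent=2)

def handle_rectification(user_id: str, field: str, value) -> None:
    # Right to rectification: correct inaccurate data.
    USER_DATA[user_id][field] = value

def handle_erasure(user_id: str) -> None:
    # Right to erasure: delete the child's data.
    USER_DATA.pop(user_id, None)

def handle_portability(user_id: str) -> str:
    # Right to data portability: a machine-readable export the child can
    # take to a competing service.
    return json.dumps({"format": "example-export-v1", "data": USER_DATA[user_id]})

# One clearly labelled entry point per right keeps each tool specific
# and easy for a child to find.
ONLINE_TOOLS = {
    "access": handle_access,
    "rectify": handle_rectification,
    "erase": handle_erasure,
    "port": handle_portability,
}

print(ONLINE_TOOLS["access"]("child-42"))
```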
How could it impact children’s rights?
Article 5: Parental guardianship and the evolving capacities of the child: This right is at risk where services either do not provide online tools for children (or parents on their behalf) to exercise their data rights, or if the tools are inaccessible or lack transparency. Services support this right when they provide these online tools and they meet the requirements of the Online tools standard.
Article 8: Development and preservation of identity: Services support this right where they provide online tools that meet the code’s requirements for children (or parents on their behalf) to correct or erase data relating to their identity. This right is at risk where services either do not provide online tools, or if they are inaccessible or not specific to these data rights (for example where children can download a copy of their data but not correct it).
Article 12: Respect for the views of the child: This right is at risk where services do not provide online tools for children (or parents on their behalf) to exercise their data rights, or the tools are inaccessible. Services support this right when they provide these online tools and they meet the requirements of the Online tools standard.
Article 32: Protection from economic exploitation: This right is at risk where services do not provide accessible online tools for children (or parents on their behalf) to exercise their right to data portability. This is the ability to access and move their relevant data to a competing service. Services support this right when they provide this online tool and it meets the requirements of the Online tools standard.
Article 42: Knowledge of rights: Services support this right when they provide online tools for children (or parents on their behalf) to exercise their data rights and they meet the requirements of the Online tools standard. Pro-privacy nudges that encourage children to engage with these tools can also support this right.
Connected toys and devices
What is it?
Connected toys and devices are physical products with functionality that connects to the internet. Many of these devices process personal data to provide this functionality. Examples include:
- interactive dolls or characters that communicate with children;
- voice-activated speakers; or
- smart home devices.
The Children’s code Connected toys and devices standard outlines expectations for developers of connected toys and devices likely to be accessed by children.
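As one illustration of data minimisation for devices like voice-activated speakers, the sketch below gates audio on a wake word so that nothing leaves the device in passive mode (see the Article 16 risk below). The detector and cloud client are hypothetical stubs, not a real vendor API:

```python
def detect_wake_word(frame: bytes) -> bool:
    # Hypothetical on-device detector; a real device would run a local model.
    return frame == b"WAKE"

def send_to_cloud(frame: bytes) -> None:
    # Stand-in for the network call that transmits audio for processing.
    print(f"sent {len(frame)} bytes to cloud")

def process_audio(frames: list[bytes]) -> None:
    """Discard audio locally until the wake word is heard.

    Nothing leaves the device in passive mode, so conversations in the
    child's home are not gathered by default.
    """
    awake = False
    for frame in frames:
        if not awake:
            if detect_wake_word(frame):
                awake = True  # only now does data start leaving the device
            continue  # passive mode: frame is dropped, never stored or sent
        send_to_cloud(frame)

process_audio([b"chat", b"chat", b"WAKE", b"play a song"])
```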
How could it impact children’s rights?
Article 16: Protection of privacy: This right is at risk where devices gather data in private spaces such as a child’s home without adequate transparency or safeguards (for example speakers in passive listening mode). Connected devices also risk privacy rights where they share gathered data with others in the connected device network without adequate transparency or safeguards.
Article 24: Access to health and health services: Connected toys and devices risk this right where fitness trackers and connected health devices promote harmful personalised health behaviours (for example age-inappropriate exercise regimes or diets).
Parental controls
What is it?
Parental controls are tools which allow parents or guardians to place limits on a child’s online activity and thereby mitigate the risks that the child might be exposed to. Examples include:
- setting time limits or bedtimes;
- restricting internet access to pre-approved sites only;
- restricting in-app purchases; and
- monitoring a child’s online activity or tracking their physical location.
The Children’s code Parental controls standard outlines considerations for information society services (ISS) using parental controls, including recommendations for what transparency information to provide for children, depending on their age.
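A minimal sketch of that age-appropriate transparency point: when monitoring is switched on, the child is shown a notice suited to their age. The age bands and wording are illustrative assumptions, not text from the standard:

```python
# Illustrative age-banded notices shown to the child when monitoring starts,
# so controls are never invisible to the child being monitored.
NOTICES = {
    "young": "A grown-up you trust can see what you do here, to keep you safe.",
    "older": "Your parent or guardian can see your activity and location in this app.",
}

def enable_monitoring(child_age: int) -> str:
    """Turn on monitoring and return the notice the child must be shown.

    Surfacing the notice at the moment monitoring begins supports
    transparency and helps avoid a breakdown of trust.
    """
    return NOTICES["young"] if child_age < 10 else NOTICES["older"]

print(enable_monitoring(8))
print(enable_monitoring(14))
```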
How could it impact children’s rights?
Article 5: Parental guardianship and the evolving capacities of the child: This right is at risk where services provide parental controls that lack transparency (which may lead to a breakdown of trust between the child and parent), or where controls are used against the wishes of a child who is at an appropriate age to object.
Article 13: Freedom of expression: Services can risk this right where parental controls are used to monitor children disproportionately or without transparency, inhibiting their ability to express themselves freely.
Article 15: Freedom of association: Parental controls pose risks to this right where they are used for monitoring and tracking a child’s engagement with online communities without adequate transparency for the child.
Article 16: Protection of privacy: The use of parental controls poses risks to a child’s privacy where they are used without adequate transparency for the child, or where they gather more data than is needed (for example if a service for monitoring a child’s journey collects data that is not relevant to their location or movement).
Profiling for service personalisation
What is it?
Profiling refers to any form of automated processing of personal data that assesses or predicts people’s behaviour, interests or characteristics. In the context of service personalisation, services use these profiles to suggest content and service features that align with a user’s profile. They also promote different service user experiences and features in line with the user’s interests, capabilities and needs.
The Children’s code Profiling standard outlines expectations for online services likely to be accessed by children that profile child users for content and service personalisation.
How could it impact children’s rights?
Article 6: Life, survival and development: Profiling for content delivery supports this right where the content promotes positive health behaviours or online safety tools. This right is at risk where profiling for content delivery exposes children to damaging content (for example age-inappropriate products, suicide and self-harm content or inaccurate health information).
Article 17: Access to news media and information: Services can support this right by profiling and personalised targeting of news information in the best interests of the child. This right is at risk where this profiling exposes children to information not in the best interests of the child (for example misinformation). It is also at risk when the information is against regulatory standards (for example Ofcom’s Broadcasting Code).
Article 19: Protection from violence, abuse and neglect: This right is at risk where profiling and personalisation exposes children to content or other users that are violent or abusive.
Article 24: Access to health and health services: Services can support this right where profiling and content personalisation promotes public health messaging and advice. This right is at risk where this content personalisation exposes children to inaccurate health information.
Article 32: Protection from economic exploitation: This right is at risk where services use profiling to target adverts and service features that generate revenue at children (for example loot boxes or in-game purchases). Profiling and personalisation can also pose risks to this right through targeted advertising of fraudulent or misrepresented products. Services can support this right by using profiling and content personalisation to target content that promotes financial literacy.
Article 33: Protection from drug abuse: This right is at risk where services use profiling to target age-restricted products to children (for example alcohol). Services can support this right by personalised targeting of information that protects children from drug abuse.
Profiling for automated decision-making
What is it?
Profiling refers to any form of automated processing of personal data that assesses or predicts people’s behaviour, interests or characteristics. In the context of automated decision-making, services may use these profiles to:
- allow or restrict access to services (for example for age assurance); or
- enforce policies and community standards (for example automated moderation of content).
The Children’s code Profiling standard outlines expectations for online services likely to be accessed by children that profile child users for automated decision-making.
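A minimal sketch of an automated moderation decision with a human-review safeguard, reflecting the right to human intervention over solely automated decisions; the classifier and thresholds are hypothetical stand-ins:

```python
def classify_post(text: str) -> float:
    # Hypothetical classifier returning the probability that a post breaks
    # community standards; a real system would use a trained model.
    return 0.95 if "banned-phrase" in text else 0.2

def moderate(text: str) -> str:
    """Act automatically only when the model is confident; otherwise a
    human reviews, and automated removals carry a route to appeal."""
    score = classify_post(text)
    if score > 0.9:
        return "removed (child notified, with a route to human appeal)"
    if score > 0.5:
        return "queued for human review"
    return "published"

print(moderate("hello world"))         # published
print(moderate("banned-phrase here"))  # removed, with appeal route
```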
How could it impact children’s rights?
Article 13: Freedom of expression: Profiling for automated decision-making can risk this right where services use automated moderation of children’s online speech to enforce their community standards without adequate transparency and safeguards.
Article 16: Protection of privacy: This right is put at risk by profiling that infers children’s personal information, is on by default, or is used without adequate transparency or safeguards.
Privacy information, policies and community standards
What is it?
Services use privacy information, policies and community standards to:
- set user expectations for how services use their data;
- highlight the choices users have in regards to privacy; and
- outline wider content, user behaviour and age restriction policies.
The Children’s code Transparency standard and Policies and community standards standard outline expectations for the design of the privacy information, policies and community standards of online services likely to be accessed by children.
How could it impact children’s rights?
Article 5: Parental guardianship and the evolving capacities of the child: Services can support this right by providing different versions of privacy information. This allows children (and parents where appropriate) to seek more or less detail, as their capacities and data literacy develop.
Article 12: Respect for the views of the child: Services can support this right by consulting with children and parents when developing privacy policies and community standards. They can also provide resources for children under 13 (the minimum age at which a child can consent to data processing in the UK) to discuss their privacy choices with parents.
Article 13: Freedom of expression: This right is at risk where a failure to uphold service user behaviour policies exposes children to abuse. This may have a chilling effect on their freedom of speech and expression.
Article 16: Protection of privacy: This right is at risk where services share children's personal data with other users or third parties against service policies, or without meeting the code’s Transparency standard principles.
Article 23: Children with disabilities: This right is at risk where privacy information and community standards are provided in formats not accessible to children with disabilities, and where UK equalities law is not followed.
Article 32: Protection from economic exploitation: This right is at risk where services do not communicate to children and parents the policies and community standards that relate to how they generate revenue from users’ data. It is also at risk if the service does not meet the code’s Transparency standard principles.
Article 36: Protection from other forms of exploitation: This right is at risk where privacy information, policies and community standards are inaccurate or misleading.
Article 42: Knowledge of rights: Services can support this right where policies and community standards give children and parents information on the rights they hold under the Children’s code.
Privacy and data use settings
What is it?
Settings for privacy and data give users choice over how and when a service gathers, uses and shares their data with other users and third parties. These settings can relate to ongoing profile preferences or to a one-off use.
The Children’s code Default settings standard outlines expectations for the default settings of online services likely to be accessed by children.
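A minimal sketch of what high-privacy defaults can look like in configuration; the setting names are illustrative assumptions, not terms from the standard:

```python
from dataclasses import dataclass

@dataclass
class ChildPrivacySettings:
    """Defaults for a child's account: each optional use of data is off
    until the child (or a parent, where appropriate) turns it on."""

    share_profile_with_users: bool = False
    share_activity_with_third_parties: bool = False
    personalised_recommendations: bool = False

    def share_once(self) -> None:
        # A one-off share grants access a single time without changing
        # the ongoing preference, which stays off.
        print("Shared this item once; ongoing sharing remains off")

settings = ChildPrivacySettings()
assert not settings.share_profile_with_users           # off by default
assert not settings.share_activity_with_third_parties  # off by default
settings.share_once()
```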
How could it impact children’s rights?
Article 6: Life, survival and development: This right is at risk where services set data sharing with other service users to on by default. This may expose children to risk of physical or emotional harm (for example through stalking, bullying or harassment).
Article 8: Development and preservation of identity: This right is at risk where services share children’s identity data with other users or third parties using on-by-default settings.
Article 12: Respect for the views of the child: Services can support this right where privacy settings give children (and parents where appropriate) informed choices about how services use their data. This right is at risk where services do not provide privacy settings, or don’t meet the principles of the Transparency standard.
Article 15: Freedom of association: Services can support this right by providing off-by-default and transparent privacy settings that enable children to interact with each other online.
Article 16: Protection of privacy: This right is at risk where services share children’s data with other users or third parties using on-by-default settings.
Article 19: Protection from violence, abuse and neglect: This right is at risk where on-by-default data sharing with other service users exposes children to risks of violence or abuse (for example through stalking or harassment).