This code came into force on 2 September 2020, with a 12 month transition period. Organisations should conform by 2 September 2021.
Switch options which use profiling ‘off’ by default (unless you can demonstrate a compelling reason for profiling to be on by default, taking account of the best interests of the child). Only allow profiling if you have appropriate measures in place to protect the child from any harmful effects (in particular, being fed content that is detrimental to their health or wellbeing).
What do you mean by ‘profiling’?
Profiling is defined in the GDPR:
“any form of automated processing of personal data consisting of the use of personal data to evaluate certain aspects relating to a natural person, in particular to analyse or predict aspects concerning that natural person’s performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or movements”
Profiling can be used for a wide range of purposes. It can be used extensively in an online context to suggest or serve content to users, to determine where, when and how frequently that content should be served, to encourage users towards particular behaviours, or to identify users as belonging to particular groups. It can also be used to help establish or estimate the age of a user (as detailed in the standard on age appropriate application), or for child protection, countering terrorism, or the prevention of crime.
Profiles are usually based on a user’s past online activity or browsing history. They can be created using directly collected personal data or by drawing inferences (eg preferences or characteristics inferred from associations with other users or past online choices).
Content feeds based on profiling can include advertising content, content provided by other websites, downloads, content generated by other internet users, written, audio or visual content. Profiling may also be used to suggest other users to ‘connect with’ or ‘follow’.
Why is it important?
Profiling is mentioned in Recital 38 to the GDPR as an area in which children merit specific protection with regard to the use of their personal data.
There are also specific rules at Article 22 of the GDPR about decisions (including profiling) which are based solely on the automated processing of personal data, and which have a legal or similarly significant effect on the data subject.
“22(1) The data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her”
Recital 71 to the GDPR states that such decisions ‘should not concern a child’.
The lawfulness, fairness and transparency principle at Article 5(1) is also relevant because this is an area of largely ‘invisible processing’ in which it is difficult for children to understand how their personal data is being used, and what the consequences of that use might be.
“5(1) Personal data shall be
(a) processed lawfully, fairly and in a transparent manner in relation to the data subject (‘lawfulness, fairness and transparency’)”
Some profiling may be relatively benign, for example personalisation of a ‘walled garden’ online environment to incorporate an animal theme in the displayed content. Other profiling, such as content feeds which gradually take the child away from their original area of interest into other less suitable content, raises much more significant concerns.
Should all profiling be controlled by a privacy setting?
It is important to remember that ‘off by default’ does not mean that profiling is not possible or banned. Following the safeguards and steps set out in this section, which could include effective consent, can enable profiling using children’s data to take place, safely and fairly.
There is no point in offering a privacy setting if the profiling is essential to the provision of the core service that the child has requested. This is because if the profiling were turned off there would be no residual service left for the child to use. This concept should be interpreted narrowly, eg that it is completely intrinsic to the service.
However, whenever you can, you should offer children control over whether and how their personal data is used. So most profiling should be subject to a privacy setting. If you can provide a core or residual service without profiling, then you should provide a privacy setting for any additional aspects of your service which rely on profiling.
You should always provide a privacy setting for behavioural advertising which is used to fund a service, but is not part of the core service that the child wishes to access. Although there may be some limited examples of services where behavioural advertising is part of the core service (eg a voucher or ‘money off’ service), we think these will be exceptional. In most cases the funding model will be distinct from the core service and so should be subject to a privacy setting that is ‘off’ by default.
There may also be some other limited circumstances in which it won’t be appropriate for you to offer a privacy setting over profiling. For example, if you are profiling in order to meet a legal or regulatory requirement (such as a safeguarding or child protection requirement), to prevent child sexual exploitation or abuse online or to age assure so you can properly apply the provisions of this code to child users.
How does this fit with PECR requirements?
A cookie is a small text file that is downloaded onto ‘terminal equipment’ (eg a computer or smartphone) when the user accesses a website. It allows the website to recognise that user’s device and store some information about the user’s preferences or past actions.
Profiling and non-essential cookies
If the cookie isn’t essential to provide the service that the child wants to access, then the underlying profiling it facilitates normally needs to be subject to a privacy setting. This gives the child control over whether their personal data is used for this purpose.
You need consent for the cookie as well as a GDPR lawful basis for the underlying processing (in practice this may also be consent).
Cookies, profiling, and your core services
If the cookie is essential to the provision of your core service then it is likely that the underlying profiling that the cookie enables is too. In this circumstance providing a privacy setting which allows the child to control whether their personal data is used for this purpose won’t be appropriate. You need a lawful basis (other than consent) for the underlying processing (profiling) and won’t need consent for the cookie.
Cookies, profiling and your non-core services
Cookies may also be essential for providing your non-core services. However, as these are optional elements of your service, you first need to provide a privacy setting which gives the child control over whether they wish their personal data to be processed in order to access them.
If the child decides to do so, then you do not need consent for the use of the cookie – as the child is specifically requesting to access part of your service and the cookie is strictly necessary for this purpose.
You do however need a lawful basis for the underlying processing.
Cookies, profiling, and age estimation or age assurance
In this circumstance, the purpose you use the cookies for is regarded as essential to the service, as you need to establish age in order to provide an age appropriate service and comply with the GDPR. Provided that the cookie in question is used solely for this purpose, and not for any other purpose, the child does not need to consent to the cookie.
For more information about cookies, and when a cookie is essential and non-essential, see our guidance on Cookies and similar technologies.
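The cookie rules set out above follow a consistent pattern. As a purely illustrative aid (this sketch is not part of the code or PECR, and the parameter names and returned flags are our own assumptions), the decision logic can be expressed as a simple function:

```python
# Hypothetical sketch of the cookie and privacy-setting rules described
# above. Not an authoritative statement of the code or PECR; the names
# used here are illustrative assumptions.

def profiling_cookie_rules(essential_to_core: bool,
                           essential_to_noncore: bool = False) -> dict:
    """What the guidance above suggests applies to a profiling cookie."""
    if essential_to_core:
        # Core service: no privacy setting, no PECR consent for the cookie;
        # a lawful basis other than consent is needed for the profiling.
        return {"privacy_setting": False, "pecr_consent": False}
    if essential_to_noncore:
        # Optional (non-core) feature: offer a privacy setting first. If the
        # child turns it on, the cookie is strictly necessary for the feature
        # they requested, so no separate PECR consent is needed.
        return {"privacy_setting": True, "pecr_consent": False}
    # Non-essential cookie: privacy setting plus PECR consent for the cookie
    # (and a GDPR lawful basis for the underlying profiling).
    return {"privacy_setting": True, "pecr_consent": True}
```

In every branch you still need a GDPR lawful basis for the underlying profiling; the flags only capture whether a privacy setting and PECR cookie consent are also needed.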
How can we make sure that we meet this standard?
Differentiate between different types of profiling for different purposes
Because profiling can be used to serve a wide range of purposes it is particularly important to be clear about the purposes for which your service uses personal data to profile its users, and to differentiate between them. Catch-all purposes, such as ‘providing a personalised service’ are not specific enough.
Where it is appropriate to offer privacy settings then you should offer separate settings for each different type of profiling. It is not acceptable to bundle different types of profiling together under one privacy setting, or to bundle in profiling with processing for other purposes.
Ensure features that rely on profiling are switched off by default (unless there is a compelling reason to do otherwise)
You need to switch any options within your service which rely on profiling off by default, unless you can demonstrate a compelling reason why this should not be the case, taking account of the best interests of the child. You need to assess this in the specific circumstances of your processing.
In practice, this is likely to mean that any non-essential features which rely on profiling, and which you provide for commercial purposes, are subject to a privacy setting which is switched off by default.
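The two requirements above (one independent setting per profiling purpose, each defaulting to off) can be illustrated with a minimal sketch. The purpose names below are illustrative assumptions, not categories defined by the code:

```python
# Hypothetical sketch: per-purpose profiling settings held separately and
# defaulting to off. Purpose names are illustrative assumptions.

from dataclasses import dataclass, field

PROFILING_PURPOSES = ("personalised_content", "behavioural_advertising",
                      "connection_suggestions")

@dataclass
class ChildPrivacySettings:
    # One independent switch per profiling purpose -- never bundled
    # together, and never bundled with other processing purposes.
    profiling: dict = field(
        default_factory=lambda: {p: False for p in PROFILING_PURPOSES})

    def enable(self, purpose: str) -> None:
        """Turn on a single profiling purpose after an explicit opt-in."""
        if purpose not in self.profiling:
            raise KeyError(f"unknown profiling purpose: {purpose}")
        self.profiling[purpose] = True

settings = ChildPrivacySettings()
assert not any(settings.profiling.values())  # everything starts off
settings.enable("personalised_content")      # explicit opt-in to one purpose
assert settings.profiling["behavioural_advertising"] is False  # others stay off
```

The design point is simply that opting in to one purpose must not switch on any other, which is what a single bundled setting would do.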
In the case of any profiling you do for the purposes of behavioural advertising, which is facilitated by cookies, this approach is supported by the comments of the EDPB. The EDPB has indicated that ‘legitimate interests’ is unlikely to provide a valid lawful basis for processing for this purpose, which means that consent is your only viable basis for processing. As valid consent has to be ‘opt in’, allowing such profiling ‘by default’ is not an option. You also need to comply with the Article 8 GDPR requirements for parental consent if the child is under the age of 13. For more information about lawful bases for processing and Article 8 requirements see Annex C.
However, you may have a compelling argument that you need to switch profiling options for other purposes on by default.
For example, it may be appropriate for profiling for the purposes of ensuring that a service is accessible to a disabled child (eg identifying that a child has an ongoing need for a subtitled, signed or other supported service) to be switched on by default.
You may be able to demonstrate that profiling for the purposes of informing news content feeds should be allowed by default, in order to recognise the rights of children to access information, although you still need consent to set the cookies that support the profiling, in accordance with PECR requirements. This is more likely to be the case if you can demonstrate that you conform with existing regulatory codes of practice which govern media content and practices (such as The Editors’ Code of Practice) and have editorial control over the content that children will be shown as a result of the profiling. It is unlikely to apply if you do not have such editorial control or do not adhere to other regulatory controls. See also our FAQs for the news media.
Provide appropriate interventions at the point at which any profiling is activated
At the point any profiling options are turned on, you need to provide age appropriate information about what will happen to the child’s personal data and any risks inherent in that processing.
You should also provide age appropriate prompts encouraging the child to seek assistance from an adult, and not to activate the profiling, if they are uncertain or don’t understand.
Depending on your assessment of risk and the age of the child you may wish to make further interventions, which might include further age assurance measures.
If profiling is on, ensure that you put appropriate measures in place to safeguard the child (in particular from inappropriate content)
If your online service uses any profiling then you need to take appropriate steps to make sure that this does not result in harm to the child.
In practice this means that if you profile children (using their personal data) in order to suggest content to them, then you need suitable measures in place to make sure that children aren’t served content which is detrimental to their physical or mental health or wellbeing, taking into account their age. As covered in the section of this code on DPIAs, testing your algorithms should assist you in assessing the effectiveness of your measures.
Such measures could include contextual tagging, robust reporting procedures, and elements of human moderation. They could also include your own editorial controls over the content you display, including adherence to codes of conduct or other regulatory provisions (such as compliance with The Editors’ Code of Practice, or the Ofcom Broadcasting Code). We recognise the importance of the rights of children to access information from the media, and the societal and developmental benefits of children being able to engage in current affairs and the world around them. We would therefore accept that adherence to editorial or broadcasting codes of conduct negates the need for providers of online news to take any additional steps in relation to news content for children. See also our FAQs for the news media.
If you are using children’s personal data to automatically recommend content to them based on their past usage/browsing history then you have a responsibility for the recommendations you make. This applies even if the content itself is user generated. In data protection terms, you have a greater responsibility in this situation than if the child were to pro-actively search out such content themselves. This is because it is your processing of the personal data that serves the content to the child. Data protection law doesn’t make you responsible for third party content but it does make you responsible for the content you serve to children who use your service, based on your use of their personal data.
Your general approach should be that if the content you promote or the behaviours your features encourage are obviously detrimental, or are recognised as harmful to the child, in one context (eg marketing rules, film classification, advice from official Government sources such as Chief Medical Officers’ advice, PEGI ratings) then you should assume that the same type of content or behaviour is harmful in other contexts as well. Where evidence is inconclusive you should apply the same precautionary principle.
Content or behaviours that may be detrimental to children’s health and wellbeing (taking into account their age) include:
- advertising or marketing content that is contrary to CAP guidelines on marketing to children;
- film or on-demand television content that is classified as unsuitable for the age group concerned;
- music content that is labelled as parental advisory or explicit;
- pornography or other adult or violent content;
- user generated content (content that is posted by other internet users) that is obviously detrimental to children’s wellbeing or is formally recognised as such (eg pro-suicide, pro-self harm or pro-anorexia content, or content depicting or advocating risky or dangerous behaviour by children); and
- strategies used to extend user engagement, such as timed notifications that respond to inactivity.
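As an illustration only (the category list and tagging scheme below are our own assumptions, and a real service would combine tagging with moderation, reporting and editorial controls), the precautionary screening described above might be sketched as:

```python
# Hypothetical sketch: screening a profiled content feed against categories
# of detrimental content, and against age ratings, before serving it to a
# child. Category names and the tagging scheme are illustrative assumptions.

DETRIMENTAL_CATEGORIES = {"pro_suicide", "pro_self_harm", "pro_anorexia",
                          "adult", "violent", "explicit_music"}

def safe_recommendations(candidates, child_age):
    """Filter a profiled content feed, dropping flagged or age-gated items."""
    safe = []
    for item in candidates:
        tags = set(item.get("tags", ()))
        if tags & DETRIMENTAL_CATEGORIES:
            continue  # never serve recognised detrimental content
        if child_age < item.get("min_age", 0):
            continue  # respect age ratings (eg film classification, PEGI)
        safe.append(item)
    return safe

feed = [{"id": 1, "tags": ["craft"], "min_age": 0},
        {"id": 2, "tags": ["pro_self_harm"]},
        {"id": 3, "tags": ["game"], "min_age": 18}]
assert [i["id"] for i in safe_recommendations(feed, child_age=12)] == [1]
```

The precautionary principle described above maps to the default here: an item flagged as detrimental in any recognised context is dropped regardless of the context in which it would be served.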
Ultimately, if you believe that it is not feasible for you to put suitable measures in place, then you will not be able to profile children for the purposes of recommending online content. In this circumstance you need to make sure that children cannot change any privacy settings which allow this type of profiling.
Similarly, if you cannot put suitable measures in place to safeguard children from harms arising from profiling for other purposes (such as profiling to promote certain behaviours), you should not profile children for these purposes either.
How does this fit with other rules on restricting access to content for children?
You may need to take account of other rules on restricting access to content in order to ensure that you don’t use children’s personal data in ways that have been shown to be detrimental to their wellbeing (for more detail see the standard on detrimental use of data).
The CAP code requires that when advertising is targeted through the use of personal data, advertisers must show that they have taken reasonable steps to reduce the likelihood of those who are, or are likely to be, in a protected age category being exposed to age-restricted marketing content.
The Ofcom On Demand Programme Service Rules require providers of ‘on demand’ content to only make certain content (‘specially restricted material’) available if they can do so in a way that ensures that those under the age of 18 will not normally be able to see or hear it.
The Audiovisual Media Services Directive 2018 (AVMSD) (if implemented in the UK) will require ‘video sharing platform services’ to use proportionate measures in relation to how they organise the content they share, to protect minors from content which might impair their physical, mental or moral development.
We consider that it is consistent with these provisions to only allow children’s personal data to be used to determine content feeds if you can put suitable measures in place to guard against them being served content that is detrimental to their health and wellbeing.
The AVMSD also requires that personal data collected or generated for the purpose of protecting minors from content which might impair their physical, mental or moral development must not be used for commercial purposes such as direct marketing, profiling and behaviourally targeted advertising.
We consider that this requirement is consistent with the purpose limitation principle of the GDPR and with our guidance in the sections of this code on age appropriate application - What if we need to collect personal data in order to establish age? It doesn’t mean that services within the scope of the AVMSD can’t ever process personal data for commercial purposes. It just means that you can’t use personal data collected for one purpose for another. If you wish to profile children for the purpose of behavioural advertising, you will need the child’s (or parent’s) consent. For more information on consent see Annex C Lawful bases for processing.
We will work with other regulators as necessary where issues of regulatory consistency arise.
Further reading outside this code
The Ofcom Broadcasting Code (with the Cross-promotion Code and the On Demand Programme Service Rules)
Directive (EU) 2018/1808 amending Directive 2010/13/EU (Audiovisual Media Services Directive) and the UK government's Audiovisual Media Services Consultation Document
Age Appropriate Design Code FAQs for the news media