Content moderation is a type of processing that is likely to result in a high risk to people’s rights and freedoms. This means you must carry out a data protection impact assessment (DPIA) before you begin the processing, because your content moderation is likely to involve:
- processing involving new technologies, or the novel application of existing technologies (including AI);
- combining, comparing or matching personal information obtained from multiple sources;
- solely automated processing that has a legal or similarly significant effect on the user; or
- decisions about a person’s access to a service based on automated decision-making or use of special category information.
You must also carry out a DPIA if you are using children’s personal information as part of offering an online service directly to them.
In your DPIA, you must:
- describe the nature, scope, context and purposes of the processing. Be clear about what personal information you want to process and why;
- assess necessity, proportionality and compliance measures; and
- identify all relevant risks to people’s rights and freedoms, assess their likelihood and severity, and detail the measures you will take to mitigate them.
You should also consider the types of data protection harms that these risks may lead to. For example, content moderation has the potential to lead to:
- loss of control of personal information through unexpected and unfair use or sharing of information;
- adverse effects on rights and freedoms, including privacy rights and rights to freedom of expression;
- financial harm to people (eg through loss of income or employment); and
- discrimination based on a moderation system’s outputs.
You should carry out a DPIA even if you assess that your processing is not likely to result in a high risk. A DPIA is a flexible and scalable tool that can assist your decision-making and risk mitigation. If you decide to proceed without carrying out a DPIA, you should document your decision.
If you have carried out a DPIA that identifies a high risk that you cannot reduce to an acceptable level, you must consult us before going ahead with the planned processing.
You must follow a data protection by design and default approach when you decide to use a content moderation system. This helps you consider privacy and data protection issues at the design stage of your system, and throughout its operation. Following a data protection by design approach means you must:
- put in place appropriate technical and organisational measures designed to implement the data protection principles effectively; and
- integrate safeguards into your processing so you meet the UK GDPR’s requirements and protect people’s rights.
If you use automated decision-making in your content moderation systems, there are further data protection requirements that you must comply with. (See the section on ‘What if we use automated decision-making in our content moderation?’ for more information.)
Further reading:
- Data protection impact assessments (DPIAs)
- Examples of processing likely to result in a high risk
- Children’s code – see standard 2 for carrying out a DPIA when processing children’s information.
- Overview of data protection harms and the ICO’s taxonomy
- Data protection by design and default
- Privacy in the product design lifecycle
- Guidance on AI and data protection