Introduction

Latest update - 21 August 2024

21 August 2024 - the summary of responses was published. 

Between April and June 2023, we ran a call for views to support the development of our guidance on how data protection law applies to online content moderation technologies and processes.

We sought information on:

  • how content moderation solutions use people’s personal data, and how the solutions are being used or developed more generally;
  • where clarification is needed on the application of UK GDPR, DPA 2018 and PECR to content moderation; and
  • what challenges organisations are facing relating to data protection when they deploy content moderation processes.

We received 15 written responses to the call for views from a range of respondents including developers of content moderation solutions, industry associations, third sector organisations, and members of the public.

A variety of issues were raised and some key themes emerged, which are summarised below.

Please note that the information on this page is a summary of the views of respondents to our call for views. It is not a reflection of the ICO’s views.

We are grateful to those who took the time to respond, and to those respondents who were willing to discuss the points they made with us in further detail.

Summary of responses

Use and development of content moderation

Respondents provided insights about how content moderation solutions are being used or developed.

Types of content moderation

Respondents told us that a variety of content moderation solutions are currently in use. These can analyse various formats of content including text, video and images.

Content moderation can be carried out proactively (eg where a service identifies violative content itself) or reactively (eg in response to reports from users).

We heard that some services build their own content moderation solutions in-house. Others utilise content moderation solutions developed by third-party vendors.

Developing solutions in-house presents a challenge in terms of both cost and expertise. We heard that services with fewer resources are more likely to rely on third-party providers.

Respondents told us that moderation may involve automated systems and human moderators. Where automated tools are used, we heard about the importance of ensuring accuracy in automated decisions and avoiding discrimination.

Terms of service

We heard that services typically tell users that content moderation is taking place. This information usually appears in a service’s terms of service or community guidelines.

We heard that terms of service or community guidelines outline what content is not permitted on a service and the action(s) that will be taken if content does not comply.

Respondents told us that services typically have processes in place for users to report content.

Content moderation action and user appeals

We heard that users are typically notified about moderation action that is taken in relation to content that they have posted. The information provided varies between different services.

We heard that services often provide mechanisms for users to appeal moderation decisions made about them.

We also heard about the requirement in the US for services to report to the National Center for Missing and Exploited Children (NCMEC) when they become aware of instances of child sexual abuse material on their service.

Challenges in the development and deployment of content moderation

Respondents told us about the following challenges in the development and deployment of content moderation systems and processes.

Data minimisation and use of personal data

Some respondents highlighted a lack of clarity as to how much personal data can be collected for the purpose of content moderation. Respondents told us that collecting increased amounts of personal data is often necessary to carry out content moderation more effectively. However, many organisations also want to collect as little personal data as possible in order to comply with the data minimisation requirements of the UK GDPR. Some respondents also mentioned the implementation of the Online Safety Act as a factor that may require additional data collection. Respondents said that they would welcome increased regulatory certainty in this area.

Transparency in content moderation

We heard about the importance of providing transparency to users about content moderation systems, including provision of information about moderation action that may be taken if users breach content policies. Some respondents also told us that they thought services could go further to improve transparency around their use of moderation systems and how they work. We also heard that providing transparency to users needs to be balanced against protecting the integrity of content moderation systems: providing too much information about the operation of content moderation systems could allow users to evade or bypass them. Some respondents also mentioned the need to protect commercially sensitive information.

Use of automated tools

Several respondents mentioned that automated tools can be used to support content moderation processes. The application of UK GDPR Article 22 to automated content moderation systems was raised as an area of uncertainty. Another challenge raised in relation to automated tools was the availability of, or access to, training data for developing automated or AI-based moderation systems.

Information sharing

We heard some uncertainty from respondents around sharing data in compliance with data protection law. Several respondents told us that effective information sharing between organisations is important for tackling online safety harms and for the development and updating of content moderation systems.

Tackling a diverse range of harms and content formats

We heard that a key challenge for services carrying out content moderation is the wide range of possible online harms, spread across a variety of different content formats. Respondents told us that there is no one-size-fits-all approach, and that different content formats require different approaches to moderation. This can be challenging for services that host a range of content types. Respondents also mentioned that it can be challenging for services operating globally to tailor their content moderation for different jurisdictions, for example due to greater resource requirements and different operating languages.

National and international regulation

Some respondents commented on the varying legislation relevant to content moderation that exists internationally. There was some concern about the challenges presented by the need to comply with different data protection and online safety regimes across different jurisdictions. Some respondents also highlighted the importance of cooperation between Ofcom and the ICO to provide clarity on compliance with both the data protection and online safety regimes.

ICO response to call for views

Data protection and online safety

We heard about the need for more clarity on compliance across the data protection and online safety regimes.

Our response

We are committed to working with Ofcom to ensure that there is clarity on how compliance can be achieved under both regimes. In November 2022 we published a joint statement with Ofcom, which set out our joint aims of maximising coherence and promoting compliance with the regimes.

We engaged with Ofcom extensively in the process of developing our content moderation guidance to ensure that it reflected the needs of services that may need to use content moderation technology to comply with their online safety obligations.

We have been actively engaging with Ofcom as it develops its codes of practice and guidance products under the Online Safety Act. The ICO is also a statutory consultee for Ofcom’s draft codes and guidance. Our response to Ofcom’s consultation on illegal harms included our feedback on the content moderation measures recommended by Ofcom and how they align with data protection law.

Transparency requirements

We heard there was a need for clarity on transparency requirements for content moderation.

Our response

We recognised the need for guidance to assist organisations in understanding their transparency obligations under data protection law. To achieve this, we included a section on transparency in our content moderation guidance that explains what information services need to provide to users when they use content moderation systems that involve processing of personal data. We also included a section in our guidance that explains services’ obligations when users request access to their personal information used in content moderation. This will assist services in understanding how they can provide transparency for users whilst ensuring the integrity of their content moderation systems.

Data minimisation in content moderation

We heard there was a need for clarity about how to apply the data minimisation principle of data protection law to content moderation.

Our response

We recognised the need for specific guidance on how to apply the data minimisation principle in the context of content moderation. We included a section on data minimisation in our content moderation guidance, which explains how services can comply with their data protection obligations, including where services may need to use various types of personal information about users to make content moderation decisions.

Data sharing

We heard there was a need for clarity on data sharing in the context of content moderation.

Our response

The ICO has previously published guidance for organisations on how they can share personal data lawfully as part of our Data Sharing Code. We recognised the need for content-moderation specific guidance on information sharing, and included a section on information sharing in our content moderation guidance.

We also recognised the need to provide clarity to services about the requirement to report child sexual exploitation and abuse (CSEA) content to law enforcement under section 66 of the Online Safety Act (OSA). The OSA requires the government to make regulations in connection with the reports that are to be made to the National Crime Agency (NCA). We plan to publish further data protection guidance about this when these regulations are implemented.

Automated content moderation

We heard there was a need for clarity on how Article 22 of UK GDPR applies to automated content moderation decisions.

Our response

We recognised the need for specific guidance on how Article 22 of UK GDPR applies to content moderation technologies and processes. The guidance includes a section on automated decision-making in content moderation, which explains when Article 22 of UK GDPR is likely to apply.

We heard about the importance of avoiding discrimination where AI and automated tools are used in content moderation systems.

Our response

We incorporated this feedback into the development of the content moderation guidance, which includes a section on how services can ensure fairness when carrying out content moderation. The ICO has also published guidance on AI and data protection. This includes detailed guidance on how services can address the risk of discrimination and bias in the development of AI systems, which will be relevant where AI technology is used in content moderation.

Next steps

We would like to thank respondents for engaging with our call for views. The responses we received informed the development of our guidance on content moderation and data protection, which we published in February 2024. We plan to keep this guidance under review and update it where appropriate, for example to reflect Ofcom’s final online safety codes of practice and guidance. We also plan to produce further guidance products on online safety technologies that will assist services in meeting their obligations under data protection law when implementing online safety measures.