Your content moderation systems may involve different organisations. For example, you may use one or more third-party providers to support your moderation processes.
You must be clear about the role that you and each other party have. This depends on whether each of you is a controller or a processor, or whether you are joint controllers.
You are the controller if you’ve made decisions about the purposes and means of any content moderation processing. You have overall responsibility for complying with data protection law. This is because you’ve decided the “why” and “how” of processing your users’ personal information, for example:
- what you intend content moderation to achieve;
- what personal information you need for this purpose;
- what content moderation tools are involved;
- how long you’ll keep the personal information for; and
- whether you engage another party to undertake the processing for you.
If you engage another party to undertake processing for you and they only act on your behalf and on your instructions, they will be your processor.
Processors can make certain technical decisions about how to process personal information, for example:
- which IT systems and methods to use in the content moderation;
- how to store the personal information in these systems;
- which security measures to apply to the personal information; and
- how to retrieve, transfer, delete and dispose of the personal information.
An online service decides to use a third party that provides a content moderation tool. The tool analyses user-generated content to classify whether it violates the service's content policies. This involves processing the users’ personal information.
The online service is responsible for deciding both why and how the information is processed. The third party acts only on the service's instructions, which include the specific content policies the tool has to classify against.
However, the third party uses its own expertise to carry out the moderation. It takes decisions about storing the personal information and how it transfers the results of its moderation actions back to the service.
In this case, the online service is the controller and the third-party moderation provider is the processor.
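The split in the example above can be sketched in code: the service (controller) decides the “why” and the policies, while the provider (processor) acts only on those instructions but makes its own technical choices about matching and storage. This is a minimal illustrative sketch; all class, function and policy names here are hypothetical assumptions, not anything defined in this guidance.

```python
# Illustrative sketch of the controller/processor split described above.
# All names (ModerationInstruction, ThirdPartyModerationTool, etc.) are
# hypothetical assumptions for this example only.

from dataclasses import dataclass, field


@dataclass
class ModerationInstruction:
    """The controller's instructions: the purpose of the processing and
    the content policies the tool must classify against."""
    purpose: str
    banned_terms: list


@dataclass
class ThirdPartyModerationTool:
    """The processor: acts only on the controller's instructions, but
    chooses its own technical means (matching method, storage, etc.)."""
    instruction: ModerationInstruction
    _results: dict = field(default_factory=dict)  # processor's technical choice: in-memory storage

    def classify(self, user_id: str, content: str) -> bool:
        """Return True if the content violates the controller's policies.
        The matching method used here is the processor's own decision."""
        violates = any(term in content.lower()
                       for term in self.instruction.banned_terms)
        self._results[user_id] = violates  # technical decision about storing results
        return violates


# The online service (controller) decides the "why" and "how":
instruction = ModerationInstruction(
    purpose="enforce the service's content policies",
    banned_terms=["spam", "scam"],
)

# The provider (processor) processes only on those instructions:
tool = ThirdPartyModerationTool(instruction)
print(tool.classify("user-1", "Buy now, this is not a scam!"))  # flagged
print(tool.classify("user-2", "Lovely weather today"))          # not flagged
```

The point of the structure is that the controller's decisions (purpose, policies) are passed in as instructions, while everything inside the tool (matching, storage) is a technical decision the processor may make for itself, as described in the list above.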
You must ensure your processor only carries out processing according to your instructions. If a processor acts outside of these instructions, it is processing your users' personal information for its own purposes. Not only does this mean it becomes a controller for that processing, but it is also acting outside of its agreement with you.
Some types of content moderation are more complex, such as those that involve AI or several different entities. This can make it harder to allocate roles and responsibilities appropriately.