Artificial intelligence

An umbrella term for a range of algorithm-based technologies that solve complex tasks by carrying out functions that previously required human thinking.

Anonymisation

The techniques and approaches applied to personal information to render it anonymous.

Content classification

Automated analysis of user-generated content to assess whether it is likely to breach a service’s content policies. This often involves the use of AI-based technologies. Classification systems usually assign a degree of confidence to their assessment of a piece of content.
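To illustrate how a confidence score might drive moderation decisions, the sketch below routes content based on a classifier's score. The thresholds and scores are illustrative assumptions, not values from any real system.

```python
# Illustrative thresholds only; real services tune these to their policies.
REVIEW_THRESHOLD = 0.60
REMOVE_THRESHOLD = 0.90

def route_content(policy_breach_confidence: float) -> str:
    """Decide what to do with content given a classifier's confidence
    that it breaches the service's content policies."""
    if policy_breach_confidence >= REMOVE_THRESHOLD:
        return "remove"        # high confidence: automated action
    if policy_breach_confidence >= REVIEW_THRESHOLD:
        return "human review"  # uncertain: escalate to a moderator
    return "allow"

print(route_content(0.95))  # remove
print(route_content(0.70))  # human review
print(route_content(0.10))  # allow
```

In practice the confidence score comes from the AI-based classifier itself; this sketch only shows the thresholding step that turns a score into an action.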

Content moderation

The analysis of user-generated content to assess whether it meets certain standards and any action a service takes as a result of this analysis. For example, removing the content or banning a user from accessing the service.

Content removal

Action taken to remove content from a service or prevent it from being published.

Database matching

Automated analysis of user-generated content to check whether it matches an internal or external database of known prohibited content.
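A minimal sketch of the lookup step, assuming the database is keyed by a fingerprint of each known prohibited item. The entries and category label here are illustrative placeholders, not real data.

```python
import hashlib

# Hypothetical database of known prohibited content, keyed by fingerprint.
prohibited_db = {
    hashlib.sha256(b"known prohibited item").hexdigest(): "policy: illegal content",
}

def database_match(content: bytes):
    """Return the matched policy category, or None if the content
    does not appear in the database."""
    return prohibited_db.get(hashlib.sha256(content).hexdigest())

print(database_match(b"known prohibited item"))  # policy: illegal content
print(database_match(b"ordinary post"))          # None
```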

Feature blocking

Action taken by a service to restrict a user’s access to certain features of the service, either temporarily or permanently.

Hash

A fixed-length value summarising the contents of a file or message.
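For example, a cryptographic hash function such as SHA-256 produces a value of the same length regardless of the size of the input (a minimal sketch using Python's standard hashlib):

```python
import hashlib

# SHA-256 always produces a 256-bit digest (64 hexadecimal characters),
# whether the input is a few bytes or many kilobytes.
short_digest = hashlib.sha256(b"hello").hexdigest()
long_digest = hashlib.sha256(b"hello" * 10_000).hexdigest()

print(len(short_digest))  # 64
print(len(long_digest))   # 64
```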

Hash matching

A technique where a hash of a file is compared with a database of other hashes. Online services can use hash matching to detect known illegal content.

There are different types of hash matching used in content moderation. Cryptographic hash matching is used to identify exact matches of content. Perceptual hashing is used to determine whether pieces of content are similar to each other.
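The two types can be sketched as follows: cryptographic hash matching is an exact membership test, while perceptual hash matching compares hashes for closeness, commonly by counting differing bits (Hamming distance). The hash values and threshold below are illustrative assumptions, not real blocklist data.

```python
import hashlib

# Cryptographic hash matching: exact membership test against known digests.
# The "known" digest here is an illustrative placeholder.
known_digests = {hashlib.sha256(b"prohibited file").hexdigest()}

def is_exact_match(data: bytes) -> bool:
    return hashlib.sha256(data).hexdigest() in known_digests

# Perceptual hash matching: compare hashes by Hamming distance;
# a small distance suggests the pieces of content are similar.
def hamming_distance(a: int, b: int) -> int:
    return bin(a ^ b).count("1")

def is_similar(phash: int, known_phash: int, threshold: int = 5) -> bool:
    return hamming_distance(phash, known_phash) <= threshold

print(is_exact_match(b"prohibited file"))   # True
print(is_exact_match(b"prohibited file!"))  # False: any change breaks an exact match
print(is_similar(0b10110110, 0b10110100))   # True: the hashes differ by one bit
```

This contrast is why cryptographic matching only finds identical copies, while perceptual matching can also flag slightly altered versions of known content.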

Moderation action

Any action that a service takes on a piece of content or a user’s account after the content has been analysed.

Ofcom

Ofcom is the regulator for the OSA. It is responsible for implementing the regime and supervising and enforcing the online safety duties.

OSA

The Online Safety Act 2023. See Online Safety Act 2023 (legislation.gov.uk).

Pseudonymisation

Defined in the UK GDPR as “…processing of personal data in such a manner that the personal data can no longer be attributed to a specific data subject without the use of additional information, provided that such additional information is kept separately and is subject to technical and organisational measures to ensure that the personal data are not attributed to an identified or identifiable natural person”.

Service bans

Action taken by a service to ban users from accessing the service, either temporarily or permanently.

Third-party moderation services

Organisations that provide content moderation services. This can include both content moderation technologies and human moderation.

User-generated content

Defined in the OSA as “content that is (i) generated directly on a service by a user of the service, or (ii) uploaded to or shared on a service by a user of the service; and that may be encountered by another user, or other users, of the service by means of the service”.

User-to-user service

Defined in the OSA as “an internet service by means of which content that is generated directly on the service by a user of the service, or uploaded to or shared on the service by a user of the service, may be encountered by another user, or other users, of the service”.

Visibility reduction

A range of actions that services may take to reduce the visibility of content. For example, preventing content from being recommended or making content appear less prominently in users’ news feeds.