The ICO exists to empower you through information.

Below you’ll find reminders of the key data protection themes for live facial recognition (LFR) use in law enforcement.


Strictly necessary 

The use of LFR must be strictly necessary for the law enforcement purposes.

The ICO expects “strictly necessary” under Part 3 DPA 2018 to mean that enhanced consideration and extra care should be taken to:  

  • ensure that the processing of sensitive information is specific in nature and dependent on the specified law enforcement purpose;
  • clearly demonstrate why there are no reasonably available, less intrusive means of achieving the same purpose; and 
  • clearly demonstrate how such processing will be effective in meeting the specified law enforcement purposes. 

Clear purpose

The purpose and justification for the use of LFR should be clearly defined and limited, in keeping with the law enforcement purposes and principles of the DPA 2018.   


Key data protection documentation

  • Data Protection Impact Assessment (DPIA)
  • Appropriate Policy Document (APD)
  • Local policies and procedures



Effectiveness

Effectiveness is a key consideration for strict necessity and proportionality. The ICO expects you to clearly explain how LFR will be effective in meeting the specified law enforcement purposes. 



Watchlists

The inclusion of an image on a watchlist should involve the same enhanced consideration and care for processing, eg strict necessity. 

Watchlists should be in keeping with the data protection principles, and be adequate, relevant and not excessive. Only images that are accurate and lawfully held at the time of use should be included. 


Informing the public

Consider the level of information available at overt deployments. 

Contact details of the controller should be made available to the public. 

You must consider how to provide information to people about exercising their rights. 


Managing bias

There should be periodic testing and review of the technology to ensure that it remains accurate and effective, and to understand and eliminate any bias.