The Information Commissioner’s Office (ICO) is warning organisations to assess the public risks of using emotion analysis technologies before implementing these systems. Organisations that do not act responsibly, that pose risks to vulnerable people, or that fail to meet ICO expectations will be investigated.
Emotion analysis technologies process data such as gaze tracking, sentiment analysis, facial movements, gait analysis, heartbeats, facial expressions and skin moisture.
Examples include monitoring workers’ physical health through wearable screening tools, or using visual and behavioural methods such as body position, speech, and eye and head movements to register students for exams.
Emotion analysis relies on collecting, storing and processing a range of personal data, including subconscious behavioural or emotional responses and, in some cases, special category data. This kind of data use is far riskier than traditional biometric technologies used to verify or identify a person.
Algorithms that are not sufficiently developed to detect emotional cues create a risk of systemic bias, inaccuracy and even discrimination.
Deputy Commissioner Stephen Bonner said:
“Developments in the biometrics and emotion AI market are immature. They may not work yet, or indeed ever.
“While there are opportunities present, the risks are currently greater. At the ICO, we are concerned that incorrect analysis of data could result in assumptions and judgements about a person that are inaccurate and lead to discrimination.
“The only sustainable biometric deployments will be those that are fully functional, accountable and backed by science. As it stands, we are yet to see any emotion AI technology develop in a way that satisfies data protection requirements, and have more general questions about proportionality, fairness and transparency in this area.
“The ICO will continue to scrutinise the market, identifying stakeholders who are seeking to create or deploy these technologies, and explaining the importance of enhanced data privacy and compliance, whilst encouraging trust and confidence in how these systems work.”
Biometric guidance coming in Spring 2023
The ICO is an advocate for genuine innovation and business growth. To enable a fair playing field, we will act positively towards those demonstrating good practice, whilst taking action against organisations that try to gain an unfair advantage through the use of unlawful or irresponsible data collection technologies.
As well as warning about the risks around emotion analysis technologies, we are developing guidance on the wider use of biometric technologies. These technologies may include facial, fingerprint and voice recognition, which are already successfully used in industry.
Our biometric guidance, which is due to be published in Spring 2023, will aim to further empower and help businesses, as well as highlight the importance of data security. Biometric data is unique to an individual and is difficult or impossible to change should it ever be lost, stolen or inappropriately used.
This guidance will also have people at its core, informed by public dialogues held in liaison with both the Ada Lovelace Institute and the British Youth Council. These will explore public perceptions of biometric technologies and gather opinions on how biometric data is used.
Supporting businesses and organisations at the development stage of biometric products and services embeds a ‘privacy by design’ approach, reducing the risk factors and ensuring organisations operate safely and lawfully.
Further information is available in two new reports, published this week, to support businesses navigating the use of emerging biometric technologies.
Examples of where biometric technologies are currently being used:
- Financial companies are using facial recognition to verify identities by comparing a photo ID with a selfie. Computer systems then assess the likelihood that the document is genuine and that the person in both images is the same (a simplified sketch of this matching step follows this list).
- Airports are aiming to streamline passenger journeys through facial recognition at check-in, self-service bag drops and boarding gates.
- Other companies are using voice recognition to allow users to gain access to secure platforms instead of using passwords.
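To make the verification step in the facial recognition example above concrete, the sketch below shows a simplified identity check that compares face embeddings derived from a photo ID and a selfie against a similarity threshold. The function names and threshold value are illustrative assumptions rather than a description of any particular vendor’s system, and real deployments also run document-authenticity and liveness checks that are omitted here.

import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    # Cosine similarity between two face-embedding vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def same_person(id_embedding: np.ndarray, selfie_embedding: np.ndarray,
                threshold: float = 0.8) -> bool:
    # Treat the photo-ID holder and the selfie subject as the same person
    # when their embeddings are sufficiently similar. The threshold is an
    # assumed value chosen for illustration only.
    return cosine_similarity(id_embedding, selfie_embedding) >= threshold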
Biometric technologies are also expected to have a major impact on the following sectors:
- The finance and commerce sectors are rapidly deploying behavioural biometrics and technologies such as voice, gait and vein geometry for identification and security purposes.
- The fitness and health sector is expanding the range of biometrics it collects, with consumer electronics being repurposed for health data.
- The employment sector has begun to deploy biometrics for interview analysis and staff training.
- Behavioural analysis in early education is becoming a significant, if distant, concern.
- Biometrics will also be integral to the success of immersive entertainment.
Our look into biometrics futures is a key part of the ICO’s horizon-scanning function. This work identifies the critical technologies and innovations that will impact privacy in the near future; its aim is to ensure that the ICO is prepared to confront the privacy challenges transformative technology can bring, and that responsible innovation is encouraged.
Notes to editors:
- The Information Commissioner’s Office (ICO) is the UK’s independent regulator for data protection and information rights law, upholding information rights in the public interest, promoting openness by public bodies and data privacy for individuals.
- The ICO has specific responsibilities set out in the Data Protection Act 2018 (DPA2018), the United Kingdom General Data Protection Regulation (UK GDPR), the Freedom of Information Act 2000 (FOIA), Environmental Information Regulations 2004 (EIR), Privacy and Electronic Communications Regulations 2003 (PECR) and a further five acts and regulations.
- Biometric technologies are technologies that process biological or behavioural characteristics for the purpose of identification, verification, categorisation or profiling.
- If you are interested in biometric technologies and would like to engage with the work we are doing in this area, please email [email protected].