The ICO exists to empower you through information.

Advances in technology and software mean that surveillance systems can pose an increased risk to people’s privacy in both the public and private sectors. This section covers developing and pre-existing technologies, and also highlights additional considerations when using surveillance systems to process personal data, with good practice recommendations you should follow in order to comply with the UK GDPR and DPA 2018.

Surveillance technologies can be interconnected, which means that information can be shared or linked easily. If you are intending to match data together from different systems, you need to be careful that the information you are collecting is:

  • accurate;
  • not excessive;
  • used only for defined purposes; and
  • still necessary and proportionate throughout the lifecycle of the processing.

Some systems also allow for data to be integrated into broader ‘big data’ processing systems that your organisation may operate. This has implications in terms of profiling, what you can learn about individuals and how you make decisions about them. The ICO published a report on the data protection implications of big data that covers this issue in further detail.

In detail

This guidance specifically covers:

  • Automatic Number Plate Recognition (ANPR);
  • Body Worn Video (BWV);
  • Unmanned Aerial Systems (UAS) / drones;
  • Facial Recognition Technology (FRT) and surveillance;
  • smart doorbells (commercial use); and
  • surveillance in vehicles.

Automatic Number Plate Recognition (ANPR)

Checklist

☐ We have identified a genuine need to read Vehicle Registration Marks (VRMs) from vehicles using public or private roads and car parks, in a way that is fair, lawful and transparent.

☐ We have conducted a Data Protection Impact Assessment (DPIA) that fully addresses our use of ANPR, and explores any impact on the rights and freedoms of individuals whose personal data are processed.

☐ We keep the number of ANPR cameras we use to a minimum, to ensure that we only use the appropriate amount in a specific area to address a particular need.

☐ We ensure that the location(s) of our cameras are fully justifiable, and are placed in such a way that they do not accidentally capture any vehicles that are not of interest.

☐ We have clear and prominent signage in place to inform individuals that ANPR is in use, with sufficient detail about who to contact if they have a query.

☐ We have appropriate retention and disposal policies in place for any vehicle data we process.

☐ We have efficient governance procedures in place to be able to retrieve stored data and process it for subject access requests or onward disclosures where required.

☐ Where we process other supplementary data for the purpose of matching data obtained from cameras, we ensure that it is kept up-to-date and relevant to the purpose of the ANPR system.

☐ We comply with the Surveillance Camera code of practice where required.

Automatic Number Plate Recognition (ANPR) systems have the ability to collect and analyse large quantities of personal data in real time. Cameras process personal data when vehicles pass through their field of vision. Although ANPR is more commonly used by law enforcement, these systems are also used by privately-owned car parks and other businesses. Millions of number plates have the potential to be scanned and cross-referenced with live databases across the UK. Due to the increasing affordability of these systems, their use is popular in both the public and private sectors.

ANPR systems generally capture:

  • images of vehicles (an overview image);
  • images of the vehicle’s number plate (the plate patch); and
  • the vehicle registration mark (VRM).

ANPR systems also commonly supplement data collected from their cameras with additional information, such as the date, time and location of the vehicle. It is therefore important that you are aware of your responsibilities around processing personal data.
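As an illustration only, the sketch below shows how a single ANPR read might be represented as a structured record covering the three captured elements plus the supplementary metadata; the field names are our assumptions, not a prescribed format.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class ANPRRead:
    """One illustrative ANPR capture event (field names are assumptions)."""
    vrm: str                 # vehicle registration mark, e.g. "AB12 CDE"
    overview_image: bytes    # image of the whole vehicle
    plate_patch: bytes       # cropped image of the number plate
    captured_at: datetime    # supplementary metadata: date and time
    camera_location: str     # supplementary metadata: camera site

# Each field above is personal data once it can be linked to a driver
# or registered keeper, so the whole record needs DPIA-backed handling.
```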

Is a Vehicle Registration Mark (VRM) personal data?

A VRM is a unique mark linked to a specific vehicle, displayed on its number plate. Surveillance technologies such as CCTV and ANPR can process VRMs for law enforcement purposes or civil matters, such as parking enforcement.

In most circumstances, a VRM is personal data. However, this can depend on the context of the processing. A VRM is personal data at the point where you collect it, if you process it as part of a surveillance system for the purposes of identifying an individual (potentially to take some action, such as to serve them with a parking fine).

This is because while the VRM may not directly identify a living individual, the purpose of the system means that you are likely to find out further information. This will enable you to identify either the driver, registered keeper or both.

How should we use ANPR?

Regardless of the sector you operate in, if you are using or intend to use an ANPR system, it is important that you undertake a DPIA prior to deployment. You should show that the use is necessary and proportionate in the circumstances, and that you have minimised the risks. This is particularly important given the amount of data an ANPR system can collect in a relatively short amount of time. You should also ensure that the information your ANPR system processes is limited to what you need to achieve your purpose, and that you are able to justify your decisions surrounding the data it captures.

When storing the information and cross-referencing it with other databases to identify individuals, you need to keep these databases:

  • up-to-date;
  • accurate; and
  • of sufficient quality to prevent mismatches.

Similarly, both the cameras and any algorithms you use to determine a match must be of sufficient quality to prevent any misidentification of a VRM.
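As an illustration, a minimal sketch of the kind of quality gate that can reduce misreads before any cross-referencing takes place; the confidence threshold and length check below are assumptions for illustration, not recommended values.

```python
def accept_vrm_read(ocr_vrm: str, ocr_confidence: float,
                    min_confidence: float = 0.95) -> bool:
    """Reject low-confidence plate reads rather than risk a mismatch.

    The 0.95 threshold is purely illustrative; a real deployment would
    set it from measured error rates for its cameras and algorithm.
    """
    # A UK VRM is short; an empty or implausibly long read is a failure.
    plausible_length = 2 <= len(ocr_vrm.replace(" ", "")) <= 7
    return plausible_length and ocr_confidence >= min_confidence

# Only reads that pass this gate should be cross-referenced against
# other databases, reducing the risk of misidentifying a vehicle.
```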

If you intend to share the personal data you process with third parties you need to make sure that doing so is lawful. We also advise you to have a data sharing agreement in place. This agreement should ensure that you have appropriate safeguards in place to keep the information secure, and that the amount of information you share is limited to what is necessary. Read further guidance about data sharing in our data sharing code.

You also need to have appropriate retention periods in place for the personal data you collect and store through your ANPR system. The retention periods should be consistent with the purpose you are collecting the data for. You should only keep the data for the minimum period necessary and should delete it once you no longer need it. For example, this could apply to personal data stored for vehicles that are no longer of interest.

Example

A gym uses an ANPR system that processes VRMs to monitor use of its car park, which has a two-hour parking limit. The system retains the details gathered for cars that have exceeded the time limit, but also for those that have not.

It is likely there is no need to retain information for an extended period for vehicles that have adhered to the time limit. It would be unnecessary and excessive to do so, unless there was a justifiable reason. Without one, the extended retention of this information is unlikely to comply with the data protection principles.

The gym would need to amend the system to ensure that they delete any information about vehicles that are not of interest, as soon as appropriate.
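As an illustrative sketch only, deletion logic of this kind might look as follows; the record fields and the one-day grace period are assumptions, not prescribed retention rules.

```python
from datetime import datetime, timedelta

def purge_compliant_vehicles(records: list[dict],
                             grace: timedelta = timedelta(days=1)) -> list[dict]:
    """Keep only ANPR records that are still needed.

    Assumes each record is a dict with 'exceeded_limit' (bool) and
    'captured_at' (datetime); both names and the grace period are
    illustrative choices for this sketch.
    """
    now = datetime.now()
    return [
        r for r in records
        if r["exceeded_limit"]                  # still of interest
        or now - r["captured_at"] < grace       # too recent to judge
    ]
```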

Signage

In keeping with the principle of fairness and transparency, it is important that you inform individuals that you are processing their personal data. The best way to do this is through clear and visible signage explaining that ANPR recording is taking place and, where possible, the name of the controller collecting the information. While it is a challenge to inform motorists that they are being overtly monitored, there are methods you can use, such as physical signs at entrances and posts on official websites and social media.

You must provide appropriate signs to alert drivers to the use of cameras on the road network or in areas that vehicles have access to, such as car parks. It is important that these signs do not affect the safety of road users. You should consider the amount of time the driver will have to read the information you provide, particularly where the road has a high speed limit.

Signs must make clear that cameras are in use and explain who is operating them. This means that individuals know who holds information about them and therefore have the opportunity to make further enquiries about what is happening with their data.

Further reading

We have further guidance on ‘what is personal data’.

Body Worn Video (BWV)

Checklist

☐ We have conducted a Data Protection Impact Assessment (DPIA) that fully addresses our use of BWV, and addresses any impact on the rights and freedoms of individuals whose personal data are captured.

☐ We provide sufficient privacy information to individuals before we use BWV, such as clear signage, verbal announcements or lights/indicators on the device itself, and we have readily available privacy policies.

☐ We train any staff using BWV to inform individuals that recording may take place if it is not obvious to individuals in the circumstances.

☐ We have appropriate retention and disposal policies in place for any footage that we collect.

☐ We have efficient governance procedures in place to be able to retrieve stored footage and process it for subject access requests or onward disclosures where required.

☐ We have the ability to efficiently and effectively blur or mask footage, if redaction is required to protect the rights and freedoms of any third parties.

☐ We comply with the Surveillance Camera code of practice where required.

Body Worn Video (BWV) involves the use of cameras that are worn by a person, and are often attached onto the front of clothing or a uniform. These devices are capable of recording both visual and audio information. Due to BWV’s increasing affordability, many different organisations in the public and private sectors can purchase and use such equipment.

BWV can capture footage and audio in close proximity to individuals, and can be used to record in new or novel ways, for example face-to-face on doorsteps, on public transport, or inside buildings such as homes and shops. This type of surveillance therefore has the potential to be more intrusive than conventional CCTV systems, and its versatility increases the risk of privacy intrusion to individuals.

Before you decide to procure and deploy such a system, it is very important that you justify its use and consider whether it is necessary, proportionate and addresses a particular need. If you are going to use audio recording as well as visual recording, the collection of audio and video needs to be justifiable. The use of BWV therefore requires you to undertake a DPIA.

BWV devices have the ability to be switched on or off, but it is important to know when and when not to record. Continuous recording requires strong justification as it is likely to:

  • be excessive; and
  • capture others going about their daily business, as well as the individual who is the focus of your attention.

Some BWV devices offer the ability to continuously buffer recording, so if you turn it on it may also have recorded the previous few seconds. It is important that you ensure any buffered recording is not excessive, and you only record the amount of footage you intend to.
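As an illustration, a fixed-length buffer is one way to guarantee that pre-recording can never exceed the few seconds you have justified; the frame rate and buffer length below are assumptions for this sketch.

```python
from collections import deque

FPS = 30
BUFFER_SECONDS = 3  # illustrative: keep only a few seconds of pre-record

# A fixed-length deque discards the oldest frames automatically, so the
# pre-record buffer can never grow beyond the amount you intend to keep.
pre_record_buffer = deque(maxlen=FPS * BUFFER_SECONDS)

def on_new_frame(frame: bytes, recording: bool, saved_frames: list) -> None:
    if recording:
        saved_frames.append(frame)
    else:
        # Not recording: the frame only enters the bounded buffer and is
        # overwritten unless recording starts within BUFFER_SECONDS.
        pre_record_buffer.append(frame)

def start_recording(saved_frames: list) -> None:
    # Flush the short buffer so the saved clip includes the lead-up,
    # and nothing older than the justified buffer window.
    saved_frames.extend(pre_record_buffer)
    pre_record_buffer.clear()
```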

Remember that the presence of audio recording adds to the privacy intrusion. You will require further justification if you are thinking about recording in more sensitive areas, such as private dwellings, schools and care homes. In such circumstances, the need will have to be far greater in order for the use of BWV systems to be proportionate. The operator will need to provide more evidence to support its use in these situations.

Example

It may be appropriate for a security guard to switch on their BWV camera when they believe an individual is being aggressive towards them. However, it may not be appropriate to switch it on when an individual is merely asking for directions.

If you want to use both video and audio recording, the most privacy-friendly approach is to purchase a system where you can turn each on and off independently. You should consider these two types of data processing as separate data streams, and control them separately to ensure that you do not process irrelevant or excessive data. It is important that you identify a BWV system that can be controlled in this way at the procurement stage.

If your BWV system cannot record audio and video separately, you should only use it when you can justify the recording of audio and video together in the circumstances.

Providing privacy information

If you use BWV systems, you should be able to provide sufficient privacy information to individuals. As BWV cameras can be quite small or discreet, and could be recording in fast-moving situations, individuals may not be aware that they are being recorded.

You should think of ways to provide further information to individuals in order to make them aware of recording. For example, you should:

  • inform or verbally announce to individuals that the recording of video or audio or both is about to take place, and the reasons why, prior to turning on the BWV device;
  • place visible signage or a warning light on the device or uniform, to indicate that the device is switched on and recording; or
  • if more appropriate in the circumstances, direct individuals to the privacy notice on your website, if you have one.

Security and governance

Due to the versatility of the technology and the specific circumstances where they can be used, BWV cameras can also process special category data that could be more sensitive. Processing special category data can increase the risks to individuals. This means you are more likely to need to do a DPIA for the processing. Due to the nature of this data, it is important that you have appropriately assessed the level of risk involved and implemented robust technical and organisational measures, including physical security, to mitigate them. Read further guidance about special category data.

For example, you should consider the use of encryption, whether this involves the device itself or the storage medium. (Where this is not appropriate, you should have other ways of preventing unauthorised access to information.) In addition, you should consider designs with robust technical security measures, for example BWV devices that do not have removable memory cards, to further reduce the risk of loss or compromise of data if a device is stolen or misplaced.
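As an illustration, footage might be encrypted at rest along the following lines; this is a minimal sketch using the open-source cryptography library, and key management is deliberately simplified (a real deployment would keep the key in a key store, not alongside the footage).

```python
from cryptography.fernet import Fernet  # pip install cryptography

# Generating the key inline is purely for illustration; in practice it
# would be provisioned and stored separately from the device.
key = Fernet.generate_key()
fernet = Fernet(key)

def store_footage(raw_clip: bytes, path: str) -> None:
    """Write footage to disk encrypted, so a lost or stolen device or
    memory card does not expose the recording."""
    with open(path, "wb") as f:
        f.write(fernet.encrypt(raw_clip))

def load_footage(path: str) -> bytes:
    """Decrypt a stored clip for authorised retrieval."""
    with open(path, "rb") as f:
        return fernet.decrypt(f.read())
```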

The governance of the information that you collect is particularly important, as BWV cameras can process information in isolation, or as part of a larger workflow. You therefore need to make appropriate decisions about the retention and disposal of information, alongside retrieval of information and staff training. For example, staff training can be tailored for individuals who use the cameras. This could include knowing when to record, processing recorded information safely and securely, and responding to queries and requests from the general public.

You need to ensure that you can securely store all of the data you capture and have appropriate policies in place for the storage. The policy must set out:

  • how long you should keep the information for (this should be the shortest time necessary for your purpose); and
  • how and when to dispose of it appropriately once you no longer require it.

You should store the information so that you can easily identify, locate and retrieve recordings relating to a specific individual or event. You should also store it in a way that remains under your control, retains the quality of the original recording and is adequate for the purpose you originally collected it for.

If you will be regularly sharing recorded information with a third party, then we advise you to have a data sharing agreement in place. For further guidance, see our data sharing code.

You also need to consider any steps you must take when individuals exercise their rights, particularly when doing so could affect the rights of others. For example, responding to a subject access request for footage that involves individuals other than the one making the request. This may require you to apply video and audio redaction techniques in some circumstances. Some techniques may include blurring, masking, or using a solid fill to completely obscure parts of the footage. For further information see the dedicated sections in this guidance on redaction and responding to subject access requests.
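As an illustration, blurring and solid-fill masking of a frame region might look like the sketch below, using the open-source OpenCV library; it assumes frames are standard image arrays and that the regions to redact have already been identified, whether manually or by a detector.

```python
import cv2  # pip install opencv-python

def blur_region(frame, x, y, w, h, strength=51):
    """Blur one rectangular region, e.g. a third party's face.

    `strength` must be odd for the Gaussian kernel; 51 is illustrative.
    """
    roi = frame[y:y + h, x:x + w]
    frame[y:y + h, x:x + w] = cv2.GaussianBlur(roi, (strength, strength), 0)
    return frame

def mask_region(frame, x, y, w, h):
    """Solid-fill a region where blurring would still leave someone
    identifiable from context."""
    cv2.rectangle(frame, (x, y), (x + w, y + h), color=(0, 0, 0), thickness=-1)
    return frame
```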

Unmanned Aerial Systems (UAS) / Drones

Checklist

☐ We have considered whether there is a genuine need for us to use a drone, if alternative systems or methods of surveillance are not suitable to solve a particular problem.

☐ We have conducted a Data Protection Impact Assessment (DPIA) which includes the risks associated with recording at altitude, and capturing footage of individuals that are not intended to be the focus of our surveillance.

☐ We have registered our drone if the system falls within the specific criteria set by the Civil Aviation Authority (CAA). See the CAA website for further details about registration.

☐ We have robust policies and procedures in place for the use of drones, and our operators are appropriately trained, with documented credentials.

☐ We inform individuals that we are using a drone where possible, and we have an accessible privacy notice that individuals can read to learn more about our use.

☐ We comply with the Surveillance Camera code of practice where required.

What are Unmanned Aerial Systems (UAS) / Drones?

Drones are otherwise known as Unmanned Aerial Systems (UAS), Unmanned Aerial Vehicles (UAVs) and Remotely-Piloted Aircraft Systems (RPAS). They are lightweight unmanned aircraft controlled by operators, onboard computers, or a combination of the two. Drones can be used in many innovative ways, for example for photography, the geographical mapping of an area or searching for missing people. But they have raised privacy concerns due to their manoeuvrability and their enhanced capabilities for taking photos and videos and sensing the environment. Using drones can result in the collection, use, or sharing of personal data, including information about individuals who are not the intended focus of the recordings.

Smaller models, in particular, can be easily purchased online or on the high street by businesses and members of the public. There is a distinction between individuals who can be considered 'hobbyists', and are therefore generally using their device for purely personal or household activities that fall outside the scope of data protection law, and individuals or organisations who use the device for professional or commercial purposes.

In contrast, organisations using drones are clearly controllers for any personal data that the drone captures, and therefore are required to comply with data protection law.

Providing privacy information

A key issue with using drones is that, on many occasions, individuals are unlikely to realise they are being recorded or be able to identify who is in control. If you are a controller, you must address the challenge of providing privacy information if you decide to purchase and use such surveillance systems.

You need to come up with innovative ways of providing this information to individuals whose information is recorded, and be able to justify your approach. Or, if doing that is very difficult or would involve disproportionate effort, document this information in a way that is readily available. Some examples could involve:

  • formally registering your drone with the Civil Aviation Authority (CAA);
  • placing signage in the area you are operating a drone explaining its use; and
  • having a privacy notice on a website that you can direct people to, or some other form of privacy notice, so individuals can access further information.

How do we record information responsibly?

The use of drones has the potential for ‘collateral intrusion’ by recording images of other individuals unnecessarily. This can therefore be privacy intrusive. For example:

  • The likelihood of recording individuals inadvertently is increased, because of the height drones can operate at and the unique vantage point they afford.
  • Individuals may not always be directly identifiable from the footage captured by drones, but can still be identified through the context they are captured in or by using the device’s ability to zoom in on a specific person.

As such, it is very important that you can provide a strong justification for their use. Performing a robust DPIA will help you decide if using a drone is the most appropriate method to solve a problem that you have identified.

It is important that you can switch on and off any recording or streaming system on a drone, when appropriate. Unless you have a strong justification for doing so, and it is necessary and proportionate, recording should not be continuous. You should look at this as part of your DPIA.

As always, you should consider the wider context you are using the drone in, rather than just the use of the drone itself. For example, does it connect or interface with other systems? You should also ensure that you store any data you have collected securely, for example by using encryption or another appropriate method of restricting access to the stored information. This is particularly important if the drone is piloted beyond visual line of sight or crashes, and there is potential for the device and the data to be lost or stolen as a result. You should also ensure that you retain data for the shortest time necessary for its purpose and dispose of it appropriately when you no longer require it.

You may be able to reduce the risk of intrusion on others by incorporating data protection by design methods. For example, you may be able to procure a device that has a restricted field of vision, or one that only records after the drone has reached a certain altitude. You can incorporate data protection by design and default into your DPIA, and it can form part of your procurement process.
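As an illustration of data protection by design, a recording function gated on altitude might look like the sketch below; the threshold value and the interface are assumptions for illustration only.

```python
MIN_RECORDING_ALTITUDE_M = 30.0  # illustrative threshold, not a rule

class DroneCamera:
    """Sketch of a camera that only records above a set altitude,
    reducing the chance of capturing identifiable bystanders."""

    def __init__(self) -> None:
        self.recording = False

    def on_telemetry(self, altitude_m: float) -> None:
        # Recording is gated on altitude rather than left running
        # continuously, as a data-protection-by-design measure.
        self.recording = altitude_m >= MIN_RECORDING_ALTITUDE_M

    def on_frame(self, frame: bytes, storage: list) -> None:
        if self.recording:
            storage.append(frame)  # frames below the threshold are dropped
```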

Do I have to register my drone?

Subject to certain permissions and exemptions for certain users, there are UK requirements for people who fly or are responsible for small unmanned aircraft, including drones and model aircraft. Further information about drone registration in the UK can be found on the CAA’s Drone Safe website.

Example

A local authority wishes to deploy a drone over a seaside resort to monitor public beaches for crowd movement and littering. Naturally, any visitors to the beaches may not reasonably expect to be recorded, especially if they are swimming, sunbathing or there are children present.

The local authority needs to make a strong justification for any recording, based on the sensitivity of the processing. They should take a risk-based approach by carrying out a DPIA before using the technology. This will help assess necessity and proportionality.

If recording does occur in a manner that is compliant with individuals' rights and aviation rules, the local authority is required to provide the general public with appropriate information about the recording. This includes information about who is responsible, how to contact them, and how individuals can exercise their rights if needed.


Example

A building surveyor uses a drone in a residential area to inspect damage to a roof. The surveyor wishes to use a drone because the high resolution images allow for a safer and more cost effective way of working.

In keeping with the principles of data protection law, the surveyor makes a risk-based assessment prior to deployment. They assess how to fly the drone in a way that does not affect the rights and freedoms of individuals. In order to prevent the unintended filming of residents, the surveyor only begins recording at altitude, and does not record any other private property, with the focus being on the roof.

The surveyor also ensures that, where possible, they provide individuals with links to their privacy information or website via temporary signage, and that any operators are fully trained and registered in keeping with Civil Aviation Authority (CAA) requirements.

Facial Recognition Technology (FRT) and surveillance

Checklist

☐ We have conducted a Data Protection Impact Assessment (DPIA) that fully addresses our need to use Facial Recognition Technology (FRT), the lawful basis for its use and explores the impacts on the rights and freedoms of individuals whose personal data are captured for every deployment.

☐ We fully document our justification for the use of FRT, and the decision-making behind these justifications, and they are available on request.

☐ We have ensured that a sufficient volume and variety of training data has been included to assist accurate performance.

☐ We have chosen an appropriate resolution for the cameras we use, and we have carried out full testing of the equipment.

☐ We have positioned our cameras in areas with sufficient lighting, to ensure good quality images are taken to assist accurate recognition.

☐ We are able to clearly identify false matches and true matches.

☐ We are able to record false positive or false negative rates where appropriate.

☐ We are able to amend the system to correct false positive or false negative rates that are considered to be too high.

☐ We ensure any watchlists we use are constructed in a way that is compliant with data protection law.

☐ We have considered whether an Equality Impact Assessment (EIA) is required to fulfil our obligations under the Equality Act 2010.

☐ We comply with the Surveillance Camera code of practice where required.

What is facial recognition technology?

Facial recognition technology identifies or otherwise recognises a person from a digital facial image. Cameras are used to capture these images, and facial recognition software measures and analyses facial features to produce a biometric template. This typically enables the user to identify, authenticate or verify, or categorise individuals. Often, the software, which incorporates elements of artificial intelligence (AI), algorithms and machine learning, estimates the degree of similarity between two facial templates to identify a match, for example to verify someone's identity, or to place a template in a particular category (eg age group).
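As an illustration, comparing two facial templates often reduces to a similarity score checked against a threshold, along the lines of the sketch below; the cosine measure and the 0.8 threshold are assumptions, not properties of any particular product.

```python
import numpy as np

def similarity(template_a: np.ndarray, template_b: np.ndarray) -> float:
    """Cosine similarity between two facial templates (feature vectors)."""
    return float(np.dot(template_a, template_b) /
                 (np.linalg.norm(template_a) * np.linalg.norm(template_b)))

def is_match(template_a: np.ndarray, template_b: np.ndarray,
             threshold: float = 0.8) -> bool:
    # The threshold trades false positives against false negatives;
    # 0.8 is illustrative and must be tuned and tested per system.
    return similarity(template_a, template_b) >= threshold
```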

FRT can be used in a variety of contexts, from unlocking our mobile phones, to setting up a bank account online, or passing through passport control. It can help make aspects of our lives easier, more efficient and more secure.

The concept may also be referred to using terms such as automatic or automated facial recognition (AFR) or live facial recognition (LFR), which is a type of FRT often used in public spaces in real time.

Depending on the use, FRT involves processing personal data, biometric data and, in the vast majority of cases seen by the ICO, special category personal data. Biometric data is a particular type of data that has a specific definition in data protection law.

"Biometric data" means personal data resulting from specific technical processing relating to the physical, physiological or behavioural characteristics of a natural person, which allow or confirm the unique identification of that natural person, such as facial images or dactyloscopic (fingerprint) data, as defined at Article 4(14) UK GDPR.


Under the UK GDPR, processing biometric data for the purpose(s) of uniquely identifying an individual is prohibited unless a lawful basis under Article 6 and a condition in Article 9 can be satisfied. Five of the conditions for processing are provided solely in Article 9 of the UK GDPR. The other five require authorisation or a basis in UK law. This means you need to meet additional conditions set out in section 10 and Schedule 1 of the DPA 2018, depending on the Article 9 condition relied upon. Read further guidance about special category data.

Further detailed information can be found in the Information Commissioner’s published Opinion about the use of facial recognition technology in public spaces.

How does facial recognition in public spaces work?

Common uses of FRT, such as unlocking our mobile phones, typically involve a "one-to-one" process. This means that the individual participates directly and is aware of why and how you are using their data. Live facial recognition in public spaces is different, and is typically deployed in a similar way to traditional CCTV. This means it is directed towards whole spaces rather than specific individuals. In its most simple form, the face of an individual is scanned and cross-referenced with images from a 'watchlist' in order for you to determine a match. A watchlist is a bespoke gallery of individuals that could include authorised visitors, people banned from particular premises, or in some cases wanted criminals.

After a facial match is suggested by the system, human intervention is commonly required to assess whether the match is correct. This enables you to determine the appropriate response. The level of human intervention required can vary based on the use of the system and the risk of harm to individuals. For example, meaningful human intervention could involve deciding whether to stop an individual in a public space. In contrast, for organisations granting physical access into a premises or secure facility, human intervention may only be required to ensure the system works correctly, or allow for a second opinion.

It is likely that most systems will have an element of human decision-making built into the process. But Article 22 of the UK GDPR establishes stricter conditions for systems that make solely automated decisions (ie those without any human input). Systems that only support or enhance human decision-making are not subject to these conditions, but you must ensure that any human input to your processing is active and has a meaningful influence on the outcomes. See our guidance on automated decision making.

Example

A business wishes to trial the use of live facial recognition on a large crowd of people at the entrance to a concert, in order to improve security. The faces of individuals would be scanned at the entrance, and then cross-referenced with a watchlist of persons of interest. A staff member or officer would review and scrutinise the suggested matches from the system, prior to stopping or questioning any individuals.
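As an illustrative sketch of the review step in the example above, the system only proposes matches and a person actively confirms or rejects each one before any action is taken; the structure and names below are assumptions.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class SuggestedMatch:
    live_image_id: str
    watchlist_entry: str
    score: float

def review_queue(suggestions: List[SuggestedMatch],
                 human_decision: Callable[[SuggestedMatch], bool]) -> List[SuggestedMatch]:
    """Only act on matches a human reviewer has actively confirmed.

    `human_decision` is any callable returning True/False after a person
    has scrutinised the suggestion -- the system itself never decides.
    """
    return [s for s in suggestions if human_decision(s)]
```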

FRT can also be used retrospectively, in order to identify an individual from old footage or photographs. This is in contrast to using FRT in real-time in order to locate an individual in a live public setting. It is still very important to take the principles of data protection law into consideration. For example, you should ensure that the images you use for such retrospective processing are:

  • obtained lawfully;
  • used for defined and limited purposes;
  • accurate; and
  • not retained for longer than is necessary.

When using FRT and considering your compliance with the data protection principles, it is particularly important that you recognise and understand the potentially intrusive nature of the technology.

In terms of accountability, when using FRT you must be able to provide a clear explanation of:

  • the lawful basis you are relying on;
  • why you consider the use of FRT necessary in the circumstances or in the public interest;
  • why you have ruled out less intrusive options;
  • your assessment of the likelihood that the objectives of FRT (and associated processing) will be met; and
  • how you have measured its effectiveness.

In all sectors, any processing you undertake as a result of deploying FRT is likely to result in a high risk to individuals’ information rights. You should see a DPIA as a living document that you complete, update or review prior to every deployment. This means you are able to demonstrate that you have considered the risks to the rights and freedoms of individuals.

You may also wish to consider whether an Equality Impact Assessment (EIA) is required to fulfil your obligations under the Equality Act 2010.

How do we mitigate bias or demographic differentials?

FRT will typically incorporate machine learning and artificial intelligence. These types of systems learn from data, but this does not guarantee that their outputs will be free of discriminatory outcomes. Both developers and controllers should be mindful about the data used to train and test these systems, as well as the way they are designed and used. This is because these factors may cause them to treat certain demographics less favourably, or put them at a relative disadvantage. For example, this may be based on characteristics such as gender, race or ethnicity.

As a controller you should determine and document your approach to bias and demographic differentials from the very beginning of any use of FRT. This means that you can put appropriate safeguards and technical measures in place during the design of the FRT system.

You should also establish clear policies and procedures surrounding the data which you use to train or pilot systems. You should ensure that the data is sufficiently diverse in order to represent the population the FRT system will be used on. In order for an FRT system to be processing personal data accurately, the output of the system should be the best possible match to the facial image in question. However, this can be a significant challenge when we consider:

  • the varying quality of images that can be captured;
  • the capabilities of the algorithm used; and
  • the ways that faces can be obscured or changed.

Your DPIA should explain how you have implemented effective mitigating measures, including matters relating to bias.
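As an illustration, false positive and false negative rates can be computed per demographic group from documented test outcomes with known ground truth, along the lines of the sketch below; the record fields are assumptions for this sketch.

```python
from collections import defaultdict

def error_rates_by_group(outcomes: list[dict]) -> dict:
    """Compute false positive and false negative rates per group.

    Assumes each outcome is a dict with keys 'group', 'predicted_match'
    and 'actual_match' (illustrative names), built from test data with
    known ground truth.
    """
    counts = defaultdict(lambda: {"fp": 0, "fn": 0, "neg": 0, "pos": 0})
    for o in outcomes:
        c = counts[o["group"]]
        if o["actual_match"]:
            c["pos"] += 1
            if not o["predicted_match"]:
                c["fn"] += 1  # missed a genuine match
        else:
            c["neg"] += 1
            if o["predicted_match"]:
                c["fp"] += 1  # flagged someone incorrectly
    return {
        g: {"false_positive_rate": c["fp"] / c["neg"] if c["neg"] else 0.0,
            "false_negative_rate": c["fn"] / c["pos"] if c["pos"] else 0.0}
        for g, c in counts.items()
    }
```

Comparing these rates across groups is one way to surface demographic differentials that your DPIA and mitigating measures should address.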

Further, before you procure an FRT system, you should engage with manufacturers, vendors and software developers to explore how they have prevented technical bias within their systems. This will help ensure that their products will allow you to comply with the requirements of data protection law.

How do we determine if FRT is necessary and proportionate?

It is not possible to provide an exhaustive list of all scenarios where the processing of personal data by FRT could be regarded as necessary and proportionate. You need to be able to demonstrate and document each case on its own merits. You are expected to clearly articulate the lawful use of FRT systems as part of a DPIA and “appropriate policy document”, where required by the UK GDPR and DPA 2018. This applies to the use of FRT in both public and private sectors.

Further detailed information, particularly about necessity and proportionality, can be found in the Information Commissioner’s published Opinion about the use of facial recognition technology in public spaces.

The context in which you are using such systems is a key consideration when you determine whether your use of FRT is appropriate. For example, in shopping centres, you still need to be able to strongly justify that your use of FRT is lawful and necessary to achieve your outcome, and that you could not do so using less privacy-intrusive methods. It may be more difficult for you to justify processing images of large numbers of individuals to identify only a few, where the need to do so or the public interest is not justifiable or realistic.

If you are relying on consent to use FRT, for that consent to be valid you must ensure that you give individuals a fully informed and freely given choice whether or not to be subject to such processing. In practice, consent could prove very difficult to obtain especially in circumstances where you are using FRT on multiple individuals in public spaces. In cases where you cannot obtain appropriate consent, you must identify an alternative lawful basis to use the system on individuals.

You must also ensure that the use of FRT does not lead to individuals suffering detriment. For example, if you use FRT for authentication purposes, you should provide an alternative way for individuals to use a service if they do not wish to participate or consent to facial recognition processing. This could involve individuals using a unique key code or an alternative route to enter a premises.

Example

A gym introduces a facial recognition system to allow members access to the facilities. It requires all members to agree to facial recognition as a condition of entry – there is no other way to access the gym. This is not valid consent as the members are not being given a real choice – if they do not consent, they cannot access the gym. Although facial recognition might have some security and convenience benefits, it is not objectively necessary in order to provide access to gym facilities, so consent is not freely given.

However, if the gym provides an alternative, such as a choice between access by facial recognition and access by a membership card, consent could be considered freely given. The gym could rely on explicit consent for processing the biometric facial scans of the members who indicate that they prefer that option.

Smart doorbells (commercial use)

Checklist

☐ We have conducted a Data Protection Impact Assessment (DPIA) that fully addresses our need to use smart doorbells, the lawful basis for their use and explores the impacts on the rights and freedoms of individuals whose personal data are captured.

☐ We appropriately position our smart doorbell in such a way that the camera does not inadvertently record neighbouring entrances or private property that are not the intended subject of surveillance.

☐ We ensure that any footage we record is kept securely and is appropriately governed in terms of retention, security, disclosure and access.

☐ We ensure that any smart doorbell apps or associated software we use are secure and kept up-to-date with the latest patches, by checking communications from the manufacturer or vendor.

☐ We place appropriate signage to inform individuals that surveillance is in use where doorbells are located.

☐ We limit any continuous recording, and the possible intrusion of others, by only having the camera activate when the doorbell is pressed.

☐ We comply with the Surveillance Camera code of practice where required.

What is a smart doorbell?

Smart doorbells occupy the position of traditional doorbells and incorporate cameras that can capture images, and sometimes audio, of individuals visiting or leaving your premises. Some designs offer a high-definition field of vision of up to 180 degrees, which can allow you to see individuals at the door from head to foot.

Often, smart doorbells also connect to a mobile app which can allow you to:

  • view a live feed from a remote device;
  • save and edit any recorded footage; or
  • even potentially share recorded footage with others such as insurance companies or law enforcement.

If you are considering using a smart doorbell for business purposes, it is important that you think about the capabilities of the camera prior to its use. For example, the field of vision it offers and the potential privacy intrusion of others.

As with most video surveillance systems, it is important that you carry out a DPIA that fully addresses:

  • your need to use smart doorbells;
  • the lawful basis for their use; and
  • the impacts on the rights and freedoms of individuals whose personal data are captured.

Some models of smart doorbells can also be equipped with facial detection and recognition technologies, which give you the capability to automatically identify an individual at your door. For example, the software may assist access through the door itself by verifying the identity of a visitor. Or, more commonly, it may alert you via a mobile app when a specific person of interest visits the property.

If your system is designed to uniquely identify individuals through technologies like facial recognition, then you are also likely to be processing special category data. Therefore, you need to have further protection and safeguards in place. See our guidance on special category data.

It is important that you position your smart doorbell in such a way that the camera does not inadvertently record neighbouring entrances or private property that are not your intended subject of surveillance.

Example

The owners of a private office building wish to install a smart doorbell at their entrance to increase security, so reception staff only let in recognised clients.

The office entrance is located directly opposite another private building, and it is possible for the camera to see into the windows of the other premises.

The business owners ensure that the smart doorbell is positioned side on, in such a way that the field of vision only captures those that stand at the doorway. In addition, the owners ensure that the camera is only activated when the doorbell is pressed, so they are not using continuous recording.

They also place a sign that reminds visitors about the use of a surveillance system at the entrance to the office building.
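As an illustrative sketch, press-triggered recording of the kind described in this example might work as follows; the clip length is an assumption, not a prescribed limit.

```python
import time

CLIP_SECONDS = 20  # illustrative clip length, not a prescribed limit

class SmartDoorbell:
    """Sketch of press-triggered recording: the camera is idle until the
    bell is pressed, avoiding continuous surveillance of the street."""

    def __init__(self) -> None:
        self.recording_until = 0.0

    def on_press(self) -> None:
        # Record only for a short, fixed window after each press.
        self.recording_until = time.time() + CLIP_SECONDS

    def on_frame(self, frame: bytes, storage: list) -> None:
        if time.time() < self.recording_until:
            storage.append(frame)  # frames outside the window are discarded
```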

Surveillance in vehicles

Checklist

☐ We have conducted a Data Protection Impact Assessment (DPIA) that fully addresses our use of surveillance in vehicles, and explores the rights and freedoms of both drivers and passengers whose personal data could be captured by the system.

☐ We have clear and informative signage in place within vehicles to let drivers and passengers know when video surveillance systems may be used, and who to contact in the event of a query.

☐ We only use audio recording in exceptional circumstances, and this is switched off by default if the feature forms part of the installed surveillance system.

☐ We have efficient governance procedures in place, such as for the retention of information, and we are able to retrieve stored footage and process it for subject access requests or onward disclosures to third parties where appropriate.

☐ We comply with the Surveillance Camera code of practice where required.

How is surveillance in vehicles used?

Surveillance systems in vehicles, such as inward or outward facing cameras, are often small systems that are designed to be mounted in cars (or on bikes, most often on the rider’s helmet). They record footage of the journey and any incidents that might occur.

While other commercially available action cameras allow for mounting in a car and for footage to be recorded, some surveillance systems are specifically designed for in-car use. They have features such as GPS technology, and allow footage to be saved automatically to aid an investigation in the event of a crash or sudden stop.

If you are considering using surveillance systems within any of your vehicles, you should consider