Latest updates - last updated 21 August 2023
21 August 2023 - We have added "Who is the controller for information entered on an external mobile application?" to the Controllers or processors? section.
- Is it personal data?
- Controllers or processors?
- Lawful basis
- Is this direct marketing?
- Lawful basis in generative AI
- Restricted transfers
- Special category data
- Cloud storage
- Data sharing
- Effective anonymisation
Is it personal data?
Q. Is a number plate, also known as a vehicle registration mark (VRM), considered personal data?
Context: The business is planning to use drones and other video camera technology to monitor road traffic and measure rates of pollution. This would involve capturing VRMs.
Answer: Yes - a VRM is personal data if it can be combined with other information that distinguishes an individual and allows them to be identified.
Whether the VRM is considered personal data will depend on why it is being collected and used (the purpose). For example, if the business monitored the length of time cars spend inside and outside of defined spaces, this may require linking the VRMs with other information and thus singling out individuals. Even if the business did not have access to, for example, the DVLA registered keepers database, it is still likely to be able to identify individual vehicles, their age and where they were registered.
Next steps:
- Consider the definition of personal data under UK GDPR.
- Consider if the VRM can be combined with other information that allows for an individual to be identified.
- Consider the purpose of processing the VRM.
- Decide if VRMs are anonymous in the context of your purposes.
- Decide if you need to complete a DPIA.
Additional advice:
If the information is made available to third parties, the purpose of the processing by the third party will determine whether the VRMs are personal information.
Controllers or processors?
Q. Are the providers of deep learning AI solutions processors or controllers?
Context: A UK business (Business ABC) wanted to know whether a third-party supplier offering a deep learning AI solution to them would be considered a processor or a controller for the purpose of providing an artificial intelligence led service to their customers.
Answer: If the third-party supplier (the AI developer) acts under Business ABC’s instruction, they are a processor.
If the AI solution provider processes personal data for any purpose other than those they have been instructed to carry out, or where domestic law requires them to do so, then they will be a controller or joint controller for this processing. For example, where they process personal data to build another model.
Next steps:
- Review our guidance on controllers and processors.
- Create clear roles and responsibilities for the different processing activities within the contract between you and the AI developer.
- Notify individuals of the processing activities through clear and easy to access privacy information to make sure transparency requirements are being met.
- Review the lawful basis being relied on.
- Review how individuals can enact their rights.
- Consider the possibility of de-identification techniques before personal data is shared with the AI developer.
Q: Who is the controller for information entered on an external mobile application?
Context: The organisation is in the health sector and wants to use external mobile applications to assist with monitoring patients’ chronic conditions and making clinical decisions about them. The organisation advises that it has limited control over what personal information is entered onto the applications.
Answer: The organisation is likely to be a controller for the information a patient enters onto a mobile application if it recommends the use of these apps to patients, or agrees to view and use the information entered onto them for clinical purposes. The organisation has made the decision to use the apps, and will be using the personal information entered on them to make decisions about the patients as a result.
The relationship between the organisation and the provider of the app could come under a number of different controller relationships. Whether an organisation is a controller, joint controller or processor will depend on the specific use of personal information.
If the app provider will not be using the information for their own purposes, and is merely processing it on behalf of another organisation, then they will qualify as a processor, even if they have determined some non-essential means of processing.
If the app provider is looking to use the information for their own purposes, this may be either a controller to controller relationship, or they may be joint controllers with the organisation. To be joint controllers, the app provider and the organisation will need to be processing for the same or closely linked purposes, and have jointly determined how and why they are using the personal information. If they have not jointly determined the purposes and means, but the app provider intends to use the personal information for their own purposes, then the app provider would be a separate controller.
Organisations should assess the risks and benefits of using any external applications, and will need to ensure the appropriate data sharing agreements or contracts are in place, depending on whether the app provider is a processor, controller or joint controller.
Next steps:
- Review the ICO’s guidance on controllers and processors.
- Consider whether a DPIA needs to be completed.
- Ensure users of the app are provided with adequate privacy information.
Lawful basis
Q. What lawful basis should we rely on and what privacy enhancing techniques should we use when collecting, processing and storing employee diversity data?
Context:
The organisation is planning voluntary employee diversity monitoring surveys in-house. The surveys will not collect any directly identifiable information (eg employee names), but the organisation is concerned that, if the information collected is combined with information it already holds about employees, it might identify them. As some of the data collected will be special category data, the organisation wants to ensure it uses the right lawful bases and to understand how to effectively anonymise or pseudonymise the data.
Answer:
Lawful basis
If the organisation can demonstrate that they meet the conditions for valid consent, they can rely on consent as their Article 6 lawful basis, and explicit consent as the condition for the processing of special category data.
However, there is a power imbalance in the employer-employee relationship. Employees may feel obligated to agree to the processing as they’re concerned about negative impacts if they don’t. Organisations need to be conscious of this when relying on consent to process employee information and take steps to ensure that the employee does not feel any pressure to consent. They should also allay any concerns over the consequences of refusing consent.
If the organisation is unable to demonstrate valid consent, it could rely on legitimate interests as the Article 6 lawful basis and a substantial public interest condition for the processing of special category data.
Choosing which lawful basis applies depends on your specific purposes and the context of the processing. For example, your purpose may relate to a legal obligation or performing your public tasks.
Privacy protections
Our draft guidance on anonymisation explains that data protection law does not require anonymisation to be completely risk-free. However, you must be able to mitigate the risk of re-identification until it is sufficiently remote that the information is ‘effectively anonymised’.
Anonymisation means that individuals are not identifiable and cannot be re-identified by any means reasonably likely to be used. Anonymisation processes should take into account the concept of identifiability in its broadest sense, and should not simply focus on removing obvious information that clearly relates to someone. The removal of direct identifiers such as a name or an identification number is insufficient to ensure effective anonymisation.
If the organisation can identify an individual by combining the survey information with other data it holds, the information is pseudonymous, not anonymous. This means the survey data is subject to data protection legislation and the organisation should consider encrypting the information and using privacy enhancing technologies (PETs). These could include pseudonymisation techniques to reduce the risks of the processing, meaning the organisation could use the data for purposes such as statistical analysis.
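For illustration only, the sketch below (in Python, with a hypothetical secret key and made-up field names) shows one common pseudonymisation technique: replacing the direct identifier in each survey response with a keyed hash, so the responses can be analysed without employee IDs while the key is held separately and securely.

```python
# Illustrative sketch only: pseudonymising an employee ID with a keyed hash (HMAC-SHA256).
# The key, field names and data are hypothetical; the key should be stored separately
# from the survey data and kept secure.
import hmac
import hashlib

SECRET_KEY = b"example-key-held-separately-from-the-data"

def pseudonymise(employee_id: str) -> str:
    """Return a stable pseudonym for an employee ID."""
    return hmac.new(SECRET_KEY, employee_id.encode(), hashlib.sha256).hexdigest()

survey_response = {"employee_id": "E1042", "ethnicity": "Prefer not to say", "disability": "No"}

# Drop the direct identifier and keep only the pseudonym alongside the survey answers.
pseudonymised = {k: v for k, v in survey_response.items() if k != "employee_id"}
pseudonymised["pseudonym"] = pseudonymise(survey_response["employee_id"])
print(pseudonymised)
```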
Next steps
Lawful basis
- Review the ICO’s consent guidance and make sure consent and explicit consent are freely given with no pressure or consequence; or
- Review the ICO’s guidance on legitimate interest and document the reasoning, possibly through a legitimate interest assessment (LIA).
- Review our guidance on lawful basis for processing.
Privacy protections
- Review the ICO’s draft anonymisation, pseudonymisation and privacy enhancing technologies guidance.
Is this direct marketing?
Q: Would telling customers about our online fraud prevention tool be direct marketing, and therefore fall under the scope of the Privacy and Electronic Communications Regulations (PECR)?
Context:
The organisation has created a browser extension that verifies their clients’ websites as being legitimate, to protect customers against fraud. Their clients are not sure how they can let their customers know about the tool, as it may constitute direct marketing under the Privacy and Electronic Communications Regulations. The organisation also wants to know about any other compliance requirements under PECR.
Answer:
Direct marketing
Whether a communication falls under the definition of direct marketing will depend on the method by which the message is sent to a customer, and the tone, content and context of the message itself.
Direct marketing must be “directed to” particular individuals or categories of people. As such, adverts or messaging shown indiscriminately to all users of a website would not constitute direct marketing, and the marketing rules would not apply. Organisations can therefore advertise products in this way without needing to consider the marketing rules under PECR.
If the communication will be directed towards particular individuals or categories of people, eg via email, telephone or text, then the content of the message itself will determine whether PECR applies. If the message is neutral in tone, and presents a range of options that customers can take to protect themselves online, then it is unlikely to constitute direct marketing, provided there are no other elements of the communication advertising other products and services. If, however, the communication focuses largely on a particular product, and encourages customers to buy or use it, it is likely to be direct marketing.
If the communication will be direct marketing, you will need to be aware of the marketing rules that apply to the type of marketing you wish to carry out, and ensure you have appropriate consent.
PECR Regulation 6 requirements
Aside from regulating direct marketing practices, the PECR also regulates the use of cookies and similar technologies that either store or gain access to information on a user’s device.
If your product will be using these technologies, you are required to tell your users what cookies or technologies are present, explain what they are doing and the purposes for these, before storing or accessing any information on their device.
You will also need to gain consent from your users to access or store information on their device if this is not strictly necessary to provide the function of your extension. Strictly necessary means that it must be essential, and limited to what is essential, to provide the service that the product offers; it does not cover any other purposes for which you may wish to use the data. This consent must be gained before the storage of or access to any information on their device, by means of a clear affirmative action such as an opt-in.
The PECR and the UK GDPR work alongside each other, so it is important to be aware of any data protection obligations you may also have. If users of your online service can be singled out using information such as their IP addresses, cookie identifiers or MAC addresses, either on their own or in combination with other information, then your processing must also comply with UK GDPR. This is the case even if you cannot link the user to a named, real-world individual.
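For illustration only, the sketch below (a hypothetical Python/Flask handler with made-up cookie names) shows one way a service could set a strictly necessary cookie without consent while only setting a non-essential analytics cookie once the user has opted in.

```python
# Illustrative sketch only: cookie names, values and the Flask app are hypothetical.
from flask import Flask, request, make_response

app = Flask(__name__)

@app.route("/")
def index():
    resp = make_response("Browser extension landing page")
    # Strictly necessary for the service to work, so it can be set without consent.
    resp.set_cookie("session_id", "abc123", secure=True, httponly=True)
    # Non-essential: only set after the user has opted in via a clear affirmative action.
    if request.cookies.get("cookie_consent") == "granted":
        resp.set_cookie("analytics_id", "xyz789", secure=True)
    return resp
```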
Next steps
Privacy and Electronic Communication Regulations
- Review the ICO’s Direct marketing guidance and make sure appropriate consent is sought for marketing activities.
- Review the ICO’s guidance on Cookies and similar technologies.
Personal data
- Review the ICO’s guidance on What are identifiers and related factors?
Lawful basis in generative AI
Q. Would “legitimate interests” be a suitable lawful basis when using generative AI systems to help draft responses to clients and prospective clients?
Context: The organisation wants to use a generative AI tool to draft responses to emails, which will then be reviewed by a member of staff. In some cases these emails may contain special category data.
Answer: To rely on legitimate interests, organisations must demonstrate that the use of generative AI tools is necessary for the purposes of the legitimate interests they have identified, except where the legitimate interest is overridden by individuals’ interests, rights or freedoms. This requires the organisation to demonstrate that the use of generative AI tools is proportionate, and would not infringe on the rights of their clients or prospective clients.
Consent may also be an appropriate lawful basis where organisations have a direct relationship with the individuals whose information they want to process. When relying on consent, an organisation must ensure that it is freely given, specific, informed, and involves an unambiguous opt-in. They would also need to make it easy for individuals to withdraw consent at any time.
If the processing involves special category data, the organisation must ensure it has a second condition for processing in place, as required by Article 9 of the UK General Data Protection Regulation (UK GDPR). Based on the use of the generative AI tool in this case, a suitable Article 9 condition could be explicit consent.
As well as the conditions required for consent outlined above, explicit consent also requires specific confirmation through a clear statement that is separate from any other consents. We would recommend that an organisation looking to rely on consent and explicit consent does not include the explicit consent request within their contracts. Consent should also not be a requirement of using the service.
Next steps:
- Review the ICO’s blog on generative AI, and the guidance on AI and data protection and explaining decisions made with AI.
- Read our guidance on legitimate interests, consent, and special category data.
Restricted transfers
Q. What are the rules on international (restricted) transfers when using third-party suppliers based in the USA?
Context: The organisation provides a suite of revision services to students, and is looking to work with two companies (Company A and Company B) based in the USA to deliver these services. Company A provides access codes to the organisation, which passes them on to the students. The students then use these codes to set up a profile with Company A to access the revision services. The organisation will share personal information about its clients directly with Company B.
Answer: Even though personal information is not provided directly to Company A by the UK organisation, this would still be a restricted transfer. This is because the UK organisation’s customers enter into a contract with the organisation only, and it remains the controller for the customers’ information. It is the organisation’s choice to use Company A’s services, not the customers’.
As the information is transferred directly from the organisation to Company B, this would also be a restricted transfer.
As the USA is not currently covered by an adequacy agreement, appropriate safeguards are required when completing these transfers. Organisations must consider the safeguards and exceptions available under data protection legislation before agreeing to send personal information to countries not covered by an adequacy agreement. One way of ensuring these safeguards are in place is to use standard data protection clauses. These are clauses included in contracts between two organisations that impose obligations on both organisations to ensure personal information is protected. These can be imposed through the use of international data transfer agreements (IDTAs).
The first step in using standard data protection clauses is to complete a transfer risk assessment. This should help determine whether the personal information transferred will continue to be protected in line with UK data protection rules. It should be noted that transfers should only be made where necessary. If there is a way of achieving the same outcome without transferring personal information (eg by using anonymised information), this should be done instead.
Next steps:
- Review the guidance on international transfers, and international data transfer agreements
- Complete a transfer risk assessment
Q. Does processing personal data of overseas employees of third party UK organisations count as a restricted transfer?
Context: The organisation processes payment data from a number of organisations and is looking to update their platform. For most processing activities the organisation acts as a processor. However, they act as a controller where data is collected about employees of organisations using their system. Although the new system will be for UK organisations, some employees may be based outside the UK.
Answer: Where an organisation receives personal information from a third party, transfer rules under the UK General Data Protection Regulation (UK GDPR) (Article 44) do not apply. The rules contained in the UK GDPR regarding the transfer of personal data apply only when personal information is transferred by a controller or processor to a separate organisation located outside of the UK.
In this case, a UK organisation will share employee details with the organisation that submitted the question so that those employees can access that organisation’s services. If the UK organisation’s offices located in third countries are part of the same legal entity, and the UK GDPR is in scope because the processing is an activity of a UK establishment, this is unlikely to be a restricted transfer.
Next steps:
- Review the ICO’s guidance on international transfers
Special category data
Q. Could payment data be seen as special category data where it relates to payments to or from certain categories of organisation (eg political party, health organisation or trade union)?
Context: The organisation processes payment data from a number of financial organisations and is looking to update their platform.
Answer: Where payments are made to or from organisations such as health organisations or trade unions, it may be possible to infer details about a person that relate to special category data. However, this would only count as special category data if the inference can be drawn with a reasonable degree of certainty, and where it is deliberately made. We would also consider this special category data if the inferences are used to treat an individual differently.
In the case of payment data as described to us here, there is not enough information collected to say with certainty what the payments are for, and the organisation is not deliberately seeking to make inferences about individuals, nor does it influence their activities in any way. Therefore, this is not special category data, and there is no need to identify an Article 9 condition to use payment data.
Next steps:
- Review our guidance on special category data.
Cloud storage
Q. Are there any measures you should consider when storing data in the cloud?
Context: The organisation processes payment data from a number of organisations and is looking to update their platform.
Answer: The ICO has published cloud computing guidance which outlines what an organisation should consider when using a cloud service. The ICO has also published updated guidance on security and encryption requirements under the UK General Data Protection Regulation. This will help organisations understand their responsibilities when storing information in the cloud, as well as provide guidance on security and effective encryption requirements.
There are also requirements for the use of cloud systems under the Network and Information Systems (NIS) Regulations. These regulations are designed to address the threats posed to network and information systems and, in doing so, ensure that the digital economy can function efficiently.
Any organisation that offers something-as-a-service should review the NIS Regulations to see whether they need to comply with them.
Next steps:
- Read the cloud computing guidance
- Review the ICO’s security guidance
- Identify whether you need to comply with the NIS Regulations
Data sharing
Q. Can personal data be shared with a company for the purpose of improving the performance of an artificial intelligence model?
Context: The organisation that submitted this request is looking to use an artificial intelligence (AI) powered Microsoft Word Add-In that can help with a variety of tasks. They want to know whether they would be able to share information with the company to improve the model, as opposed to the model learning as the software is used. The information that may be shared could include both special category data and criminal offence data, so is particularly sensitive.
Answer: By data sharing we mean the disclosure of data from one or more organisations to a third party organisation or organisations.
Data protection law facilitates data sharing when it is fair and proportionate.
- The accountability principle means you are responsible for compliance, and must be able to demonstrate compliance.
- Personal data must be shared fairly and transparently.
- You must identify at least one lawful basis for sharing information before you start any sharing.
Before sharing personal data, an organisation needs to ensure that the information it shares will only be used for a specified purpose, that it will be stored securely, and that it will not be kept for any longer than necessary.
If an organisation is able to anonymise the information, or remove identifiable information from the documents shared, then they should do so. This would minimise the personal data shared and therefore reduce any risks.
If an organisation is unable to effectively anonymise the information, or needs to include personal data in the information they share, the ICO’s data sharing code of practice recommends the use of ‘data sharing agreements’. These set out the purpose of the data sharing, cover what happens to the data at each stage, set standards and help all the parties involved in sharing to be clear about their roles and responsibilities.
Next steps:
- Read our draft guidance on anonymisation, pseudonymisation and privacy enhancing technologies
- Review our data sharing code of practice
Effective anonymisation
Q. How can we ensure that free text information is effectively anonymised so that it can be shared with researchers?
Context: The organisation is planning to share anonymised free text information with researchers. They plan to use technical measures to ensure at least 95% of labelled identifiers are removed, with manual checks and other measures in place to ensure that the remaining information is adequately anonymised. However, they would like advice on any additional measures that they can put in place.
Answer: When looking to anonymise personal data, organisations need to consider the likelihood of reidentification based on factors such as what the information is being used for, the cost and time required to identify someone, the available technologies, and the state of technological development over time. Therefore, in order to effectively anonymise information, organisations need to remove anything that could be used to identify a person. This could be their name, address, date of birth, or an identification number, for example.
Organisations should also consider whether they need to remove information about people’s appearance, mental capacity, or social identity. This information could be used to reidentify individuals, particularly where outliers may exist. Depending on how gender and ethnicity are defined in records, this could be used to single out individuals.
One way of reducing the risk of reidentification could be to group some information, such as age, such that the exact information is not provided within the records. We recommend that organisations review how information that could be used to identify individuals could be aggregated or altered to mask identification.
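For illustration only, the sketch below (in Python, with made-up field names and a hypothetical ten-year band width) shows the kind of aggregation described above: replacing an exact age with an age band so that individual records are harder to single out.

```python
# Illustrative sketch only: field names, data and the band width are hypothetical.
def age_band(age: int, width: int = 10) -> str:
    """Replace an exact age with a band such as '40-49'."""
    lower = (age // width) * width
    return f"{lower}-{lower + width - 1}"

records = [
    {"age": 47, "note": "long-standing asthma"},
    {"age": 83, "note": "long-standing asthma"},
]

# Share only the banded age, not the exact value.
banded = [{"age_band": age_band(r["age"]), "note": r["note"]} for r in records]
print(banded)  # the first record becomes {'age_band': '40-49', 'note': 'long-standing asthma'}
```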
As well as anonymising information, we recommend the use of “motivated intruder” tests, with trained staff members attempting to identify individuals from the anonymised records. Organisations should ensure that when these tests are carried out, their testers have access to all the resources an attacker would be able to access. This could include internal databases.
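For illustration only, the sketch below (in Python, with entirely fictional data and quasi-identifiers) shows the kind of check a tester might run as part of a motivated intruder test: looking for anonymised records that match exactly one person in another dataset the tester can access, such as an internal database.

```python
# Illustrative sketch only: all data, field names and quasi-identifiers are fictional.
from collections import Counter

anonymised_records = [
    {"age_band": "40-49", "gender": "F", "postcode_area": "LS1"},
    {"age_band": "40-49", "gender": "M", "postcode_area": "LS1"},
]
external_dataset = [
    {"name": "A. Example", "age_band": "40-49", "gender": "M", "postcode_area": "LS1"},
]

quasi_identifiers = ("age_band", "gender", "postcode_area")
counts = Counter(tuple(r[q] for q in quasi_identifiers) for r in anonymised_records)

# A unique match on the quasi-identifiers suggests a re-identification risk.
for person in external_dataset:
    key = tuple(person[q] for q in quasi_identifiers)
    if counts.get(key) == 1:
        print(f"Potential re-identification: {person['name']} matches a unique anonymised record")
```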
We recommend that organisations conduct regular checks on the effectiveness of their anonymisation process. New techniques for reidentifying individuals within datasets may be developed at any time, and new information is regularly made available. These checks will help keep processes up to date and effective.
Next steps:
- Read our draft anonymisation guidance, which is currently written in five parts.