Information Commissioner’s updated response to the Data (Use and Access) (DUA) Bill – House of Commons

The Data (Use and Access) (DUA) Bill was introduced to Parliament on 24 October 2024. The Bill has now completed its passage through the House of Lords, where it has been subject to a number of amendments and significant debate.

I welcomed the Bill as a positive package of reform that maintains high standards of data protection and protects people’s rights and freedoms. It also provides greater regulatory certainty for organisations and promotes growth and innovation in the UK economy.

Responsibility for developing public policy leading to changes in the legislative framework sits with the government and Parliament. My office is independent from government. Our role is to carry out the tasks and duties set out in the current legislative framework for data protection and any future iterations. We also provide independent, expert advice on the implications of any proposals to alter data protection and information rights law, based on our experience of regulating the existing regime.

My office has continued to work with government in the development of the Bill, in line with the requirements of article 36(4) of the UK GDPR. This response provides my comments on the amendments that have been made in the House of Lords and on some key areas of the debate.

Overall, the Bill remains one which I support as improving the effectiveness of the data protection regime in the UK, upholding people’s rights, providing regulatory certainty and clarity for organisations and improving the way the ICO regulates.

I will continue to engage with government and parliamentarians to provide my independent advice on further changes during the remaining phases of the parliamentary process.

Data protection reform

Definition of scientific research

I have previously welcomed the changes the Bill makes to the research, archiving and statistical purposes provisions. The changes make the provisions easier to navigate and understand and simplify the requirements when organisations rely on them. In my view this will provide more certainty to organisations and empower them to use personal information responsibly. In turn, this will generate social and economic benefits, while still ensuring that people are protected.

During passage in the House of Lords, the Bill was amended so that references to the processing of personal data for the purposes of scientific research are limited to processing ‘for the purposes of any research that can reasonably be described as scientific and that is conducted in the public interest, whether publicly or privately funded and whether carried out as a commercial or non-commercial activity.’ (emphasis added).

I note, and will reflect in guidance, that although the debate focussed on concerns about using research provisions for AI development, Parliament chose to limit the provision by adding a public interest test rather than by imposing a blanket ban. As suggested during the debate, my office can provide guidance on what is meant by ‘the public interest’ in the context of scientific research.

Duties to protect children

During the House of Lords stage, the Bill was amended to include further duties in respect of children’s data. This included an additional duty on my office to have regard to the fact that ‘children merit specific protection with regard to their personal data’. I support the intention to ensure that children are properly protected and welcome the additional clarity this provides.

Government also introduced an amendment to the article 25 (data protection by design and default) requirements. When processing personal data in the course of providing information society services likely to be accessed by children, the controller must take account of ‘higher protection matters’ when assessing what are appropriate technical and organisational measures.

Children’s ‘higher protection matters’ are defined as:

“ (a) how children can best be protected and supported when using the services, and

(b) the fact that children—

(i) merit specific protection with regard to their personal data because they may be less aware of the risks and consequences associated with processing of personal data and of their rights in relation to such processing, and

(ii) have different needs at different ages and at different stages of development.”

The requirement set out in (b)(i) reflects the wording already present in recital 38 of the UK GDPR, and including it here should provide additional clarity.

Whilst I remain committed to ensuring that children are appropriately protected, it is important to clarify that although an organisation might need to take different steps when handling children’s data than when handling adults’, the underlying data protection principles themselves remain the same.

I do not want this amendment – or the fact that the Age Appropriate Design Code (AADC) provides specific guidance on my office’s expectations of organisations processing children’s personal data – to suggest that the AADC imposes a higher legal standard than applies when processing adults’ data.

I believe it would be helpful for government to provide further clarity about the policy intent behind the reference to ‘higher protection matters’ and particularly the phrase ‘how children can best be protected and supported when using the services’. This is distinct from the separate requirement to consider that children’s data merits specific protection. Our understanding is that the law expects organisations to consider the factors set out in the AADC around the best interests of the child. However, it would be helpful to have this confirmed.

The new obligation to consider ‘children’s higher protection matters’ is specific to organisations falling within the scope of the AADC. However, new paragraph (4) states that this does not imply anything about what matters may be relevant to assessments under article 25(1) by other organisations. Government has indicated that paragraph (4) is intended to clarify that other organisations may also need to consider the higher protection matters, but that those organisations will determine this on a case-by-case basis, depending on the context. I would welcome greater clarity about when the government considers that this would be necessary.

Finally, I note the new duties apply to the data protection by design requirements in article 25(1) and not the data protection by default requirements in article 25(2). In my view, “by default” flows out of “by design”. Article 25(2) is particularly relevant to the steps organisations must take to comply with the data minimisation principle. “By default” may require organisations to make different choices when designing products and services likely to be accessed by children, as demonstrated by the AADC’s requirements. I would likewise welcome additional clarity from government as to the decision to limit the new requirement for information society services to article 25(1) requirements.

Direct marketing – soft opt-in

The government made an amendment to the Bill about the rules on direct marketing by email under PECR. This will extend the direct marketing ‘soft opt-in’ to the charity sector. The ‘soft opt-in’ is currently only available to commercial organisations. It allows them to send email direct marketing to people they have an existing relationship with through the sale of goods and services, as long as they give the customer the option to opt out.

The amendment will allow charities to take advantage of the same rules where people either support their cause, eg via donations, or express an interest in their charitable purposes. I support this extension as it will help charities better communicate with people who support their purposes. However, we would expect charities to consider implementation carefully, including their UK GDPR obligations. Where organisations are relying on legitimate interests for their processing, they will need to carefully assess their interests and balance them against the impact on individual rights and freedoms. In some cases, it may not be appropriate to rely on the soft opt-in, for example where someone accesses an organisation’s crisis service and subsequently sending them direct marketing mail could result in harm.

Codes of practice

The government has committed to using their secondary legislation powers to require my office to produce two new codes of practice: one on solely automated decision-making and artificial intelligence, and one on ed-tech. I welcome this commitment and the opportunity for the exact scope of these codes to be decided following robust consultation with my office and an appropriate range of wider stakeholders, with the benefit of a strong evidence base. I am confident that this is the best route to ensure that the codes have the most effective focus, provide regulatory certainty and are both impactful and deliverable.

Automated decision making (ADM)

Although no amendments to the ADM clauses were agreed whilst the Bill was in the House of Lords, this is an area of significant debate. In particular, parliamentarians expressed concerns about the reframing of the ADM provisions so that they no longer provide a general restriction, with exceptions, on automated decision making with a legal or similarly significant effect. Instead, the amendments to the Bill will allow such processing with no limitation on which lawful basis an organisation can use, subject to putting specific safeguards in place. I still think that the approach in the Bill strikes a good balance between facilitating the benefits of automation and maintaining additional protection for special category data, particularly in view of the importance given to the role of “meaningful human involvement” in the provisions. I also note that the broader requirements within data protection law, including the principles of fairness and transparency, will continue to apply. Where organisations rely on the legitimate interests lawful basis for processing, they will still need to demonstrate that their interests are not outweighed by the impact on the rights and freedoms of individuals whose data is being processed.

I note that many stakeholders believe that, given the potential risks of solely automated decision making, the general prohibition is an important safeguard to keep. Ultimately, this is a decision for Parliament, which will consider the wide range of views before reaching it. I will continue to monitor the debate with interest and ensure that my office provides timely and practical support for organisations to understand and apply the new legislation, including these provisions, once it is agreed.

Other provisions

Web crawlers

The Bill places new responsibilities on my office to regulate the transparency of web crawler use. The aim is to increase the ability of creatives to assert and enforce their copyright according to existing law. As this is a new, non-government amendment, there has been no impact assessment for this measure. My office has not been consulted about its implications, including issues such as resourcing and how it sits with existing regulatory frameworks and other regulators’ remits. I look forward to discussing this with the government so that I can properly assess and account for the implications of this new area of responsibility.

Deepfakes

Parliament has agreed new offences in respect of the creation of sexually explicit, digitally produced images of a person without their consent. I support the intent behind the creation of a criminal offence and agree that this is the most effective way to address this issue. I note the statements regarding the incompatibility of the new offence with the European Convention on Human Rights. I would welcome assurance from government that they have considered and assessed any implications for the European Commission’s review of the UK’s adequacy status.

Government technical amendments

I am content that the technical amendments introduced by government whilst the Bill was in the House of Lords will ensure the effective operation of the legislation.