UK modernizes data protection with new automated decision framework

UK introduces streamlined rules for AI systems and personal data processing under 2025 legislation.

The United Kingdom has introduced comprehensive changes to its data protection framework through the Data (Use and Access) Act 2025 (DUAA), which modernizes how organizations handle personal data and implement automated decision-making systems. The Act will come into force in stages; details of the regulations and the exact date each measure takes effect will be available on GOV.UK.

The legislation represents the most significant update to UK data protection laws since the implementation of the UK General Data Protection Regulation, introducing new provisions for scientific research, automated decision-making, and international data transfers. The DUAA maintains existing protections while enabling organizations to operate more efficiently in an increasingly digital economy.

Summary

Who: The UK government has introduced the Data (Use and Access) Act 2025, affecting all organizations processing personal data in the UK, including businesses, researchers, and law enforcement agencies.

What: Comprehensive updates to UK data protection law covering automated decision-making, scientific research, children's data protection, international transfers, and subject access requests.

When: The Act will be implemented in stages with specific commencement dates to be announced on GOV.UK.

Where: The legislation applies across the United Kingdom and affects data processing activities conducted by UK-based organizations and those targeting UK data subjects.

Why: The Act aims to modernize data protection law to enable technological innovation while maintaining individual privacy protections, addressing gaps in existing legislation that have created uncertainty for organizations and hindered beneficial uses of automated systems.

Automated decision-making receives major overhaul

One of the most substantial changes addresses automated decision-making systems used by businesses and government agencies. This measure facilitates the responsible use of automation to help grow the economy and enable a modern digital government. With stringent safeguards in place, the Act creates a more permissive framework for decisions based solely on automated processing that have legal or similarly significant effects on individuals.

The new framework requires organizations to provide data subjects with information about significant decisions made about them, to enable individuals to make representations about and challenge such decisions, and to ensure human intervention is available in the taking of those decisions.

The previous rules on solely automated decision-making were framed as a general prohibition on decision-making of this nature, except where certain limited conditions applied. These rules were complex to navigate, leaving organizations unclear about when they could engage in such activity. This hindered the use of automated decision-making that can enhance productivity and make people's lives easier.

For law enforcement agencies, the Act introduces specialized provisions allowing exemptions from these safeguards in specific circumstances, including an active human review exemption. This change means that individuals can continue to have confidence that the correct decisions are being made about them, while avoiding the risk of undermining an investigation by tipping off a suspect that they are of interest.

Scientific research definitions clarified

The legislation addresses longstanding uncertainties around scientific research by clarifying definitions and expanding permissible activities. This measure makes it clearer when personal data can be used for scientific research and statistical purposes. Among other things, it clarifies that the definition of research includes commercial scientific research – for instance, a pharmaceutical company conducting vaccine research.

Researchers can now rely on broad consent for studies where precise purposes may not be fully defined at the outset, subject to certain conditions such as consistency with relevant ethical standards. If a researcher is unclear about the precise purpose of a study at its start, they can ask for consent for an area of scientific research (e.g. the study of certain diseases).

The measure also brings together the conditions that must be met for processing under the research provisions. These safeguards include respect for the principle of data minimization, as well as preventing processing that leads to decisions being made about, or substantial harm caused to, data subjects.

Children's data protection strengthened

New obligations specifically target online services likely to be accessed by children. This measure introduces a new duty for information society services that are likely to be accessed by children, building on existing obligations under Article 25 of the UK GDPR. It requires these services to take account of the "children's higher protection matters" specified in the new Article 25(1B) of the UK GDPR when designing processing activities carried out in the course of providing services to children.

These requirements address how services can better protect and support children, recognizing that young users may be less aware of data processing risks and have different needs at various developmental stages.

New lawful basis for legitimate interests processing

The Act creates a new legal ground for processing personal data that should reduce compliance burdens for certain activities. This measure creates a new lawful ground for processing personal data under Article 6 of the UK GDPR. It is designed to give non-public bodies greater confidence about processing personal data for a limited number of "recognised legitimate interests". These include processing that is necessary for crime prevention, safeguarding vulnerable people, responding to emergencies, safeguarding national security or assisting other bodies to deliver public interest tasks that are sanctioned by law.

While the requirement for the processing to be necessary remains, the need for a detailed legitimate interests assessment balancing the data controller's interest against the individual's interest has been removed. This is in recognition of the societal value of the processing in specified situations and the potential negative impacts of any delay.

Subject access requests receive procedural updates

Organizations will benefit from clearer rules around responding to data subject requests. These measures clarify the rules on subject access requests for organizations and individuals, making provision on time limits for responding to data subject requests and codifying existing case law on reasonable and proportionate searches.

The Act introduces a "stop the clock" provision that allows organizations to pause the response deadline – without the risk of missing it – when they need data subjects to clarify or refine their requests or to provide more information. Once the organization has the information it needs, the response time resumes.

International data transfer rules simplified

The legislation updates requirements for transferring personal data outside the UK, introducing new standards for adequacy decisions. The measures introduce a new data protection test to be applied by the Secretary of State when deciding whether to approve data transfers to a third country or international organisation. The test is whether the third country or international organisation has a standard of data protection which is "not materially lower" than the standard in the UK.

These measures also introduce a data protection test for data exporters when using alternative transfer mechanisms such as standard contractual clauses or other appropriate safeguards. The data protection test is met if, after an international transfer, the level of protection for a data subject will be "not materially lower" than under UK law.

Enforcement and compliance implications

For law enforcement agencies, the Act aligns national security exemptions across different data protection regimes and enables joint working arrangements with intelligence services. This measure amends the previous national security restrictions in the law enforcement regime to mirror those available under the UK GDPR and intelligence service regimes.

This provision will enable a qualifying competent authority, such as Counter Terrorism Policing, to form a joint controllership with the intelligence services for specific processing. The Secretary of State will designate such processing through a 'designation notice' only where they are satisfied it is required for the purpose of safeguarding national security.

The legislation also removes certain administrative requirements that have proven ineffective. The reform removes only the requirement for those processing under the law enforcement regime to record a justification; it retains the requirements to record the time, the date and, so far as possible, the identity of the person who accessed or disclosed the data.

Industry context and implications

The new legislation arrives as UK organizations increasingly adopt artificial intelligence and automated systems for business operations. Recent developments across Europe have highlighted the tension between data protection requirements and technological innovation, with German courts recently approving Meta's AI training using public data and Dutch authorities establishing comprehensive AI guidelines.

The marketing industry has been particularly affected by evolving data protection requirements, with recent enforcement actions including privacy advocacy groups pursuing court challenges against data protection authorities over inconsistent enforcement and significant fines for transparency failures.

The DUAA's approach to automated decision-making reflects broader regulatory recognition that prescriptive restrictions may hinder beneficial technological applications while still requiring appropriate safeguards for individual rights.

Timeline