Lloyd v Google and the Collapse of UK Data Class Actions

The Case for Rethinking Collective Redress in the UK

The United Kingdom has some of the strongest data protection laws in the world, but its courts have repeatedly blocked attempts to enforce those rights through data class actions. This paradox—rights on paper without practical remedies—has created an enforcement vacuum that leaves individuals without meaningful redress.

Procedural Barriers in UK Courts

The problem lies not with the UK GDPR, which permits claims for both financial loss and emotional harm, but with the structure of English civil procedure. Group Litigation Orders (GLOs) under CPR 19.21 allow claims with “common or related issues,” but they are strictly opt-in. For low-value claims—often a few hundred pounds per person—the cost of recruiting thousands of claimants outweighs any potential recovery.

Representative actions offer an opt-out model that sidesteps this problem. But UK courts have insisted that all class members share the “same interest.” In theory, a mass data breach compensation claim should qualify. In practice, the Supreme Court has taken a far narrower view.

The Turning Point: Lloyd v Google

In Lloyd v Google [2021] UKSC 50, Richard Lloyd sought a uniform award for some four million iPhone users, arguing that Google’s secret tracking of their Safari browsers amounted to a “loss of control” of personal data. The Supreme Court rejected this argument. Under the Data Protection Act 1998, it ruled, claimants must show actual damage—financial or emotional—not simply unlawful processing. It further held that harm varied across users, depending on tracking duration, data sensitivity, and how the data was used. A single judgment could not bind them all.

The ruling shut down what had been heralded as the UK’s first major collective redress in privacy law. Dozens of pending cases collapsed, and litigation funders withdrew from the market.

The Court of Appeal reinforced this strict approach in Prismall v Google (2024), finding that even claims for misuse of health data require individualized proof of a reasonable expectation of privacy. Where US courts often treat a privacy violation as affecting an entire class, English courts insist that each individual prove harm separately.

Business Implications and Reform Risks

This judicial stance has made the UK an outlier. In the United States, privacy class actions are one of the fastest-growing areas of litigation. In the European Union, the Representative Actions Directive (2020/1828) empowers consumer associations to bring cross-border cases. In parallel, the EU’s newly adopted AI Act signals a tightening of digital accountability frameworks, further widening the regulatory gap with the UK. The UK, by contrast, maintains GDPR-level rights without a viable enforcement mechanism.

For businesses, this offers short-term relief. Apart from fines from the Information Commissioner’s Office, such as the £20 million penalty against British Airways in 2020, exposure to multibillion-pound damages claims is remote. But this security is deceptive. In the absence of private enforcement, the ICO may pursue tougher regulatory action. Courts, meanwhile, could shift course suddenly, or Parliament could legislate to relax the “same interest” test. Emerging AI liability frameworks in Europe show how quickly legal theories evolve. For corporate boards, the result is an unquantifiable tail risk—difficult to insure, model, or hedge—even as cyber insurance markets mature.

Reform Prospects and Adequacy Concerns

The government’s Data (Use and Access) Bill, still before Parliament, focuses on economic growth and post-Brexit flexibility, not collective litigation. But divergence from EU norms carries risk. The European Commission’s adequacy decision—which permits free UK-EU data transfers—comes up for renewal in 2025. If adequacy lapses, the consequences would hit financial services and cloud providers alike.

The result is a system in which rights exist but remedies do not. Tech platforms, banks, and insurers may welcome the reprieve, but policymakers should ask whether insulating corporations from accountability serves the public interest. Compared with the US and the EU, the UK is increasingly isolated.

Until the “same interest” test is revisited—or a statutory collective redress mechanism is enacted—UK data protection class actions will remain a mirage: visible in theory, unattainable in practice.
