
The CJEU SRB Ruling: Why this decision confirms confidential computing as the gold standard for GDPR-compliant data collaboration

Written by
Matthias Eigenmann
Published on
September 29, 2025

A landmark decision that creates new legal certainty for processing of pseudonymized data


The Court of Justice of the European Union's recent ruling in EDPS v SRB (Case C-413/23 P) has been highly anticipated in the data protection community, and for good reason. By clarifying when pseudonymized data falls outside the definition of personal data under the GDPR, the Court has resolved long-standing uncertainty and fundamentally altered the risk-benefit calculation for organizations engaged in data collaboration.

Leading privacy experts are calling it a "game changer." Sebastian Kraska from IITR Datenschutz sees it as a boon for privacy-enhancing technologies in the EU. Clifford Chance's analysis confirms it creates "a direct incentive to adopt robust PETs." 

But what does this mean for organizations navigating the complex waters of compliant data collaborations? The answer lies in understanding not just what the Court said, but what technology can legally enable.

The ruling decoded: Context is everything 

At the heart of the SRB case was a deceptively simple question: When does pseudonymized data cease to be personal data?

The CJEU's answer was striking in its pragmatism. The Court reaffirmed the relative approach established in its previous case law and clearly stated that pseudonymized data is only personal data for a specific recipient if that recipient has the "means reasonably likely to be used" to re-identify individuals. In other words, the same dataset can be personal data for one party but not for another, depending on their technical capabilities and available information.

The three pillars of the Court's reasoning

  1. Relative approach confirmed: The Court reaffirmed that the classification of data as "personal" is relative to each recipient's circumstances, not absolute.
  2. Practical assessment required: The test isn't theoretical possibility but practical likelihood — what means would reasonably be deployed by the recipient to achieve re-identification?
  3. Technical measures matter: Robust technical and organizational measures that prevent re-identification can take data outside GDPR's scope for specific recipients.

As Hunton Privacy Blog's analysis notes, "The judgment is referenced as a step toward enabling responsible innovation...thanks to clearer guidance on how PETs affect the risk of identifying individuals."

Why traditional data clean rooms suddenly look outdated 

This ruling exposes a critical vulnerability in conventional data clean room architectures. Most traditional clean rooms rely on: 

  • Access controls: Who can see what data 
  • Query restrictions: What analyses can be run 
  • Contractual agreements: Legal promises not to re-identify

But here's the problem: If the technical architecture allows for the possibility of re-identification — even if contractually prohibited — you're still processing personal data under the Court's interpretation.

Consider the typical scenario: A brand shares pseudonymized customer data with a publisher through a traditional clean room. The publisher might have:

  • Their own customer database with overlapping individuals 
  • Technical access to query results that could enable pattern matching 
  • The ability to combine outputs with external data sources

Under the SRB ruling, this data likely remains personal data for the publisher, triggering full GDPR obligations, including potential joint controllership under Article 26.
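To make the linkage risk concrete, here is a minimal sketch of how a recipient holding its own customer database could re-identify "pseudonymized" records by matching on quasi-identifiers. All data, field names, and the matching keys are hypothetical, chosen only to illustrate the attack pattern:

```python
# Hypothetical example: a publisher links pseudonymized records it received
# to real identities in its own database via shared quasi-identifiers.

pseudonymized = [  # shared by the brand: pseudonym plus quasi-identifiers
    {"pseudonym": "a91f", "postcode": "8001", "birth_year": 1984},
    {"pseudonym": "77c2", "postcode": "8400", "birth_year": 1990},
]

publisher_db = [  # the publisher's own records, with real identities
    {"email": "anna@example.com", "postcode": "8001", "birth_year": 1984},
    {"email": "ben@example.com",  "postcode": "8400", "birth_year": 1990},
]

def link(pseudo_rows, own_rows):
    """Trivial linkage attack: match rows on (postcode, birth_year)."""
    index = {(r["postcode"], r["birth_year"]): r["email"] for r in own_rows}
    return {
        p["pseudonym"]: index.get((p["postcode"], p["birth_year"]))
        for p in pseudo_rows
    }

print(link(pseudonymized, publisher_db))
# {'a91f': 'anna@example.com', '77c2': 'ben@example.com'}
```

A few overlapping attributes are enough: no contract is violated, yet every pseudonym resolves to a named individual, which is exactly why the architecture, not the paperwork, decides the legal classification.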

Enter confidential computing: From legal risk to technical impossibility

This is where Decentriq's approach fundamentally diverges from traditional solutions. By leveraging confidential computing technology — specifically hardware-based secure enclaves — we transform the "means reasonably likely to be used" test from a legal assessment into a mathematical proof.

The technical architecture that changes the legal equation

Hardware-enforced isolation

Data processing occurs within secure enclaves (Intel SGX, AMD SEV) where: 

  • Data remains encrypted even during computation 
  • Memory is isolated from the operating system and hypervisor 
  • Access attempts trigger hardware-level protection mechanisms

Cryptographic attestation

Before any processing begins: 

  • The enclave proves its integrity through remote attestation 
  • All parties verify the exact code being executed 
  • Cryptographic signatures confirm no tampering is possible
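The attestation flow above can be sketched conceptually. This is a heavily simplified stand-in: real schemes (such as Intel SGX attestation) use asymmetric keys rooted in the CPU and a certificate chain, whereas this toy uses an HMAC with a placeholder key purely to illustrate the verification logic — the enclave reports a hash ("measurement") of the loaded code, and each party checks both the signature and that the measurement matches the code everyone agreed to run:

```python
import hashlib
import hmac

# Placeholder for the hardware-rooted key; in reality this is an asymmetric
# key burned into the CPU, never a shared secret available in software.
HARDWARE_KEY = b"stand-in-for-cpu-rooted-key"

def make_quote(enclave_code: bytes):
    """Simulate the enclave producing a signed measurement of its code."""
    measurement = hashlib.sha256(enclave_code).digest()
    signature = hmac.new(HARDWARE_KEY, measurement, hashlib.sha256).digest()
    return measurement, signature

def verify_quote(measurement: bytes, signature: bytes,
                 expected_code: bytes) -> bool:
    """A collaborating party checks two things: the signature is genuine,
    and the measured code is exactly the code all parties approved."""
    expected = hashlib.sha256(expected_code).digest()
    genuine = hmac.compare_digest(
        hmac.new(HARDWARE_KEY, measurement, hashlib.sha256).digest(),
        signature,
    )
    return genuine and hmac.compare_digest(measurement, expected)

code = b"approved analysis v1.0"
m, s = make_quote(code)
assert verify_quote(m, s, code)             # agreed code: attestation passes
assert not verify_quote(m, s, b"tampered")  # any change: attestation fails
```

The point the sketch makes is structural: approval is tied to a cryptographic hash of the exact code, so even a one-byte change to the analysis invalidates the attestation before any data is processed.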

Technical impossibility of access

Unlike access controls that can be overridden, confidential computing makes data access physically impossible: 

  • Not even system administrators can access enclave memory 
  • Cloud providers cannot peek into processing 
  • Decentriq itself has zero visibility into customer data

What this means under the SRB test 

When data is processed in Decentriq's confidential computing environment, collaborating parties definitively lack the "means reasonably likely to be used" for re-identification because: 

  1. They never receive the pseudonymized data — only aggregated, privacy-preserved outputs (unless they specifically and expressly authorize a transfer of personal data) 
  2. Technical architecture prevents access — not through policy but through hardware-enforced cryptographic boundaries 
  3. Audit trails prove compliance — cryptographic attestation provides irrefutable evidence of data protection
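The "aggregated outputs only" principle can be illustrated with a simple thresholding sketch: per-segment counts are released only when a segment exceeds a minimum size, so no output row can single out an individual. Field names and the threshold are illustrative, and real platforms layer further controls (such as noise addition) on top of this basic idea:

```python
from collections import defaultdict

MIN_GROUP_SIZE = 10  # illustrative threshold; real deployments tune this

def aggregate(rows, key, threshold=MIN_GROUP_SIZE):
    """Count rows per value of `key`, suppressing small groups so that
    no released figure can point to a specific individual."""
    counts = defaultdict(int)
    for row in rows:
        counts[row[key]] += 1
    return {k: v for k, v in counts.items() if v >= threshold}

rows = [{"segment": "A"}] * 25 + [{"segment": "B"}] * 3
print(aggregate(rows, "segment"))  # {'A': 25} — segment B is suppressed
```

Because recipients only ever see outputs of this shape, the "means reasonably likely to be used" question is answered at the level of what leaves the enclave, not what enters it.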

As Clifford Chance's analysis emphasizes, this "risk-based approach... encourages organizations to deploy PETs to lower identifiability risk in a legally defensible way." 

Real-world impact: From compliance burden to competitive advantage

Case study: Cross-publisher campaign measurement 

When a global advertiser needed to analyze campaign performance across multiple publishers, traditional approaches would have created a data protection compliance nightmare. Each publisher would have been processing the advertiser's customer data as personal data, requiring: 

  • Individual consent for each data transfer 
  • Complex joint controllership agreements between the advertiser and each publisher 
  • Transfer impact assessments for cross-border transfers 
  • Ongoing compliance obligations under the GDPR and other applicable data protection regulations

Using Decentriq's confidential computing approach, publishers never accessed the advertiser's raw customer data, and processing occurred in secure enclaves with cryptographic guarantees. Only aggregated campaign metrics were generated, and publishers demonstrably lacked any means to re-identify individuals, taking the advertiser's data outside the scope of data protection law from the publishers' perspective.

The broader implications: A new era of data collaboration 

The SRB ruling, combined with confidential computing technology, opens up possibilities that were previously clouded by prohibitive legal uncertainty: 

  1. Targeted advertising 
    Advertisers, publishers, and other players in the digital advertising industry can combine their data to gain insights, tailor audiences, and measure campaign performance without sharing personal data, avoiding the complex compliance issues inherent in traditional data collaborations. 
  2. Healthcare research acceleration 
    Hospitals and research institutions can pool patient data for breakthrough research while maintaining complete data sovereignty and GDPR compliance: the pseudonymized data never becomes personal data for research partners. 
  3. Competitor collaboration 
    Organizations can collaborate with direct competitors on market insights, fraud detection, or industry benchmarks without mutually exposing their data. Each party maintains full control of, and sole compliance responsibility for, its own data. 
  4. Financial crime prevention 
    Banks and other financial intermediaries can share transaction patterns for anti-money laundering purposes without exposing customer information, traditionally a hindrance to such initiatives. 
  5. Regulatory reporting transformation 
    Organizations can provide detailed analytics to regulators without transferring personal data, as the regulator only receives aggregated, privacy-preserved insights.

The compliance checklist: Leveraging the SRB ruling 

For DPOs and legal teams looking to leverage this ruling, here's your action plan:

  • Assess current data collaborations: Which partnerships currently trigger joint controllership obligations? 
  • Evaluate technical architecture: Do your current solutions genuinely prevent partners from having the "means reasonably likely" for re-identification? 
  • Document technical impossibility: Can you prove — not just assert — that partners cannot access personal data? 
  • Review legal bases: Could legitimate interests replace consent if technical measures prevent re-identification? 
  • Consider confidential computing: Does your current infrastructure provide cryptographic guarantees or just contractual promises?

Expert consensus: The future is privacy-preserving

The legal community's response to the SRB ruling has been remarkably unified. As IAPP notes, “the decision is a big win for PETs and provides the long-awaited clarity for companies and data protection authorities on how to analyze data and safeguard the interests of the data subjects at the same time”. This new clarity makes it more practical and attractive to use PETs to reduce compliance burdens while still enabling legitimate data use.

William Fry's analysis goes further, calling it a "game-changing decision for data anonymisation" that fundamentally alters how organizations should approach data collaboration.

Jones Day emphasizes the practical implications: Organizations now have "a direct incentive to adopt robust PETs, as they can significantly mitigate compliance risks — provided re-identification is realistically prevented for third parties."

The bottom line: Technology that meets the moment 

The CJEU's SRB ruling represents a fundamental shift in how European data protection law views pseudonymized data. For the first time, the Court has explicitly recognized that robust technical measures can take data outside GDPR's scope for specific recipients.

But here's the crucial point: Not all privacy-enhancing technologies are created equal. The "means reasonably likely to be used" test demands more than access controls and contractual agreements. It requires technical impossibility, achievable only through advanced approaches like confidential computing.

At Decentriq, we've built our entire platform around this principle. Our secure enclaves don't just limit access to data; they make unauthorized access cryptographically impossible. This is privacy engineering at its most rigorous.

As organizations navigate the post-SRB landscape, the choice is becoming clear: Embrace technologies that provide mathematical proofs of privacy, or remain tangled in the increasingly complex web of joint controllership obligations.

The Court has spoken. The technology exists. The only question is: Will you be among the leaders who seize this opportunity, or among those still wrestling with yesterday's compliance challenges?

Ready to see how confidential computing can transform your approach to GDPR compliance? Our legal and technical teams can walk you through exactly how the SRB ruling applies to your data collaboration needs. Request a consultation to explore how Decentriq's secure enclaves can unlock new possibilities while exceeding regulatory requirements. 

References

  1. Hunton Andrews Kurth LLP. (2025, September 10). EU Court of Justice clarifies definition of “personal data” in the context of pseudonymization. https://www.hunton.com/privacy-and-information-security-law/eu-court-of-justice-clarifies-definition-of-personal-data-in-the-context-of-pseudonymization
  2. Data Protection Report. (2025, February 10). CJEU Advocate General clarifies when pseudonymised data falls outside the definition of personal data. https://www.dataprotectionreport.com/2025/02/cjeu-advocate-general-clarifies-when-pseudonymised-data-falls-outside-the-definition-of-personal-data/
  3. Clifford Chance. (2025, September 10). Pseudonymized data after EDPS v SRB. Clifford Chance Talking Tech. https://www.cliffordchance.com/insights/resources/blogs/talking-tech/en/articles/2025/09/pseudonymized-data-after-edps-v-srb.html
  4. LaCasse, A. (2025, September 4). CJEU clarifies personal data definition in context of pseudonymization. International Association of Privacy Professionals (IAPP). https://iapp.org/news/a/cjeu-clarifies-personal-data-definition-in-context-of-pseudonymization
  5. Moore, L., & Hayes, R. (2023, November 13). The art of staying anonymous: Game-changing decision for data anonymisation. William Fry. https://www.williamfry.com/knowledge/the-art-of-staying-anonymous-game-changing-decision-for-data-anonymisation/
  6. Jones Day. (2025, September 10). CJEU clarifies scope of personal data in EDPS v SRB decision. Jones Day Insights. https://www.jonesday.com/de/insights/2025/09/cjeu-clarifies-scope-of-personal-data-in-edps-v-srb-decision
  7. Spajić, D. (2023, June 13). Anonymous vs. pseudonymous data: The CJEU reaffirms the relative approach to the concept of personal data. KU Leuven CiTiP blog. https://www.law.kuleuven.be/citip/blog/anonymous-vs-pseudonymous-data-the-cjeu-reaffirms-the-relative-approach-to-the-concept-of-personal-data/
  8. Open Loop. (2024, April 3). Prototyping Privacy-Enhancing Technologies guidance in Brazil [PDF]. https://openloop.org/reports/2024/04/brazil-report-pets-en.pdf
  9. InsidePrivacy. (2025, February 6). CJEU Advocate General supports pragmatic definition of personal data. https://www.insideprivacy.com/eu-data-protection/cjeu-advocate-general-supports-pragmatic-definition-of-personal-data/
  10. Farah, F. A. (2024). Privacy enhancing technologies [Master’s thesis]. Stockholm University DiVA portal. http://su.diva-portal.org/smash/get/diva2:1880444/FULLTEXT01.pdf
  11. Court of Justice of the European Union. (2025, September 4). Judgment: EDPS v Single Resolution Board (C-413/23 P). CURIA InfoCuria. https://curia.europa.eu/juris/document/document.jsf?text=&docid=303863&pageIndex=0&doclang=EN&mode=req&dir=&occ=first&part=1&cid=16803864