Is All Pseudonymised Data Personal Data?

On 4 September 2025, the Court of Justice of the European Union (CJEU) issued a landmark judgment in EDPS v. Single Resolution Board (SRB).

The decision addresses one of the most fundamental concepts in EU data protection law: what constitutes “personal data” and how far transparency obligations extend when pseudonymised data is shared between institutions.

Case Background

The dispute originated during the 2017 resolution of a Spanish bank. To determine whether shareholders and creditors were entitled to compensation, the SRB appointed Deloitte as an independent valuer.

As part of the process, affected individuals were invited to submit comments. These submissions were pseudonymised by the SRB using unique codes before being transferred to Deloitte. In other words, Deloitte received the content of the comments but not the identities of the individuals behind them.
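The mechanics of this kind of pseudonymisation can be sketched in a few lines of code. This is a minimal illustration of the general technique, not the SRB's actual procedure; the function name and sample data are hypothetical:

```python
import uuid

def pseudonymise(comments):
    """Replace each author's identity with a unique opaque code.

    Returns the pseudonymised records (what a recipient would see)
    and the key table (retained by the controller, never shared).
    """
    key_table = {}        # code -> identity; stays with the controller
    shared_records = []
    for author, text in comments:
        code = uuid.uuid4().hex          # unique code per submission
        key_table[code] = author
        shared_records.append({"code": code, "comment": text})
    return shared_records, key_table

# The controller (here, the SRB) keeps key_table and can re-identify;
# the recipient (here, Deloitte) sees only codes and comment text.
comments = [("Alice", "The valuation method was flawed."),
            ("Bob", "Creditors were treated unequally.")]
shared, key = pseudonymise(comments)
```

The asymmetry this creates is exactly what the case turns on: the party holding `key_table` can reverse the substitution, while a party holding only `shared` cannot.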

Several participants objected, arguing that Deloitte and Banco Santander had nonetheless gained insight into their legal strategies. They filed a complaint with the European Data Protection Supervisor (EDPS), the EU’s supervisory authority for data protection within Union institutions.

In 2020, the EDPS reprimanded the SRB for failing to identify Deloitte as a recipient of the data. The SRB appealed to the General Court, which in 2022 annulled the reprimand. The General Court found that, from Deloitte’s perspective, the information did not amount to personal data, and that the SRB had therefore not breached its obligations. The EDPS, supported by the European Data Protection Board (EDPB), brought the matter before the CJEU.

The CJEU’s Reasoning

The Court’s decision engages with years of case law on identifiability, from Breyer (IP addresses) to OLAF (press releases), Scania (vehicle identification numbers), and IAB Europe. It draws these strands together while adding several important clarifications.

1. Relative nature of personal data
The Court confirmed that the same dataset can be personal for one party but anonymous for another. From the SRB’s perspective, the comments remained personal data because it retained the key to re-identify individuals. For Deloitte, which had no means of reversing the pseudonymisation, the data was effectively anonymous.

This recognition of relativity is not new but was made explicit: identifiability must always be assessed in light of the means “reasonably likely” to be used by the specific entity in question.

2. Transparency obligations apply at collection
Despite Deloitte’s inability to re-identify individuals, the Court held that the SRB was obliged to inform them that Deloitte would receive their data. Transparency must be assessed at the moment of collection, when the data is clearly personal for the controller.

Because the processing was based on consent, the Court emphasised that individuals must know all potential recipients at the outset to make an informed choice. Even if a recipient cannot identify them, the controller must disclose its involvement.

3. Anonymisation is not absolute
The Court underscored that pseudonymisation or anonymisation does not automatically remove GDPR obligations. Controllers must assess whether third parties could reasonably re-identify individuals, taking into account technical and organisational measures as well as the broader context of the transfer.

4. Broad notion of “information”
Reaffirming earlier rulings, the Court held that personal data is not limited to factual details. It can also include opinions, commentary, or the mere authorship of a document. This ensures that information remains linked to its author even when stripped of explicit identifiers.

Practical Consequences

The judgment has important consequences for EU institutions, private organisations, and emerging technologies:

●      Controllers’ responsibilities: Controllers must assume obligations based on their own ability to re-identify data, regardless of whether recipients can do so. This means that when pseudonymised data is shared, controllers remain responsible for transparency and other GDPR requirements up to the point of transfer.

●      Limits of pseudonymisation: The decision confirms that pseudonymisation reduces risk but does not remove legal responsibility. Organisations relying on privacy-enhancing techniques must recognise that these methods do not automatically take data outside the GDPR’s scope.

●      Third-party assessments: Before transferring data, controllers must consider whether recipients could reasonably re-identify individuals. This may require contractual safeguards, technical measures, or more cautious decisions about data sharing.

●      Implications for AI and data-driven research: By confirming that opinions and authorship constitute personal data, the ruling could influence how training datasets, research materials, and synthetic or machine-generated outputs are treated under data protection law. AI developers and researchers will need to evaluate whether apparently anonymised or synthetic data still qualifies as personal under the Court’s broad approach.

●      Transparency and consent: Where consent is the legal basis for processing, controllers must provide particularly detailed information. This ruling reinforces the principle that data subjects must be able to make an informed choice at the outset, even where recipients cannot realistically identify them.

Looking Ahead

The case now returns to the General Court for a final ruling, but the CJEU’s decision has already set the parameters for how organisations must approach pseudonymisation and transparency.

At its core, the ruling signals three lessons:

  1. Personal data is a relative concept, defined by the perspective and means available to each actor.

  2. Transparency obligations arise at the moment of collection and cannot be avoided by pointing to downstream anonymisation.

  3. Privacy-preserving techniques such as pseudonymisation are useful, but they do not exempt controllers from GDPR duties.

The decision is therefore likely to shape compliance strategies not only for financial institutions and EU bodies, but also for researchers, technology companies, and AI developers across the EU.

Article by Marina Danielyan.
