CJEU Confirms the ‘Right to Explainability’ in Automated Decision-Making: What it means for your business

Understanding the context: the GDPR’s demand for transparency in automated decision-making

The application of Article 15(1)(h) GDPR, which requires a controller to provide ‘meaningful information’ to the data subject about automated decision-making (ADM), has long been ambiguous and up for debate.

In its recent judgment in Case C-203/22 Dun & Bradstreet Austria, the CJEU finally shed light on both the existence of a ‘right to explainability’ and the protection of trade secrets.

In this blogpost, we provide an overview of the most important parts of this judgment and how it affects your organisation.

The Dun & Bradstreet Case: When trade secrets clash with transparency

The case involved CK, an individual who was denied a mobile phone contract due to an automated credit assessment performed by Dun & Bradstreet Austria GmbH (D&B). CK requested meaningful information about the logic behind the decision under Article 15(1)(h) GDPR. D&B refused to provide detailed information, citing trade secret protection.

CK took the matter to the Austrian Data Protection Authority, which ruled in her favour, ordering D&B to disclose the relevant logic. D&B appealed, arguing that their scoring system was a trade secret and did not have to be disclosed. The case was eventually referred to the CJEU, which had to determine the extent of transparency required under GDPR and whether trade secrets could limit a data subject’s right to explanation.

CJEU’s Key Ruling: What companies must (and don’t have to) disclose

On 27 February 2025, the CJEU ruled:

  • No obligation to disclose the algorithm itself: Companies do not have to share their algorithm with individuals (para. 59). Instead, they must provide information that is “concise, transparent, intelligible and easily accessible” (Article 12(1) GDPR). The bottom line: explainability, not disclosure of the algorithm. This decision effectively confirms the ‘right to explainability’.
  • Explaining the impact of data changes: Companies must inform individuals how changes in their personal data could have led to a different outcome (para. 62). This ensures people understand how their personal data affects the decision.
  • Balancing transparency and trade secrets: While individuals have a right to information, businesses can withhold details that qualify as trade secrets. However, this is not an automatic exemption (para. 68). The right of access “should not adversely affect the rights or freedoms of others, including trade secrets or intellectual property” (para. 69 & 71, recital 63 GDPR).
  • Supervisory Authority oversight: If a company refuses to disclose information due to trade secret protection (as per above), it might still have to provide the details to a Supervisory Authority (SA) or court, which will assess whether withholding the information is justified (para. 74).

What this means in practice

For businesses

Companies using ADM are under no obligation to share their algorithms, but they do need to explain decision-making in a way that is easy to understand. This means providing clear examples of how different data inputs could lead to different results. The baseline is that an organisation must be able to explain its ADMs.

However, businesses must critically assess whether, and how, they want to share their trade secrets with regulators if requested to do so. Does the regulator provide adequate security for receiving your trade secrets directly? Do you instead offer regulators the possibility to inspect your trade secrets on premise? A delicate question.

For data subjects

People affected by automated decisions now have more clarity on their rights. While they cannot demand access to an algorithm, they can ask for an explanation of how their personal data influenced the decision. Data subjects now possess a ‘right to explainability’. This makes it easier to challenge unfair or incorrect assessments.

For SAs

Supervisory Authorities will play a role in balancing transparency and business confidentiality. They may see an increase in cases where companies refuse to disclose information, requiring a careful assessment of trade secret claims. However, they must also consider how they will process trade secrets and how these can be shared with them securely.

Next steps

  • Develop clear, user-friendly explanations for your automated decision-making processes.
  • Establish a procedure and secure way to share trade secrets with SAs.
  • Ensure privacy policies and responses to access requests reflect these legal clarifications.

The ruling aims to strike a balance between protecting intellectual property and ensuring transparency. While businesses can safeguard their algorithms, they must still provide clear and comprehensible explanations to comply with GDPR, fortifying the rights of the data subject.

Written by

Enzo Marquet

Hi! How can we help?

In need of internal privacy help or an external DPO? Reach out and we’ll look for the best solution together with you.