Artificial intelligence (AI) technologies are transforming multiple sectors, offering unprecedented opportunities for innovation and efficiency. At the same time, the General Data Protection Regulation (GDPR) provides a comprehensive legal framework to ensure that AI is deployed responsibly, respecting the rights of data subjects and promoting trust in emerging technologies.
In response to the growing use of AI across industries, the European Data Protection Board (EDPB) issued a significant opinion on December 18, 2024, at the request of the Irish Data Protection Authority. The opinion addresses the use of personal data in the development and deployment of AI models in compliance with GDPR principles, providing crucial guidance for developers and data controllers alike.
Key Focus Areas of the EDPB Opinion
The EDPB opinion considers:
- When an AI model can be considered anonymous.
- The conditions under which legitimate interest may serve as a lawful basis for processing personal data in AI development or use.
- Legal implications if an AI model is trained using illegally obtained personal data.
- The use of first-party and third-party datasets in AI training.
AI Lifecycle and Data Protection Considerations
The opinion divides the AI lifecycle into two phases:
- Development phase – encompassing model creation, training data collection, preprocessing, model training, and fine-tuning.
- Deployment phase – including the use of the AI model and any operations following development.
The EDPB emphasises that personal data may be processed in any phase, triggering GDPR obligations such as lawful basis assessment, transparency, and accountability measures.
Defining AI Model Anonymity
Determining whether an AI model is anonymous is assessed case-by-case by national data protection authorities. A model is not considered anonymous if:
- It is designed to analyse personal data or provide outputs containing personal data related to identifiable individuals.
- It mimics a specific individual’s voice or behaviour.
- It was trained on datasets containing personal data without sufficient anonymisation.
To qualify as anonymous, the likelihood that any individual represented in the training data, or any other person indirectly affected, can be identified directly or indirectly, including through queries to the model, must be insignificant. Effective technical safeguards must be in place to prevent re-identification, even when model outputs are combined with other data or subjected to advanced extraction techniques.
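The risk of identification "through queries" mentioned above can be made concrete with a toy probe: if a model has memorised a training record verbatim, prompting it with part of that record may cause it to regurgitate the rest. The sketch below is purely illustrative; the model, the record, and the helper function are all hypothetical, and real extraction testing (as the EDPB factors envisage) involves far more systematic attacks.

```python
def leaks_training_record(model_complete, prompt, secret):
    """Toy extraction probe: check whether the model's completion of a
    partial training record reproduces a sensitive detail from it."""
    return secret in model_complete(prompt)

# Hypothetical training record and a toy "model" that memorised it verbatim
# (the failure mode an anonymity assessment must rule out).
training_text = "Patient Jane Doe, DOB 1980-01-01, diagnosis: hypertension"
memorising_model = lambda prompt: training_text if prompt in training_text else ""

leaks_training_record(memorising_model, "Patient Jane Doe", "DOB 1980-01-01")  # True
```

A model trained with effective safeguards would return no such completion, and the probe would come back False.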
Supervisory Authority Assessment Factors
The EDPB provides illustrative, non-exhaustive factors for authorities to consider when evaluating anonymity claims:
- Model design:
  - Data collection limits and source selection.
  - Preprocessing steps such as anonymisation, encryption, and data minimisation.
  - Methodological choices that reduce identifiability.
- Model analysis:
  - Likelihood of re-identification.
  - Adequacy of governance and engineering controls.
- Testing and resilience:
  - Scope, frequency, and quality of internal tests.
  - Robustness against potential attacks.
- Documentation and accountability:
  - Risk assessments, Data Protection Impact Assessments (DPIAs), and Data Protection Officer (DPO) input.
  - Measures taken to reduce identifiability.
  - A clear record of training data sources.
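Two of the preprocessing safeguards listed above, data minimisation and pseudonymisation, can be sketched in a few lines. This is an illustrative example only, not a technique endorsed by the EDPB opinion; the field names and salt are invented, and note that pseudonymised data (unlike anonymised data) remain personal data under the GDPR, because re-identification is still possible with the key.

```python
import hashlib

def minimise(record, allowed_fields):
    """Data minimisation: drop every field not strictly needed for training."""
    return {k: v for k, v in record.items() if k in allowed_fields}

def pseudonymise(record, id_field, salt):
    """Pseudonymisation: replace a direct identifier with a salted hash."""
    out = dict(record)
    out[id_field] = hashlib.sha256(
        (salt + str(record[id_field])).encode("utf-8")
    ).hexdigest()
    return out

# Hypothetical training record with a direct identifier and an unneeded field.
record = {"user_id": "alice@example.com", "age": 34, "phone": "+36 70 000 0000", "label": 1}
cleaned = pseudonymise(minimise(record, {"user_id", "age", "label"}), "user_id", "per-project-salt")
# cleaned keeps "age" and "label", drops "phone", and hashes "user_id".
```

Such steps support, but do not by themselves establish, an anonymity claim; the supervisory-authority factors above look at the whole pipeline, including testing and documentation.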
Practical Implications
The EDPB opinion offers practical guidance for AI developers and data controllers seeking to ensure lawful, responsible AI deployments. It highlights that GDPR-compliant lawful bases must be identified whenever personal data are processed, even in early development stages. It also clarifies the conditions under which an AI model can be considered anonymous and illustrates technical measures to support that determination.
This guidance forms an essential step in aligning AI innovation with data protection principles, ensuring that AI systems are developed responsibly, ethically, and in compliance with European privacy law.
Katona & Partners Law Firm
(Katona & Partner Rechtsanwaltssozietät / Attorneys’ Association)
H-106 Budapest, Tündérfürt utca 4.
Tel.: +36 1 225 25 30
Mobile: +36 70 344 0388
Fax: +36 1 700 27 57
g.katona@katonalaw.com
www.katonalaw.com