Cross-Border Data Transfers for AI Development Post-Schrems II: Balancing GDPR and AI Act Requirements
(2025) JAEM01 20251
Department of Law
Faculty of Law
- Abstract
- Artificial intelligence (AI) development thrives on global datasets, yet every outbound
transfer of EU personal data must now satisfy two demanding regimes: the General Data
Protection Regulation (GDPR) and the Artificial Intelligence Act (AIA). The GDPR lets
exporters rely on adequacy decisions, standard contractual clauses, or other Articles 44–49
tools only if they can guarantee an “essentially equivalent” level of protection outside the
EEA. The AIA then layers risk-based obligations, technical documentation, automated
logging, bias testing, and meaningful human oversight onto high-risk AI systems, and these
obligations follow the system wherever its servers or operators sit, even outside the EEA.
The thesis investigates how, after the European Court of Justice’s Schrems II judgment
invalidated the EU-US Privacy Shield and intensified scrutiny of data exports, EU-based
providers can still train, fine-tune and operate high-risk AI systems on global datasets without
breaching either regime.
The analysis traces a recurring “safety chain”. It begins with a GDPR data protection impact
assessment (DPIA) that justifies any export of personal data; moves to a transfer impact
assessment (TIA) that benchmarks third-country surveillance powers against EU standards
and adds encryption, pseudonymisation or other supplementary measures where gaps appear;
and culminates in the AIA’s fundamental rights impact assessment (FRIA), which merges
those findings into AI-specific obligations such as bias testing, tamper-proof logging and
human oversight.
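To make the sequencing concrete, the sketch below models the safety chain as three gating checks that must all pass before a transfer proceeds. It is a minimal illustration only: the class names, fields and pass/fail logic are assumptions distilled from this abstract, not an API or compliance tool defined in the thesis.

```python
# Illustrative sketch of the DPIA -> TIA -> FRIA "safety chain" as three
# sequential gates. All names and fields are hypothetical.
from dataclasses import dataclass, field

@dataclass
class DPIA:
    """GDPR data protection impact assessment justifying the export."""
    purposes: list[str]
    export_justified: bool

@dataclass
class TIA:
    """Transfer impact assessment benchmarking third-country law."""
    destination: str
    surveillance_gap: bool  # do third-country powers exceed EU limits?
    supplementary_measures: list[str] = field(default_factory=list)

@dataclass
class FRIA:
    """AIA fundamental rights impact assessment for a high-risk system."""
    bias_testing: bool
    tamper_proof_logging: bool
    human_oversight: bool

def safety_chain(dpia: DPIA, tia: TIA, fria: FRIA) -> list[str]:
    """Return the blockers that stop a transfer; an empty list means proceed."""
    blockers = []
    if not dpia.export_justified:
        blockers.append("DPIA: export of personal data not justified")
    if tia.surveillance_gap and not tia.supplementary_measures:
        blockers.append(f"TIA: {tia.destination} gap without supplementary measures")
    for name, ok in [("bias testing", fria.bias_testing),
                     ("tamper-proof logging", fria.tamper_proof_logging),
                     ("human oversight", fria.human_oversight)]:
        if not ok:
            blockers.append(f"FRIA: missing {name}")
    return blockers

if __name__ == "__main__":
    blockers = safety_chain(
        DPIA(purposes=["model fine-tuning"], export_justified=True),
        TIA(destination="US", surveillance_gap=True,
            supplementary_measures=["encryption", "pseudonymisation"]),
        FRIA(bias_testing=True, tamper_proof_logging=True, human_oversight=True),
    )
    print("Transfer may proceed" if not blockers else blockers)
```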
Because the AIA’s obligations travel with the system, documentation, logs and encryption
keys must remain demonstrably under EU-level control even when the servers are located
outside the EEA, ensuring that geography never dilutes EU Charter rights. A case study of
Microsoft’s EU Data Boundary shows how regional cloud strategies can reduce, but not
eliminate, the need for strong contractual shields against third-country disclosure orders.
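A minimal sketch of that key-control pattern follows, assuming a hypothetical EU-resident key vault (modelled here as a plain dict) and using the third-party cryptography package's Fernet symmetric scheme for brevity; a real deployment would use a managed EU-operated HSM/KMS and envelope encryption.

```python
# Illustrative only: data is encrypted inside the EEA and only ciphertext
# crosses the border, so a third-country disclosure order served on the host
# yields unreadable bytes. EU_KEY_VAULT is a stand-in for a real EU-resident
# key management service; requires `pip install cryptography`.
from cryptography.fernet import Fernet

EU_KEY_VAULT: dict[str, bytes] = {}  # hypothetical EU-controlled key store

def export_record(record_id: str, plaintext: bytes) -> bytes:
    """Encrypt inside the EEA; the key never leaves the EU vault."""
    key = Fernet.generate_key()
    EU_KEY_VAULT[record_id] = key         # key retained under EU control
    return Fernet(key).encrypt(plaintext)  # only ciphertext is transferred

def read_back_in_eu(record_id: str, ciphertext: bytes) -> bytes:
    """Decryption is only possible where the EU vault is reachable."""
    return Fernet(EU_KEY_VAULT[record_id]).decrypt(ciphertext)

blob = export_record("subject-42", b"pseudonymised training example")
assert read_back_in_eu("subject-42", blob) == b"pseudonymised training example"
```

The design point is that the foreign host only ever holds ciphertext: without cooperation from the EU key holder, a disclosure order against the host cannot reach the plaintext.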
The thesis concludes that GDPR and AIA safeguards operate as a single, integrated
compliance stack: when built into system architecture from day one, they allow Europe
to participate fully in global AI research while preserving the fundamental rights standards
distilled by Schrems II.
Please use this URL to cite or link to this publication:
http://lup.lub.lu.se/student-papers/record/9203269
- author
- Virtanen, Ella Aurora Tuulikki LU
- supervisor
- Ana Nordberg LU
- course
- JAEM01 20251
- year
- 2025
- type
- H1 - Master's Degree (One Year)
- language
- English
- id
- 9203269
- date added to LUP
- 2025-06-25 08:58:03
- date last changed
- 2025-06-25 08:58:03
@misc{9203269, author = {{Virtanen, Ella Aurora Tuulikki}}, language = {{eng}}, note = {{Student Paper}}, title = {{Cross-Border Data Transfers for AI Development Post-Schrems II: Balancing GDPR and AI Act Requirements}}, year = {{2025}}, }