In September 2022, the European Commission introduced the AI Liability Directive (AILD), a proposed framework of civil liability rules that would allow individuals to claim compensation for damage caused by AI systems. Discussions on the directive were paused pending the adoption of the AI Act and an impact assessment report by the European Parliamentary Research Service (EPRS). With the release of that report, debate on the AILD has resumed.
The AI Act takes a risk-based approach to regulating AI systems, introducing safeguards and oversight mechanisms, but it does not give individuals harmed by those systems a private right of action. The AILD aims to close this gap by enabling individuals to seek compensation from providers or deployers of AI systems when they suffer damage caused by an AI system's output or failure, particularly in high-risk cases. The AILD also facilitates the disclosure of evidence and introduces a rebuttable presumption of causality, making it easier to prove the causal link between an AI system's output and the harm suffered.
Despite these potential benefits, progress on the AILD stalled while another legislative proposal, the Revised Product Liability Directive (PLD), advanced. The PLD established broad liability rules for defective products, including software and other digital products, raising the question of whether it rendered the AILD redundant. The EPRS report sought to clarify this issue.
The report recommends proceeding with the AILD and strengthening it. It identifies two key areas where the AILD fills gaps left by the PLD: securing remedies for non-professional users of AI and facilitating redress for damage not covered by the PLD. The significant amendments it suggests include:
- Extending coverage beyond AI systems alone to include general-purpose AI models with systemic risk and other relevant technologies, such as autonomous vehicles.
- Introducing strict liability for providers causing damage through prohibited AI systems.
- Extending the presumption of causality to cover non-compliance with human oversight rules and general-purpose AI models.
- Transforming the AILD into a Software Liability Regulation that applies to software applications generally.
The EPRS report emphasizes that, while product liability is a complex area of law, the AILD offers a distinct opportunity to enforce fundamental rights by giving individuals access to redress for violations affecting life, physical integrity, property, and fundamental rights, an avenue that neither the PLD nor the AI Act provides.
Co-legislators are encouraged to consider these recommendations carefully, as they hold significant potential for creating robust protections for individual rights against harms caused by AI systems.