All Eyes on AI Act Exemptions as Ban on Unacceptable-Risk AI Systems Nears
Despite being celebrated as the world’s first comprehensive AI legislation, the European Union’s AI Act has left some questions open, particularly regarding exemptions that allow law enforcement and border control agencies to use otherwise banned AI applications.
On February 2nd, the ban on AI systems that pose an “unacceptable risk” takes effect. The existence of national security exemptions raises the question of whether the AI rulebook will actually be able to safeguard rights.
Banned AI Applications
Banned AI applications with “unacceptable risk” levels include:
- Biometric categorization systems based on sensitive characteristics
- Emotion recognition in the workplace and schools
- Social scoring
- Predictive policing
- Applications that manipulate human behavior
The European Association for Biometrics (EAB) organized a talk with legal experts and industry stakeholders to discuss the use of biometric data in these applications. Abdullah Elbi, a legal researcher at the Centre for IT & IP Law (CiTiP) at KU Leuven, emphasized: “It’s not an absolute prohibition, so it requires a good understanding of the rules.”
The most important factor will be well-reasoned guidelines from the European Commission, market surveillance authorities, and data protection authorities.
Balancing Security and Rights
Although the AI rulebook appears to establish standards for AI applications, it leaves the balancing of security and rights protections to individual EU countries. Governments can decide whether to introduce exceptions allowing real-time remote biometric identification in cases such as serious crimes or the prevention of serious threats like terror attacks.
Elbi noted the possibility of fragmentation across member states in how remote biometric identification systems are used.
Criticism from Rights Groups
The law enforcement carve-outs have drawn criticism from rights groups, who argue they dilute protections against potential abuse. Irina Orssich of the EU’s AI Office stated that EU member states cannot negotiate away aspects of the Act’s implementation. While individual countries can introduce stricter regulations, they cannot relax them.
“You still have a tiny bit of margin in practice because these rules will be enforced by member states’ authorities,” Orssich said.
Conformity Assessments
Another exemption still awaiting detailed rules concerns the conformity assessment of AI systems. Providers must subject their high-risk AI systems, including biometric ones, to a conformity assessment procedure before placing them on the market. However, the AI Act allows certain systems to be put into use without prior conformity assessment in exceptional situations, such as on public security grounds.
Lydia Belkadi, another researcher at KU Leuven, explained that market surveillance authorities may authorize the use of such systems, but that authorization is limited, and law enforcement and civil protection authorities must still complete the conformity assessment procedure afterward.
The Role of Standardization
To ensure compliance with the AI Act, standardization will play a crucial role. The European Committee for Standardization (CEN) and the European Committee for Electrotechnical Standardization (CENELEC) are working to make the relevant standards available by the end of 2025.
Belkadi concluded, “Overall, high-risk AI systems are now subject to a more comprehensive oversight system that includes providers and deployers. These requirements are key to respecting fundamental rights.”
France’s Role in AI Act Exemptions
A recent investigation revealed that the loopholes and national security exemptions in the AI Act are the result of a campaign led primarily by France. The French administration, under President Emmanuel Macron, strategically engineered amendments to allow law enforcement and border agencies to bypass the ban on remote biometric identification in public spaces.
Countries such as Italy, Hungary, Romania, Sweden, the Czech Republic, Lithuania, Finland, and Bulgaria expressed support for France’s maneuvers.
This carve-out could allow climate demonstrations or political protests to come under biometric surveillance if police cite national security concerns. France already experimented with AI-based surveillance during the 2024 Paris Summer Olympics.
Consequences for Vulnerable Populations
While some experts believe these exemptions will have little real-world impact, others caution that the largest effects could be felt by vulnerable populations, who may lack the power to complain. Rosamunde van Brakel, an assistant professor at the Vrije Universiteit Brussel, stated, “In most cases, regulation and oversight only kicks in after the violation has taken place; they do not protect us before.”