Germany Greenlights the EU AI Act: Countdown for Enterprise Compliance
Germany has officially approved draft legislation to implement the EU’s AI Act, designating the Federal Network Agency (Bundesnetzagentur) as the central supervisory authority for AI in the country.
Overview of the Draft Legislation
Under the draft AI Market Surveillance and Innovation Promotion Act (KI-MIG), Germany will establish a national framework to regulate the development and deployment of AI systems. The draft law now proceeds to the Bundestag (the lower house of parliament) and the Bundesrat (the chamber representing the federal states) for approval.
According to Federal Digital Minister Karsten Wildberger, this law aims to implement European requirements in a manner that fosters innovation while creating a streamlined approach to AI supervision focused on economic needs.
Distributed Oversight Model
The Federal Network Agency will act as the central coordinator, market surveillance authority, and notifying authority. The agency already oversees Germany’s implementation of the EU Digital Services Act and supervises major platforms such as Facebook, Instagram, YouTube, TikTok, and X.
Sector-specific AI oversight will remain with established regulators, including the Federal Cartel Office, the Federal Financial Supervisory Authority (BaFin), and data protection authorities at both federal and state levels.
As noted by Sanchit Vir Gogia, chief analyst at Greyhound Research, “The supervisory map has changed shape. It is no longer sensible to think in terms of a single regulator relationship for AI.” Germany’s decision to anchor coordination within the Federal Network Agency gives it a center of gravity, although enforcement power is not centralized.
Challenges for Enterprises
This distributed approach introduces complexity for enterprises, as different AI systems may not go through the same supervisory channels. For instance, a scoring model used in HR or credit decisions might fall under a different regulator than an embedded system in a device. Enterprises will therefore need to develop internal capabilities to classify each AI system and route it to the appropriate supervisory channel.
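The classification-and-routing idea can be sketched in code. This is a minimal, purely illustrative example: the use-case names, risk tiers, and regulator assignments below are assumptions for demonstration, not legal guidance or an actual mapping from the draft law.

```python
# Hypothetical internal routing table: maps an AI use case to an assumed
# risk tier and the supervisory channel it would plausibly face in Germany.
# All entries are illustrative assumptions, not legal classifications.
RISK_ROUTING = {
    "credit_scoring":          ("high-risk", "BaFin"),
    "hr_screening":            ("high-risk", "data protection authorities"),
    "emotion_recognition_hr":  ("prohibited", "n/a"),
    "embedded_device_control": ("high-risk", "market surveillance"),
    "customer_chat_assistant": ("limited-risk", "Federal Network Agency"),
}

def route_system(use_case: str) -> dict:
    """Return the assumed risk tier and supervisory channel for a use case.

    Unknown use cases are flagged for manual triage rather than guessed.
    """
    tier, channel = RISK_ROUTING.get(use_case, ("unclassified", "triage required"))
    return {"use_case": use_case, "tier": tier, "channel": channel}
```

The key design point is the fallback: anything not explicitly classified is routed to manual triage, since misclassifying a high-risk system is the costly failure mode.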
Germany’s regulatory approach mirrors broader trends across the EU, where countries like France and Spain are exploring coordinated decentralization. Italy has enacted a national AI law that maintains existing sector supervision channels.
Industry Response and Concerns
Industry groups have welcomed Germany’s implementation strategy while stressing the need for fundamental reforms to the EU AI Act itself. Sarah Bäumchen, managing director of the German Electrical and Digital Industries Association (ZVEI), expressed support for the Federal Network Agency’s coordinating role but highlighted that the German implementation law cannot address the EU AI Act’s severe shortcomings.
The impending August 2026 deadline raises significant concerns for companies. Bäumchen indicated that key elements, such as harmonized European standards for compliance with high-risk requirements, are currently unavailable. She called for a 24-month extension to prevent companies from delaying or even halting the introduction of AI features.
ZVEI is advocating for industrial AI to be excluded from the Act entirely, arguing that existing regulations already provide the necessary safeguards. The association asserts that the AI Act creates legal ambiguities by failing to align with current product safety law, the Cyber Resilience Act, and the Data Act.
Compliance Priorities for Enterprises
Under the EU AI Act, companies are required to assess the risk levels of their AI systems and implement corresponding transparency and security measures. The regulation prohibits AI systems used for social scoring and bans emotion recognition in workplaces and educational institutions.
Enterprises developing or utilizing high-risk AI systems must adhere to requirements related to transparency, data governance, documentation, robustness, and cybersecurity, with obligations commencing in the next six months.
As the deadline approaches, companies operating in Germany face the immediate challenge of establishing a “functioning compliance operating system.” Most enterprises lack a comprehensive inventory of their AI systems, which includes internal builds, vendor-embedded features, and informal deployments across business units.
Vendor governance represents a critical pressure point, as companies must ensure that suppliers can provide the necessary technical documentation and evidence of conformity assessment.
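The inventory and vendor-governance tasks described above can be sketched as a simple data model. This is an illustrative assumption of what such a record might track; the field names and gap rules are hypothetical, not taken from any regulator's template.

```python
from dataclasses import dataclass

@dataclass
class AISystemRecord:
    """One entry in a hypothetical enterprise AI inventory.

    Covers the three origins the inventory needs to capture:
    internal builds, vendor-embedded features, and informal deployments.
    """
    name: str
    origin: str                      # "internal", "vendor-embedded", or "informal"
    business_unit: str
    risk_tier: str = "unclassified"  # to be filled during risk assessment
    vendor_docs_on_file: bool = False  # conformity/technical docs received?

def compliance_gaps(inventory: list[AISystemRecord]) -> list[str]:
    """Flag systems still missing a risk classification, or vendor-embedded
    systems for which supplier documentation has not yet been collected."""
    return [
        rec.name
        for rec in inventory
        if rec.risk_tier == "unclassified"
        or (rec.origin == "vendor-embedded" and not rec.vendor_docs_on_file)
    ]
```

Even a sketch like this makes the pressure point visible: vendor-embedded systems need supplier evidence on file before a conformity assessment can be supported.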
In financial services, scrutiny will focus on credit scoring and underwriting automation, while employment systems are likely to face complaint-driven enforcement due to their direct impact on individuals. Germany’s implementation includes a central complaint intake pathway, allowing enforcement to be initiated externally rather than relying solely on regulatory initiative.
Germany missed the EU’s August 2, 2025 deadline for establishing national supervisory structures due to early federal elections. More than 1,000 change proposals were considered during the drafting process. The Federal Network Agency had already published AI literacy guidance in June 2025 and established an AI Service Desk in July 2025.