Author: heise.de
Executive Summary
The Federal Cabinet has adopted its draft legislation to implement the European AI Regulation (AI Act). The AI Market Surveillance and Innovation Promotion Act (AI-MIG) assigns the Federal Network Agency the central supervisory role. The draft still needs to pass the Bundestag and Bundesrat; industry associations, meanwhile, criticize structural deficiencies in the European AI Act itself.
People
- Karsten Wildberger (Federal Digital Minister)
- Sarah Bäumchen (Managing Director ZVEI)
Topics
- Artificial Intelligence (AI)
- EU Regulation
- German Legislation
- Agency Structure
- Compliance and Risk Management
Clarus Lead
The federal government is implementing the European AI Regulation with a new national law. The Federal Network Agency will become the central supervisory authority for AI systems and will coordinate with existing market surveillance authorities. This concentrates supervision but also creates new responsibilities for an already heavily burdened agency. Industry representatives warn of duplicate regulations and rising compliance costs.
Detailed Summary
The German implementation law makes the Federal Network Agency the central coordinating and competence authority for AI supervision in Germany, acting as both market surveillance authority and notifying authority. To avoid duplicate structures, the law is to draw on existing capacities at other authorities such as the Federal Cartel Office, BaFin, the BSI, and the data protection authorities.
Central to the law is its risk-based approach: companies must assess the risk level of their AI systems and take measures appropriate to that level. Applications deemed especially high-risk are prohibited outright, such as emotion recognition in the workplace or facial recognition in public spaces (with exceptions for security authorities). The higher the risk, the stricter the transparency and security requirements.
The selection of the Federal Network Agency is controversial. Data protection officers at the federal and state levels had called for sole responsibility for AI supervision, a solution that would have given the federal states more weight. Industry representatives also criticize "fundamental design flaws" in the European AI Act itself. They point out that sector-specific product legislation (the Machinery Regulation, the Medical Device Regulation) already governs AI safety, and that this double regulation creates legal uncertainty and unnecessary compliance costs.
Key Findings
- The German AI-MIG grants the Federal Network Agency central supervision over AI development and operation
- Risk-based requirements: Higher risks lead to stricter transparency and security requirements
- Prohibited applications: Emotion recognition in work/education and facial recognition in public spaces (with exceptions)
- Federal Network Agency coordinates with existing authorities instead of creating new structures
- Industry associations call for reform due to duplicate regulations and rising compliance costs
Critical Questions
Evidence/Data Quality: Is the decision to appoint the Federal Network Agency as supervisory authority based on a capacity analysis, or was it primarily decided for efficiency reasons? What specific resources are planned?
Conflicts of Interest: Can the Federal Network Agency, which already coordinates the Digital Services Act, ensure independent AI supervision without endangering other regulatory objectives?
Causality/Alternatives: Why was the model favored by the federal states (supervision through the data protection authorities) rejected? What concrete advantages does centralization offer over a federated approach?
Feasibility/Risks: How is the Federal Network Agency to avoid "double regulation" with specialized product ordinances without creating regulatory gaps?
Data Quality: How will risk classifications of AI systems be validated in practice and harmonized among different market surveillance authorities?
Conflicts of Interest: Does the focus on "innovation promotion" in the law's title support a sufficiently critical supervisory stance?
References
Primary Source: AI Act: Federal Government Launches AI Law – heise.de
Verification Status: ✓ 2024
This text was created with the support of an AI model. Editorial Responsibility: clarus.news | Fact-Check: 2024