How Can European Schools Innovate Under The EU AI Act?
Artificial intelligence (AI) is being used in classrooms worldwide, and European schools now need to navigate the EU Artificial Intelligence Act. A recent conference on AI in European schools brought together education leaders to discuss how to adopt AI responsibly while preparing for a future that will inevitably differ from the present.
The EU Artificial Intelligence Act
As of June 2025, key provisions of the EU AI Act are already in effect, with additional obligations set to commence in August 2026. The Act aims to ensure that AI systems are safe, fair, and transparent.
The Act classifies AI systems based on risk levels. Tools that pose an unacceptable threat to rights or safety are banned outright. High-risk systems must adhere to stringent criteria regarding transparency, data governance, human oversight, and security. Even lower-risk systems must comply with new transparency regulations.
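For school IT teams cataloguing their tools, these tiers can be modeled as a simple lookup. The sketch below is a minimal illustration in Python; the tool names and their tier assignments are hypothetical examples, and real classification depends on each tool's specific use case and requires legal review.

```python
from enum import Enum

class RiskTier(Enum):
    UNACCEPTABLE = "banned outright"
    HIGH = "strict transparency, oversight, and security duties"
    LIMITED = "transparency obligations"
    MINIMAL = "no specific obligations"

# Hypothetical examples of how school tools might map to tiers.
# Actual classification requires case-by-case legal review.
tool_tiers = {
    "emotion-recognition proctoring": RiskTier.UNACCEPTABLE,
    "automated exam grading": RiskTier.HIGH,
    "homework-help chatbot": RiskTier.LIMITED,
    "spell checker": RiskTier.MINIMAL,
}

for tool, tier in tool_tiers.items():
    print(f"{tool}: {tier.name} -> {tier.value}")
```

An inventory like this gives a starting point for the audits discussed later, even before formal legal classification.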
The stakes are high: non-compliance could result in fines of up to €35 million or 7% of a company’s global annual turnover, whichever is higher. These rules apply to any provider or deployer whose system reaches users in the EU, regardless of where the firm is based.
Impact on Schools
With the law generating debates, especially in the United States regarding competitiveness and innovation, European schools must find ways to adapt and meet these new standards while prioritizing student learning.
Starting with Urgency
Conference speakers opened with a sense of urgency: education leaders must adopt an innovative mindset to move forward with hope. It is crucial to lead with purpose, acknowledging that change can be emotionally taxing and that teams need empathetic support.
Unpacking the EU AI Act in Education
Experts provided insights on how schools can comply with the EU AI Act. The law does not treat all AI the same; some tools carry minimal risk, while others, like AI systems used for assessing student performance, are considered high-risk and come with stricter regulations.
Compliance is not merely about avoiding penalties; it supports the responsible and transparent use of AI in education. Educators were encouraged to take three key actions: assess, review, and comply. This involves auditing existing AI tools, understanding their functions, and evaluating their impact on student learning.
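The assess–review–comply loop amounts to keeping a structured record per tool. A minimal sketch, assuming invented tool data and field names purely for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class AIToolAudit:
    name: str
    function: str            # what the tool does (assess)
    learning_impact: str     # effect on student learning (review)
    compliant: bool = False  # meets applicable obligations (comply)
    notes: list[str] = field(default_factory=list)

def needs_attention(tools: list[AIToolAudit]) -> list[str]:
    """Return the names of tools whose compliance work is unfinished."""
    return [t.name for t in tools if not t.compliant]

# Hypothetical inventory entries.
inventory = [
    AIToolAudit("GradeBot", "scores essays", "affects final marks"),
    AIToolAudit("LessonPlanner", "drafts lesson plans",
                "teacher-reviewed output", compliant=True),
]
print(needs_attention(inventory))  # -> ['GradeBot']
```

Even a spreadsheet with these four columns serves the same purpose; the point is that every tool gets all three steps applied.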
Strategy Over Hype
Another speaker highlighted the importance of tying AI use to specific educational goals. Schools should not use AI merely for operational efficiencies but to enhance personalized learning experiences. Effective implementation requires careful planning, training, and building trust among teachers.
A practical decision matrix was proposed: for each AI initiative, schools should ask whether it supports learning goals, ensures data safety, and instills teacher confidence. If any of these criteria are not met, the initiative should be reconsidered.
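The matrix above reduces to three yes/no checks that must all pass. A minimal sketch, with hypothetical parameter names standing in for the three criteria:

```python
def evaluate_initiative(supports_learning_goals: bool,
                        ensures_data_safety: bool,
                        teachers_confident: bool) -> str:
    """Apply the decision matrix: all three criteria must hold,
    otherwise the initiative goes back for reconsideration."""
    if supports_learning_goals and ensures_data_safety and teachers_confident:
        return "proceed"
    return "reconsider"

# Example: strong pedagogical case, but data safety unresolved.
print(evaluate_initiative(True, False, True))  # -> reconsider
```

The design choice is deliberate: the criteria are conjunctive, so a single failing answer is enough to pause an initiative rather than weigh it against the others.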
Informed Governance
Effective governance structures are essential for schools to make informed decisions about AI tools. Questions about who approves new technologies and monitors their effects must be addressed. Schools need to be proactive in ensuring leaders understand AI’s implications to avoid letting external forces dictate their strategies.
Mindset Over Tools
The overarching theme of the conference was the importance of adopting a mindset focused on innovation rather than simply acquiring tools. Schools should prioritize setting clear goals, engaging with the entire education community, and being transparent about their AI practices.
In conclusion, the successful integration of AI into European schools hinges on a thoughtful and strategic approach. Schools must leverage existing resources, embrace learning opportunities, and invite community collaboration to navigate this transformative era effectively.
Note: This article provides general information about the EU AI Act and should not be considered legal advice. Educational institutions should consult qualified legal counsel for specific compliance guidance.