Energy Efficiency Compliance in the EU AI Act


The rapid proliferation of AI technologies in recent years has led to a growing demand for energy-intensive data centers and high-performance hardware. Laws seeking to regulate AI, such as the EU AI Act, increasingly include rules regarding energy consumption and transparency that will impact businesses developing or using AI technologies.

Calculating AI’s global energy consumption precisely is challenging because accurate consumption data is not readily available. Nevertheless, the International Energy Agency estimates that by 2026, the global AI industry will have grown exponentially and will consume at least ten times as much energy as it did in 2023. By 2026, electricity consumption by EU data centers is expected to be 30% higher than 2023 levels, as new facilities are commissioned amid increased AI computation and digitalization.

Against this backdrop, emerging AI regulatory laws are seeking to impose rules regarding energy use in the context of AI. One such law is Regulation (EU) 2024/1689 (the EU AI Act), which entered into force on August 1, 2024, although enforcement will take effect in stages over several years. The EU AI Act is designed to provide a common set of rules governing the development and use of AI across the EU (and, in some cases, beyond the EU). It aims to ensure environmental protection while boosting innovation and imposes a number of requirements concerning energy consumption and transparency, many of which are yet to be finalized. Therefore, businesses that develop or use AI are increasingly likely to face energy-related compliance obligations around the world in the coming years.

Transparency Requirements for General-Purpose AI Models

The EU AI Act defines General-Purpose AI Models (GPAI models) as AI models that are trained on large amounts of data using self-supervision at scale, display significant generality, and are capable of competently performing a wide range of distinct tasks. Examples include large language models such as OpenAI’s GPT models, Google’s Gemini, and Meta’s Llama.

Under the EU AI Act, a business that develops a GPAI model (a provider) is required to create and maintain technical documentation, including a breakdown of the energy consumption of that GPAI model. Where the energy consumption is not yet known, providers may estimate the consumption based on the computational resources used. This obligation is limited to the provider of a GPAI model, distinct from deployers, which are businesses that use GPAI models developed by others.
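
The Act does not prescribe how such an estimate should be calculated. The sketch below shows one common back-of-the-envelope approach, multiplying accelerator-hours by an assumed average power draw and a data-center PUE factor; every figure and parameter name is an illustrative assumption, not a requirement of the Act.

```python
# Illustrative sketch only: the EU AI Act does not prescribe a formula, so the
# figures and parameter names below are assumptions for demonstration purposes.

def estimate_training_energy_kwh(gpu_hours: float,
                                 avg_gpu_power_watts: float,
                                 pue: float = 1.5) -> float:
    """Rough training-energy estimate from computational resources used.

    gpu_hours           -- total accelerator-hours consumed during training
    avg_gpu_power_watts -- assumed average power draw per accelerator (W)
    pue                 -- power usage effectiveness of the data center
                           (overhead for cooling, networking, etc.)
    """
    it_energy_kwh = gpu_hours * avg_gpu_power_watts / 1000.0  # W·h -> kWh
    return it_energy_kwh * pue


if __name__ == "__main__":
    # Hypothetical figures: 1,000,000 GPU-hours at 400 W average draw, PUE 1.5.
    energy = estimate_training_energy_kwh(1_000_000, 400, pue=1.5)
    print(f"Estimated training energy: {energy:,.0f} kWh")
```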

The AI Office (an EU regulatory/advisory body) is authorized to demand technical documentation on energy consumption from providers with no prior notice. As a result, businesses developing GPAI models need to keep this documentation regularly updated. Providers of GPAI models launched before August 2, 2025, effectively benefit from a two-year grace period until August 2, 2027, to demonstrate compliance. However, providers who launch their GPAI models on or after August 2, 2025, will face immediate compliance obligations, incentivizing early launches.

Systemic Risk

The EU AI Act designates certain GPAI models as having systemic risk due to their impact on public health, safety, public security, fundamental rights, or society as a whole. Providers of GPAI models that fall within this category face significant additional compliance obligations, including evaluations, assessments, documentation, and security requirements. Energy consumption is one of the factors that may lead to a GPAI model being classified as having systemic risk, providing an incentive to keep energy usage low.

Standards for Energy Efficiency in AI

The EU AI Act requires the EU Commission to work with existing EU standards bodies and other stakeholders to create standards focused on AI. Among other things, these standards will aim to improve resource performance, including energy efficiency and the consumption of other resources, throughout the lifecycle of AI systems and GPAI models.

Currently, these EU standards on AI do not exist and will likely take time to develop. However, the International Organization for Standardization (ISO) and the International Electrotechnical Commission (IEC) are working on international standards for environmental sustainability in AI, with a draft in the approval phase. It remains to be seen whether there will be material overlap and consistency between these standards and those to be created under the EU AI Act.

The EU AI Act mandates the EU Commission to publish periodic reports on the development of standards for energy-efficient deployment of GPAI models. The first report is not due until August 2, 2028, and may contain recommendations for legally binding corrective measures.

Voluntary Codes of Conduct

The EU AI Act requires regulators to facilitate the creation of voluntary codes of conduct that address the impact of AI systems on environmental sustainability, energy-efficient programming, and techniques for efficient design, training, and use of AI. These codes must set clear objectives and key performance indicators to measure their achievement. As of now, none of these codes has been created, so providers of AI systems and GPAI models should remain vigilant for draft codes as they become available.

Evolving Landscape

The EU Commission’s focus on energy efficiency in AI continues to evolve. In 2024, it ran a call for tenders (now closed) to measure and foster energy-efficient and low-emission AI in the EU. The EU Commission aims to explore the current and estimated future carbon footprint of AI systems and develop a framework for measuring compliance with the energy-related objectives of the EU AI Act. Additionally, it is exploring a potential AI energy and emissions label, similar to existing emissions labeling schemes.

The Energy Efficiency Directive

While the EU AI Act is the first attempt at regulating energy efficiency in AI specifically, the relevance of the Energy Efficiency Directive (EED) should not be overlooked. In line with the EU’s 2030 target of reducing greenhouse gas emissions by at least 55% compared to 1990, EU lawmakers revised the EED in 2023 to establish Energy Efficiency First as a fundamental principle of EU energy policy, giving it legal standing for the first time.

The EED requires EU Member States to consider energy efficiency in all relevant policy, planning, and major investment decisions in both energy and non-energy sectors. Although the EED does not explicitly mention AI, it emphasizes energy efficiency in the information and communication technology (ICT) sector, with a particular focus on data centers. Consequently, if a data center hosts AI workloads, both the EU AI Act and the EED may apply, requiring compliance with the energy efficiency requirements of both instruments.

The EED allows Member States to include data centers as end consumers of energy in their energy efficiency efforts. Measures may include obligations to consume renewable energy or reduce total energy consumption for computing power and other utilities. Additionally, data centers with a rated energy input exceeding 1 MW must utilize or recover waste heat, and those with an average annual energy consumption higher than 85 terajoules (TJ) over the previous three years must implement an energy management system for continuous improvement of energy efficiency.
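
To make these two thresholds concrete, the minimal sketch below converts annual consumption from kWh to TJ and flags which obligations are likely triggered. Only the 1 MW and 85 TJ thresholds come from the directive as described above; the facility figures and function name are hypothetical.

```python
# Illustrative sketch: checks the two EED data-center thresholds discussed above.
# The threshold values come from the text; all other figures are assumptions.

KWH_PER_TJ = 1_000_000 / 3.6  # 1 TJ = 277,777.78 kWh (1 kWh = 3.6 MJ)

def eed_obligations(rated_input_mw: float,
                    annual_consumption_kwh: list[float]) -> dict[str, bool]:
    """Return which EED data-center obligations are likely triggered.

    rated_input_mw         -- rated energy input of the data center (MW)
    annual_consumption_kwh -- annual energy consumption for each of the
                              previous three years, in kWh
    """
    avg_tj = (sum(annual_consumption_kwh) / len(annual_consumption_kwh)) / KWH_PER_TJ
    return {
        "waste_heat_recovery": rated_input_mw > 1.0,    # rated input above 1 MW
        "energy_management_system": avg_tj > 85.0,      # average above 85 TJ
    }


if __name__ == "__main__":
    # Hypothetical facility: 2.5 MW rated input, roughly 30 GWh per year.
    print(eed_obligations(2.5, [29_000_000, 30_000_000, 31_000_000]))
```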

As a European directive, the EED must be implemented into national law by the Member States by October 11, 2025. For example, Germany has enacted the Energy Efficiency Act (Energieeffizienzgesetz, EnEfG), which applies to data centers with a capacity of 300 kW or more, mandating a renewable electricity share of 50%, increasing to 100% from January 1, 2027.

The EU Taxonomy Assessment Framework for Data Centres

The European Commission’s Joint Research Centre has published an Assessment Framework for Data Centres to facilitate the assessment of data centers under the EU taxonomy for sustainable activities (EU Taxonomy). This classification system defines criteria for economic activities aligned with a net-zero trajectory and broader environmental goals, including rules for classifying data center-related activities that contribute to climate change mitigation.

Conclusion

The EU AI Act is still in its infancy, and most of its provisions, including those relating to energy consumption and transparency, are not yet in effect. Furthermore, not all EU Member States have implemented the EED into their national laws, and uncertainty remains regarding how national regulators will enforce the EU AI Act’s energy efficiency requirements. Non-compliance with the relevant provisions carries potential penalties of up to the greater of €15 million or 3% of total worldwide annual turnover. Consequently, businesses involved in the development or extensive use of AI should closely monitor developments in this space.
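
For orientation, the "greater of" penalty cap can be illustrated with a trivial calculation; the turnover figure below is purely hypothetical.

```python
# Illustrative only: the EU AI Act caps the relevant fines at the greater of
# EUR 15 million or 3% of total worldwide annual turnover.
def max_penalty_eur(worldwide_annual_turnover_eur: float) -> float:
    return max(15_000_000, 0.03 * worldwide_annual_turnover_eur)

# Hypothetical provider with EUR 2 billion turnover -> EUR 60 million cap.
print(f"{max_penalty_eur(2_000_000_000):,.0f}")
```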
