EU AI Act Compliance: Guidance from the Spanish AI Regulator
On December 10, 2025, the Spanish supervisory authority for the EU AI Act (Agencia Española de Supervisión de Inteligencia Artificial, or AESIA) published a comprehensive set of 16 guidelines and non-binding checklists designed to assist companies in navigating their obligations under the AI Act, which came into force in August 2024.
Overview of the Guidelines
The guidelines address the full range of AI Act compliance topics, from general principles to the detailed requirements applicable to high-risk AI systems.
1. Introduction to the AI Act
Guidelines No. 1 (26 pages) provide an overview of the main principles of the AI Act, including its risk-based approach, under which prohibitions and obligations vary according to an AI system's risk level. The guidelines also clarify the roles of economic operators and key obligations such as AI literacy and transparency requirements.
2. Practical Examples for Understanding the AI Act
Guidelines No. 2 (21 pages) offer real-world examples of AI systems, illustrating how the law’s obligations apply. Scenarios include biometric identification systems in the workplace and AI tools for HR management.
3. Conformity Assessments
Guidelines No. 3 (47 pages) detail the conformity assessment requirements, explaining the assessment process, recommended formats, and the steps for demonstrating compliance.
4. Quality Management Systems
Guidelines No. 4 (44 pages) outline the necessary elements for establishing a quality management system specifically for high-risk AI systems.
5. Risk Management Systems
Guidelines No. 5 (63 pages) describe the essential components required for implementing a risk management system for high-risk AI systems.
6. Human Oversight
Guidelines No. 6 (36 pages) explain how to incorporate human oversight obligations in the design and development of AI systems.
7. Data and Data Governance
Guidelines No. 7 (79 pages) provide instructions on managing data for AI systems, including training, validation, and testing datasets.
8. Transparency
Guidelines No. 8 (56 pages) clarify how to implement the AI Act’s transparency requirements in practice.
9. Accuracy
Guidelines No. 9 (62 pages) focus on compliance with the AI Act’s accuracy requirements, offering specific measures to implement throughout an AI system’s lifecycle.
10. Robustness
Guidelines No. 10 (73 pages) outline necessary measures to ensure the robustness of high-risk AI systems.
11. Cybersecurity
Guidelines No. 11 (79 pages) list cybersecurity measures and provide practical guidance on their implementation.
12. Record Keeping
Guidelines No. 12 (34 pages) assist AI system providers in fulfilling their record-keeping obligations throughout the system’s lifecycle.
13. Post-Market Monitoring
Guidelines No. 13 (38 pages) set out procedures for monitoring AI systems once they are placed on the market.
14. Incident Reporting
Guidelines No. 14 (25 pages) outline the steps to report serious incidents involving high-risk AI systems.
15. Technical Documentation
Guidelines No. 15 (62 pages) detail the content required for the technical documentation of high-risk AI systems.
16. Checklist Manual and Checklists
Guidelines No. 16 (16 pages) include a Checklist Manual with 13 Excel files to help companies document their compliance measures.
These guidelines are a valuable resource for companies that develop, market, or deploy regulated AI systems, offering a practical tool for documenting and reviewing compliance measures. AESIA has confirmed that the guidelines are under ongoing review and may be amended to reflect future regulatory changes.