The “Authorised Representative” Under the EU AI Act
The duty for market participants established outside the EU to designate an official “representative” on EU territory is a cornerstone of EU digital and product safety regulation. The requirement appears across multiple frameworks, including the GDPR, the Digital Services Act, the NIS2 Directive, the Data Governance Act, and the Medical Devices Regulation, and requires non-EU businesses operating in the EU to establish a readily accessible point of contact for European stakeholders.
Following this established pattern, the European Artificial Intelligence Act (AI Act) likewise requires certain operators based outside the EU to appoint an “authorised representative.”
Concerned Operators: Providers of High-Risk AI Systems and General-Purpose AI Models
Under the AI Act, two types of operators based outside the EU must appoint an “authorised representative”: providers of high-risk AI systems and providers of general-purpose AI models (GPAI models). The first category comprises all entities that develop and place on the EU market AI systems that are either:
- Part of a product covered by certain EU legislation, such as the Machinery Directive or the Medical Devices Regulation (the complete list of covered legislation can be found in the Act’s Annex I); or
- Intended to be used in one of eight designated “high-risk” areas, such as “education and vocational training” or “employment and workers’ management” (the complete list of covered areas can be found in the Act’s Annex III).
GPAI models are defined as AI models that are “trained with a large amount of data using self-supervision at scale, that display significant generality and are capable of competently performing a wide range of distinct tasks and can be integrated into a variety of downstream systems or applications” (Article 3(63)). Large Language Models are prime examples of GPAI models. However, providers that release a GPAI model under a free and open-source licence are exempt from the obligation to appoint a representative, unless the model presents systemic risks (Article 54(6)).
The Set of Duties
Both types of AI providers are subject to largely identical duties (Articles 22 and 54, respectively). Centrally, they must appoint, by written mandate, an authorised representative established in the EU before making their system or model available on the EU market. This mandate must empower the representative to perform at least the following four tasks:
- Verify that the provider has drawn up the necessary technical documentation. According to Article 11, before placing a high-risk system on the market, providers must prepare technical documentation demonstrating compliance, including system training data, architecture, and testing procedures (Article 11, Annex IV). Providers of GPAI models are also required to prepare technical documentation.
- Keep at the disposal of the competent authorities the provider’s contact details, the technical documentation, a copy of the EU declaration of conformity (only for high-risk systems), and, if applicable, an official certificate. These documents must be available for at least 10 years.
- Provide competent authorities, upon a reasoned request, with all the information and documentation necessary to demonstrate the system’s or model’s conformity with the AI Act’s requirements, including the technical documentation and, where applicable, automatically generated logs (Article 21(1)).
- Cooperate with competent authorities in any action they take in relation to the AI system or GPAI model. For GPAI model providers, this duty extends to situations where authorities seek to take action against downstream AI systems integrated with the GPAI model.
In the case of high-risk systems, the representative must also ensure that the system is registered in the respective EU database pursuant to Article 49(1).
Moreover, the mandate must empower the representative to be addressed by the competent authorities, in addition to or instead of the provider, on all issues relating to the regulation’s enforcement. Upon request, the representative must provide a copy of the mandate in one of the official EU languages indicated by the requesting authority. If the representative considers, or has reason to consider, that the provider is acting contrary to its obligations under the AI Act, it must terminate the mandate and immediately inform the relevant market surveillance authority.
Sanctions in the Case of Non-Compliance
The failure to appoint a representative constitutes a case of “formal non-compliance” under Article 83(1), which, if left unaddressed, requires competent authorities to restrict or prohibit the making available of the AI system concerned (Article 83(1)) or the GPAI model (Article 93(1)). It may also incur administrative fines of up to EUR 15 million or 3% of total worldwide annual turnover, whichever is higher (Article 99(4)(b)). Notably, representatives themselves may also be subject to fines, as they fall within the definition of “operator” pursuant to Article 3(8).
Summary and Advice
The obligation to appoint an authorised representative under the AI Act represents a significant compliance requirement for non-EU providers of high-risk AI systems and GPAI models. With severe penalties for non-compliance and the Act’s broad territorial scope, companies providing AI systems or models that may affect EU users should assess their obligations and establish proper representation. The obligations for GPAI model providers apply from 2 August 2025; those for providers of high-risk AI systems follow on 2 August 2026 (Annex III systems) and 2 August 2027 (systems covered by Annex I legislation).