The EU AI Act introduces a mandatory requirement for certain non-EU providers to appoint an authorised representative established within the European Union. This requirement follows a well-established pattern across EU digital and product safety law, appearing in the GDPR, the Digital Services Act, the NIS2 Directive, the Data Governance Act, and the Medical Devices Regulation. The AI Act extends this logic to AI systems and general-purpose AI models.
For non-EU businesses with AI products deployed in the European market, the authorised representative obligation is a legal gateway to market access, carrying direct regulatory consequences for non-compliance.
Key Definitions
Authorised representative (Article 3(5), Regulation (EU) 2024/1689): A natural or legal person located or established in the Union who has received and accepted a written mandate from a provider of an AI system or a general-purpose AI model to perform and carry out on its behalf the obligations and procedures established by the Regulation.
High-risk AI system: An AI system falling within one of the categories listed in Annex III of the AI Act (such as biometric identification, employment, education, or law enforcement), or constituting a safety component of a product covered by EU harmonisation legislation listed in Annex I.
General-purpose AI model (GPAI model): An AI model trained on large amounts of data using self-supervision at scale that displays significant generality and is capable of competently performing a wide range of distinct tasks.
GPAI model with systemic risk: A GPAI model classified as posing systemic risk by reason of high-impact capabilities, a classification made by the European Commission with the support of the AI Office. Models trained using cumulative compute exceeding 10^25 floating-point operations are presumed to have high-impact capabilities.
Mandate: The written instrument by which a provider confers authority on the authorised representative. The mandate must specify the tasks the representative is empowered to carry out and must be made available to competent authorities upon request in an official EU language.
Who Must Appoint an Authorised Representative
Under the AI Act, two categories of operators established outside the EU must appoint authorised representatives: providers of high-risk AI systems and providers of general-purpose AI models.
The table below sets out the scope of this obligation by operator type.
| Operator type | Legal basis | Trigger for obligation |
|---|---|---|
| Provider of high-risk AI system | Article 22 | System placed on or made available in the EU market |
| Provider of GPAI model (standard) | Article 54 | Model placed on the EU market |
| Provider of GPAI model with systemic risk | Article 54 | Model placed on the EU market |
| Provider of open-source GPAI model | Article 54 | Exempt unless classified as systemic risk |
| EU-established provider | Articles 22 and 54 | No obligation (EU establishment suffices) |
Open-source GPAI models are exempt from the authorised representative requirement, provided the provider publicly discloses the information required under the AI Act. This exemption does not apply where the model is classified as a general-purpose AI model with systemic risk.
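As an illustration only, the scope rules above can be reduced to a small decision function. The parameter names and simplified boolean flags below are my own shorthand, not terms from the Regulation, and the sketch assumes the system or model is placed on (or made available in) the EU market:

```python
def needs_authorised_representative(
    provider_in_eu: bool,
    is_high_risk_system: bool,
    is_gpai_model: bool,
    is_open_source: bool = False,
    has_systemic_risk: bool = False,
) -> bool:
    """Sketch of the Article 22 / Article 54 scope test (illustrative,
    not legal advice). Assumes EU market placement has occurred or is
    intended."""
    if provider_in_eu:
        # EU establishment suffices; no separate representative needed.
        return False
    if is_high_risk_system:
        # Article 22: non-EU providers of high-risk systems always need one.
        return True
    if is_gpai_model:
        # Article 54: open-source models are exempt, but the exemption
        # falls away if the model is classified as posing systemic risk.
        if is_open_source and not has_systemic_risk:
            return False
        return True
    return False
```

For example, a non-EU provider of an open-source GPAI model is exempt until the model is classified as posing systemic risk, at which point the obligation applies in full.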
Authorised Representatives Under the EU AI Act: Application Timeline
The AI Act sets out a phased timeline for appointing an authorised representative, depending on the type of AI system or model:
- 2 August 2025: Providers of GPAI models placed on the market from this date must appoint an authorised representative prior to placing the model on the EU market.
- 2 August 2026: Providers of high-risk AI systems must appoint an authorised representative prior to making the system available on the EU market.
- 2 August 2027: Providers of GPAI models placed on the market before 2 August 2025 must comply.
The appointment must precede market placement. Retroactive appointment does not cure the breach.
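The phased deadlines above can be sketched as a simple date lookup. This is a simplification of my own (it ignores transitional rules for systems already in service and uses the shorthand labels "gpai" and "high_risk"), but it captures the 2025/2027 split for GPAI models and the principle that appointment must precede placement:

```python
from datetime import date

def representative_deadline(operator: str, placed_on_market: date) -> date:
    """Return the latest date by which an authorised representative must
    be in place (illustrative sketch, not legal advice).

    operator: "gpai" or "high_risk" -- my own shorthand labels.
    """
    if operator == "gpai":
        if placed_on_market < date(2025, 8, 2):
            # Legacy models already on the market: comply by 2 Aug 2027.
            return date(2027, 8, 2)
        # New models: the representative must be appointed before
        # placement, so the placement date itself is the outer bound.
        return placed_on_market
    if operator == "high_risk":
        # Obligation applies from 2 Aug 2026; for later placements the
        # appointment must precede the placement date.
        return max(date(2026, 8, 2), placed_on_market)
    raise ValueError(f"unknown operator type: {operator}")
```

So a GPAI model first placed on the market in January 2024 has until 2 August 2027, while one placed in January 2026 must have a representative in place before that placement date.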
Obligations of the Authorised Representative
The specific obligations differ depending on whether the representative acts for a high-risk AI system provider or a GPAI model provider.
For Providers of High-Risk AI Systems (Article 22)
The authorised representative must maintain records for 10 years, including the contact details of the provider, a copy of the EU Declaration of Conformity, the technical documentation, and, where applicable, the certificate issued by the notified body. Upon reasoned request, the representative must provide a national competent authority with all documentation necessary to demonstrate compliance, including access to logs automatically generated by the high-risk AI system where those logs are under the control of the provider. The representative must cooperate with competent authorities in any action they take in relation to the system, in particular to reduce and mitigate the risks it poses, and must assist with registration of the system under Article 49.
The mandate must empower the representative to be addressed by competent authorities in addition to, or instead of, the provider on all compliance matters.
For Providers of GPAI Models (Article 54)
The authorised representative of a provider of a GPAI model must: provide a copy of the mandate to the AI Office upon request; verify that the technical documentation is prepared and that compliance with obligations is fulfilled; keep a copy of the technical documentation for a period of 10 years after the GPAI model has been placed on the market, along with the contact details of the provider; provide the AI Office with all necessary information to demonstrate compliance upon a reasoned request; and cooperate with the AI Office and competent authorities in any action they take concerning the GPAI model.
The table below summarises the key obligations across both roles.
| Obligation | High-risk AI (Art. 22) | GPAI model (Art. 54) |
|---|---|---|
| Hold technical documentation for 10 years | Yes | Yes |
| Verify technical documentation prepared correctly | Yes | Yes |
| Provide documents to authorities on request | Yes (market surveillance authorities) | Yes (AI Office) |
| Cooperate with competent authorities | Yes | Yes |
| Assist with Article 49 registration | Yes | Not applicable |
| Provide copy of mandate upon request | Yes | Yes |
| Be addressed by authorities instead of or alongside the provider | Yes | Yes |
The Written Mandate: What It Must Cover
The mandate is a binding legal instrument governing the relationship between a non-EU provider and its EU representative. A well-drafted mandate should specify:
- The identity of the provider and the representative, with registered addresses;
- The scope of AI systems or GPAI models to which the mandate applies;
- Which Article 22 or Article 54 tasks the representative is authorised to perform;
- The languages in which the mandate can be produced to authorities;
- The data retention arrangements;
- The procedure for termination and notification to authorities;
- Governing law and jurisdiction;
- Obligations on the provider to keep the representative informed of any changes to the AI system or model and of any regulatory correspondence received.
The mandate must be produced to market surveillance authorities or the AI Office upon request. A provider cannot instruct an authorised representative to withhold documents from regulators.
Authorised Representatives Under the EU AI Act: Termination of the Mandate
The authorised representative must terminate the mandate if it considers, or has reason to consider, that the provider is acting contrary to its obligations under the Regulation. Upon termination, the representative must immediately inform the relevant market surveillance authority and, where applicable, the relevant notified body, of the reasons for termination.
This termination mechanism carries significant practical consequences. A representative who becomes aware of non-compliance and fails to terminate the mandate and notify the relevant authority may itself face regulatory scrutiny. Providers should treat the authorised representative relationship as requiring active cooperation and ongoing information-sharing, not passive administration.
Penalties for Non-Compliance
Failure to comply with the obligations of authorised representatives under Article 22 constitutes an infringement subject to administrative fines of up to EUR 15,000,000 or, for an undertaking, up to 3% of total worldwide annual turnover for the preceding financial year, whichever is higher.
Supplying incorrect, incomplete, or misleading information to national authorities or notified bodies is subject to fines of up to EUR 7,500,000 or 1% of total worldwide annual turnover, whichever is higher.
For SMEs, including start-ups, each fine is capped at the lower of the applicable percentage or fixed amount.
| Infringement | Maximum fine (undertaking) |
|---|---|
| Failure to comply with Art. 22 obligations (authorised representative) | EUR 15,000,000 or 3% of worldwide annual turnover |
| Supplying incorrect or misleading information to authorities | EUR 7,500,000 or 1% of worldwide annual turnover |
| Violations of prohibited AI practice rules (Art. 5) | EUR 35,000,000 or 7% of worldwide annual turnover |
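The fine structure above is a "whichever is higher" rule for undertakings, inverted to "whichever is lower" for SMEs. As a purely arithmetical illustration (the function and its parameters are my own, and real penalty decisions turn on many more factors):

```python
def max_fine(
    fixed_eur: int,
    pct: float,
    worldwide_turnover_eur: float,
    is_sme: bool = False,
) -> float:
    """Upper bound of an administrative fine under the AI Act's penalty
    structure (illustrative sketch only).

    Undertakings: the higher of the fixed amount and the turnover
    percentage. SMEs and start-ups: the lower of the two.
    """
    pct_amount = pct * worldwide_turnover_eur
    return min(fixed_eur, pct_amount) if is_sme else max(fixed_eur, pct_amount)
```

For an undertaking with EUR 1 billion worldwide turnover, an Article 22 infringement is capped at max(EUR 15 million, 3% of EUR 1 billion) = EUR 30 million; for an SME with EUR 10 million turnover, the same infringement is capped at min(EUR 15 million, EUR 300,000) = EUR 300,000.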
Choosing an Authorised Representative: Practical Considerations
Appointing a law firm, a compliance service provider, or a local subsidiary as authorised representative each carries different implications. Several factors should be assessed before appointment.
Legal establishment: The representative must be established or located in the EU. Post-Brexit, UK-based entities do not qualify.
Competence: The representative must be capable of understanding and verifying technical documentation and engaging meaningfully with regulators. A representative without sufficient technical and legal knowledge cannot perform the verification obligations the law requires.
Scope of mandate: A single representative can cover multiple AI systems or GPAI models from the same provider, provided the mandate is appropriately drafted.
Regulatory exposure: The representative’s obligation to terminate the mandate and notify authorities means it carries genuine regulatory risk. Professional indemnity insurance and clearly drafted liability provisions in the mandate are advisable.
Record-keeping infrastructure: The 10-year retention obligation requires the representative to maintain a document management system capable of producing technical documentation and declarations of conformity on demand.
How eyreACT Can Help
eyreACT is an EU AI Act compliance automation platform built by European regulatory lawyers with direct experience of EU policymaking. The platform automates the classification, documentation, and ongoing monitoring obligations that the authorised representative role depends on, including technical documentation management, evidence binder generation, and audit-ready record-keeping to the 10-year retention standard.
Most importantly, we now offer a complete Authorised Representative programme for UK, US, Canadian, and other non-EU AI providers, enabling them to trade in EU markets safely and in full compliance with the EU AI Act.
Interested in joining? Non-EU providers preparing for the August 2026 high-risk AI deadline, or GPAI model providers already subject to obligations from August 2025, can request a free pilot. The pilot includes a full AI system classification and risk assessment, a Compliance Binder for one product, and a custom rule engine configured for your AI domain.
Frequently Asked Questions
Does a UK company providing AI systems to EU customers need to appoint an authorised representative?
Yes, if the AI system is high-risk or the company provides a GPAI model placed on the EU market. The United Kingdom is a third country for the purposes of the AI Act. UK establishment does not satisfy the EU establishment requirement.
Can an importer act as the authorised representative?
The AI Act treats authorised representatives as a distinct category from importers. An importer may also serve as the authorised representative if it accepts the written mandate and meets the relevant obligations, but the two roles are legally separate.
Does the authorised representative share liability with the provider?
The AI Act imposes the Article 22 or Article 54 obligations on the authorised representative directly. The representative is subject to penalties in its own right for failure to perform its obligations, including verification and document retention. Liability between provider and representative in private law will depend on the contractual terms of the mandate.
Can a provider change its authorised representative?
Yes. The provider must ensure continuity of coverage. The obligation attaches prior to market placement, and there must be no gap in the representative appointment while the AI system or model remains on the EU market.
What happens when the authorised representative terminates the mandate?
Upon termination, the representative must immediately inform the appropriate national authority and, where applicable, the notified body, explaining the reasons for termination. The provider must appoint a replacement before continuing to make the AI system available on the EU market.
What languages must the mandate be produced in?
The representative must provide a copy of the mandate to market surveillance authorities upon request, in one of the official languages of the institutions of the Union, as indicated by the competent authority.
Are open-source GPAI models always exempt?
Open-source general-purpose AI models are exempt from the authorised representative requirement unless they are classified as general-purpose AI models with systemic risks.
Summary Checklist: Authorised Representatives Under the EU AI Act for Non-EU Providers
| Step | High-risk AI systems | GPAI models |
|---|---|---|
| Confirm system/model type and applicable deadline | Yes | Yes |
| Identify and appoint EU-established representative by written mandate | Before 2 August 2026 | Before 2 August 2025 (new models) |
| Draft mandate covering all Article 22 / Article 54 tasks | Yes | Yes |
| Provide representative access to technical documentation | Yes | Yes |
| Establish 10-year document retention arrangement | Yes | Yes |
| Ensure representative can engage in the required EU official language | Yes | Yes |
| Put in place a process for notifying the representative of system changes | Yes | Yes |
| Plan for mandate termination and replacement procedure | Yes | Yes |
This post sets out the law as it stands in March 2026. It does not constitute legal advice. Providers of AI systems and GPAI models should obtain specific legal advice tailored to their products, deployment arrangements, and market strategy.