March 26, 2026

CE Marking and the EU AI Act: What AI Developers Need to Know

CE marking has governed product safety in the European market for decades. The EU AI Act does not replace it. For AI developers building systems that fall within existing CE marking regimes, the two frameworks now operate in parallel, and understanding where they overlap is a legal and commercial necessity.

This article explains how CE marking works, where the EU AI Act intersects with it, and what AI developers must do to meet both sets of obligations before placing their systems on the EU market.

What CE Marking Is and What It Signals

CE marking is a declaration by a manufacturer or importer that a product meets the applicable EU harmonised legislation before it is placed on the EU market or put into service. It is not a quality mark, and in most cases it is not a certification issued by a third party. It is a self-declaration of conformity, backed by a technical file and, where required, a third-party conformity assessment.

The mark covers a wide range of product categories governed by EU directives and regulations, including the Machinery Regulation (EU) 2023/1230, the Medical Devices Regulation (EU) 2017/745, the Radio Equipment Directive 2014/53/EU, and others. Each of these legislative instruments sets out its own essential requirements, conformity assessment procedures, and documentation obligations.

For AI developers, the significance of CE marking arises when an AI system is embedded in or constitutes a product that already falls within one of these existing regimes.

Key Definitions: CE Marking in Europe

CE Marking: A declaration that a product meets applicable EU harmonised legislation, enabling free movement within the European Economic Area.

Harmonised Standard: A European standard developed by CEN, CENELEC or ETSI under a mandate from the European Commission, compliance with which confers a presumption of conformity with the corresponding legislative requirements.

Notified Body: An organisation designated by an EU member state to assess whether a product meets certain legal requirements before it can be placed on the EU market.

Declaration of Conformity: A formal document in which the manufacturer declares that a product meets all applicable EU requirements.

High-Risk AI System: An AI system listed in Annex III of the EU AI Act, or embedded in a product covered by the safety legislation listed in Annex I of the Act.

Technical Documentation: The set of documents a manufacturer must compile to demonstrate product conformity, covering design, development, risk assessment and testing.

Provider: Under the EU AI Act, the natural or legal person that develops an AI system or has it developed and places it on the market or puts it into service under their own name or trademark.

Deployer: Under the EU AI Act, the natural or legal person that uses an AI system under their own authority, other than for personal non-professional use.

How the EU AI Act Intersects with CE Marking

The EU AI Act introduces a risk-based classification system for AI systems. Systems classified as high-risk under Annex III, or embedded in products covered by the Union harmonised legislation listed in Annex I, must comply with the requirements set out in Chapter III of the Act before they can be placed on the EU market.

Annex I of the EU AI Act lists the existing EU product safety legislation that already requires CE marking. These include the Machinery Regulation, the Medical Devices Regulation, the In Vitro Diagnostic Medical Devices Regulation (EU) 2017/746, the Radio Equipment Directive, and several others.

Where an AI system is a safety component of a product covered by any of these instruments, or where the AI system itself constitutes such a product, it is classified as high-risk under Article 6(1) of the Act.

This creates a dual compliance obligation. The product must meet the requirements of the relevant sectoral legislation and carry a CE mark. It must also meet the requirements of the EU AI Act applicable to high-risk AI systems. Both sets of obligations must be satisfied before the product is placed on the market.

The Dual Conformity Assessment Problem

For AI developers, the most operationally significant consequence of this overlap is the conformity assessment process. Under many CE marking regimes, a notified body must be involved in assessing conformity before the CE mark can be affixed.

Under the EU AI Act, high-risk AI systems listed in Annex III are generally subject to self-assessment based on internal control, with limited exceptions (notably certain biometric systems where harmonised standards are not fully applied). Where an AI system falls under Annex I, however, the conformity assessment procedure of the relevant sectoral legislation applies.

Article 43 of the EU AI Act addresses this directly. Where the applicable Union harmonised legislation listed in Annex I already requires the involvement of a notified body, the same notified body may carry out a conformity assessment covering both the sectoral requirements and the AI Act requirements, provided the notified body has been designated for the relevant legislation.

This is intended to avoid duplication, though in practice the designation of notified bodies under the AI Act is ongoing and developers should verify the current status of their preferred notified body.

Where self-assessment is permitted under both regimes, the developer must still prepare separate technical documentation satisfying the requirements of each framework, prepare a Declaration of Conformity referencing all applicable legislation, and register the system in the EU database for high-risk AI systems established under Article 71 of the Act.

CE Marking Conformity Assessment Routes for AI Developers

AI system embedded in a medical device: CE marking required under MDR 2017/745; high-risk under Annex I of the AI Act; notified body required for both the MDR and the AI Act (where designated).

AI system embedded in machinery: CE marking required under the Machinery Regulation 2023/1230; high-risk under Annex I of the AI Act; notified body involvement depends on the machinery category.

Standalone AI system in Annex III (e.g. a CV screening tool): no CE marking required; high-risk under Annex III of the AI Act; no notified body, self-assessment permitted.

General-purpose AI model with systemic risk: no CE marking required; Chapter V obligations apply; no notified body, but model evaluation is required.

Minimal-risk AI system: no CE marking required; no mandatory AI Act requirements; no notified body.

CE Marking Technical Documentation: What Must Be Prepared

AI developers subject to both CE marking requirements and the EU AI Act must prepare two distinct but related sets of technical documentation.

Under CE marking regimes, technical documentation typically covers the product description and intended use, design and manufacturing drawings, list of applicable harmonised standards, risk assessment, test reports, and the Declaration of Conformity. The precise requirements vary by directive or regulation.

Under Article 11 and Annex IV of the EU AI Act, technical documentation for high-risk AI systems must include a general description of the system and its intended purpose, a description of the elements of the AI system and its development process, information on monitoring, functioning and control, a description of the risk management system, data governance measures, information on human oversight measures, and details of accuracy, robustness and cybersecurity metrics.

These two sets of documentation must be maintained separately but should be prepared in a coordinated manner. Inconsistencies between the two files, particularly in the description of intended purpose and risk, create compliance exposure during market surveillance.
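One way to operationalise that coordination is an automated consistency check over the shared fields of the two files. The sketch below is illustrative only: the field names (`intended_purpose`, `risk_classification`) and the dictionary representation are hypothetical, and real Annex IV and sectoral documentation have their own structures.

```python
def find_inconsistencies(ce_file: dict, ai_act_file: dict,
                         shared_fields=("intended_purpose", "risk_classification")) -> list[str]:
    """Flag shared fields whose values differ between the two documentation sets."""
    issues = []
    for field in shared_fields:
        ce_value = ce_file.get(field)
        ai_value = ai_act_file.get(field)
        if ce_value != ai_value:
            issues.append(f"{field}: CE file says {ce_value!r}, AI Act file says {ai_value!r}")
    return issues

# Hypothetical example: the intended-purpose wording has drifted between files
ce_doc = {"intended_purpose": "automated triage of radiology images",
          "risk_classification": "MDR class IIb"}
ai_doc = {"intended_purpose": "automated diagnosis of radiology images",
          "risk_classification": "MDR class IIb"}

for issue in find_inconsistencies(ce_doc, ai_doc):
    print(issue)
```

Even a check this simple catches the drift between "triage" and "diagnosis", which is exactly the kind of intended-purpose inconsistency that attracts attention during market surveillance.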

Post-Market Obligations and CE Marking

CE marking does not end at the point of placing a product on the market. Most CE marking regimes require post-market surveillance, incident reporting, and periodic review of technical documentation when the product undergoes substantial modification.

The EU AI Act adds further post-market obligations for high-risk AI systems. Under Article 72, providers must establish and maintain a post-market monitoring system and collect and analyse data on the performance of the AI system throughout its lifetime. Under Article 73, providers must report serious incidents, including malfunctions that constitute a breach of obligations under Union law, to market surveillance authorities.

For AI developers whose systems are embedded in CE-marked products, these post-market obligations layer on top of existing sectoral reporting requirements. A single incident may trigger notification obligations under both the sectoral legislation and the AI Act, potentially to different authorities and within different timeframes.
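A minimal sketch of that fan-out, assuming a hypothetical internal incident-handling tool: a single incident record is mapped to every reporting channel its product's regimes trigger. The regime labels and authority names below are placeholders; the actual authorities and deadlines must be taken from the applicable legislation (e.g. Article 73 of the AI Act and the MDR vigilance rules).

```python
def notification_obligations(product_regimes: set[str]) -> list[dict]:
    """Return the reporting channels triggered by a serious incident,
    one entry per applicable framework."""
    obligations = []
    if "ai_act_high_risk" in product_regimes:
        obligations.append({"framework": "EU AI Act Art. 73",
                            "authority": "market surveillance authority"})
    if "mdr" in product_regimes:
        obligations.append({"framework": "MDR vigilance",
                            "authority": "competent authority for medical devices"})
    return obligations

# An AI-enabled medical device incident triggers both channels
for o in notification_obligations({"ai_act_high_risk", "mdr"}):
    print(o["framework"], "->", o["authority"])
```

The design point is that the incident record is entered once and the routing table decides who must be notified, so neither channel can be forgotten under deadline pressure.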

Substantial Modification and the Obligation to Re-Assess

One area that creates particular compliance risk for AI developers is the concept of substantial modification. Under CE marking regimes, a substantial modification to a product may require a new conformity assessment and re-affixing of the CE mark.

Under Article 43(4) of the EU AI Act, substantial modification of a high-risk AI system, including changes that affect its compliance with the Act or changes to its intended purpose, triggers a new conformity assessment obligation.

AI systems present a specific challenge in this context because the performance of a system may change over time as a result of continued learning or updates, even without deliberate modification by the developer. Developers must establish internal processes for monitoring whether updates or changes to their systems constitute a substantial modification under both regimes, and must document the outcome of that assessment as part of their ongoing technical file maintenance.
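The internal change-log process described above can be sketched as follows. The criteria paraphrase the indicators discussed in this article and are not a legal test; each real change still needs a documented case-by-case assessment.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class ChangeRecord:
    when: date
    description: str
    changes_intended_purpose: bool = False
    affects_compliance: bool = False
    performance_shift_beyond_assessed: bool = False

def is_potentially_substantial(change: ChangeRecord) -> bool:
    """Flag changes that may require a new conformity assessment
    under either the AI Act or the sectoral CE marking regime."""
    return (change.changes_intended_purpose
            or change.affects_compliance
            or change.performance_shift_beyond_assessed)

# Hypothetical change log entries
log = [
    ChangeRecord(date(2026, 3, 1), "UI copy update"),
    ChangeRecord(date(2026, 4, 12), "retrained model; recall shifted on a subgroup "
                 "beyond the range assessed at conformity",
                 performance_shift_beyond_assessed=True),
]

flagged = [c for c in log if is_potentially_substantial(c)]
for c in flagged:
    print("Re-assessment review needed:", c.description)
```

Keeping the outcome of each assessment in the log, including the negative ones, is what turns this into usable evidence in the technical file.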

Key Deadlines for EU AI Act

Prohibited AI practices banned: 2 February 2025
GPAI model obligations apply: 2 August 2025
High-risk AI systems (Annex III) obligations apply: 2 August 2026
High-risk AI systems (Annex I, CE marking products) obligations apply: 2 August 2027
Obligations for AI systems in existing CE-marked products already on the market: 2 August 2027, with transition provisions

The extended deadline for Annex I products reflects the additional complexity of integrating AI Act requirements into existing CE marking regimes. Developers of AI systems embedded in CE-marked products have additional time, but that time should be used to map existing technical documentation against AI Act requirements rather than treated as a reason to delay preparation.
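For planning purposes, the deadline table above can be turned into a simple lookup. The dates below are taken from the table in this article; always verify against the Official Journal text before relying on them.

```python
from datetime import date

AI_ACT_DEADLINES = {
    "prohibited_practices": date(2025, 2, 2),
    "gpai_obligations": date(2025, 8, 2),
    "high_risk_annex_iii": date(2026, 8, 2),
    "high_risk_annex_i": date(2027, 8, 2),
}

def days_remaining(obligation: str, today: date) -> int:
    """Days until the obligation applies (negative once it is in force)."""
    return (AI_ACT_DEADLINES[obligation] - today).days

print(days_remaining("high_risk_annex_i", date(2026, 3, 26)))
```

A negative result means the obligation is already in force, which is the state every Annex III obligation reaches well before the Annex I deadline arrives.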

Frequently Asked Questions

Does CE marking mean an AI system is automatically compliant with the EU AI Act?

No. CE marking under sectoral legislation demonstrates conformity with that legislation only. Compliance with the EU AI Act must be demonstrated separately, through its own conformity assessment process and technical documentation requirements.

If my AI system is not embedded in a CE-marked product, do CE marking obligations apply?

Not directly. If your AI system is a standalone software product that does not fall within the scope of any Union harmonised legislation listed in Annex I of the EU AI Act, CE marking obligations do not apply. Your AI Act obligations depend on the risk classification of your system under Annex III or the GPAI provisions.

Can the same notified body assess conformity under both CE marking legislation and the EU AI Act?

In principle, yes, under Article 43 of the EU AI Act, provided the notified body has been designated for the relevant legislation under both regimes. In practice, AI Act notified body designations are still being processed across member states and developers should verify current designations before proceeding.

What constitutes a substantial modification that triggers re-assessment?

Neither the EU AI Act nor most CE marking regimes provide an exhaustive definition. Article 3(23) of the AI Act defines a substantial modification as a change not foreseen or planned in the initial conformity assessment that affects compliance with the Act's requirements or results in a modification of the intended purpose. In practice, a change to the intended purpose, a change that affects compliance with the Act's requirements, or a modification that affects the system's performance beyond what was anticipated in the original conformity assessment are all indicators. Developers should maintain a documented change log and assess each update against these criteria.

Must the Declaration of Conformity reference both the sectoral legislation and the EU AI Act?

Yes, where both apply. The Declaration of Conformity must list all applicable Union legislation under which conformity is declared. A Declaration referencing only the sectoral legislation, without reference to the EU AI Act, is incomplete for products placed on the market after the relevant AI Act deadline.

Where must high-risk AI systems be registered?

Under Article 71 of the EU AI Act, providers of high-risk AI systems listed in Annex III must register their systems in the EU database before placing them on the market. Systems covered by Annex I legislation that already appear in existing sectoral databases may be subject to modified registration requirements, the details of which are being developed in implementing acts.


This article is for informational purposes and does not constitute legal advice. AI Act implementation details continue to develop through delegated and implementing acts. Developers should seek qualified legal advice on their specific obligations.