
EU AI Act for Medical Devices: Understanding the Linkages Between the AI Act, MDR/IVDR, and IEC 62304

For AI-enabled medical devices, three regulatory frameworks now apply simultaneously. Understanding what each requires, and where they overlap, is the starting point for any compliance strategy.


EU Regulations

If your organization develops or markets AI-enabled medical devices in the European Union, you are now operating under three overlapping regulatory frameworks: the Medical Devices Regulation (MDR 2017/745) or In Vitro Diagnostic Regulation (IVDR 2017/746), the EU AI Act (Regulation (EU) 2024/1689), and the software lifecycle standard IEC 62304. None of these frameworks replaces the others. Each addresses a different dimension of risk, and each imposes its own documentation, process, and conformity assessment obligations.

The critical question for regulatory and quality teams is not simply "what does each framework require?" but rather "where does my existing MDR/IVDR and IEC 62304 compliance already address AI Act obligations, and where does it fall short?" Getting that mapping right determines how much net-new work your organization faces — and where to prioritize it.


This post provides a practical framework for thinking through those linkages, drawing on MDCG 2025-6 — the first official joint guidance from the Medical Device Coordination Group and the AI Board on the interplay between MDR/IVDR and the AI Act, published in June 2025.


The Trigger: When Does the EU AI Act Apply?


Not every AI-enabled medical device falls under the AI Act's high-risk framework. The trigger is specific: an AI system is classified as high-risk under Article 6(1) of the AI Act if it meets both of the following conditions:

  • The AI system is the medical device itself, or serves as a safety component of it; and

  • The device requires third-party conformity assessment by a Notified Body under MDR or IVDR.

In practical terms, this means:

Device Classification | Notified Body Required? | AI Act High-Risk?
MDR Class I (non-sterile, non-measuring) | No | No — unless Annex III applies
MDR Class Is / Im (sterile / measuring) | Yes (limited scope) | Yes
MDR Class IIa, IIb, III | Yes | Yes
IVDR Class A non-sterile | No | No
IVDR Class As (sterile) | Yes (sterility only) | Yes
IVDR Class B, C, D | Yes | Yes

An important nuance from MDCG 2025-6: high-risk classification under the AI Act does not change the device's risk class under MDR or IVDR. The two classification systems are independent. A Class IIa device remains Class IIa; the AI Act simply adds a second compliance layer on top.
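The Article 6(1) trigger logic can be expressed as a simple two-condition check. The sketch below is illustrative only — the class labels and the notified-body lookup are simplified assumptions drawn from the table above, not a legal determination for any specific device.

```python
# Device classes that require Notified Body involvement under MDR/IVDR
# (per the table above; MDR Class I and IVDR Class A non-sterile self-certify).
NB_REQUIRED = {
    "MDR Is", "MDR Im", "MDR IIa", "MDR IIb", "MDR III",
    "IVDR As", "IVDR B", "IVDR C", "IVDR D",
}

def is_ai_act_high_risk(device_class: str,
                        ai_is_device_or_safety_component: bool) -> bool:
    """Both Article 6(1) conditions must hold: the AI system is the device
    (or a safety component of it) AND the device requires third-party
    conformity assessment under MDR/IVDR."""
    return ai_is_device_or_safety_component and device_class in NB_REQUIRED

# A Class IIa diagnostic-support algorithm: high-risk under the AI Act
assert is_ai_act_high_risk("MDR IIa", True)
# A self-certified Class I device: outside the Article 6(1) trigger
assert not is_ai_act_high_risk("MDR I", True)
```

Note that the function answers only the AI Act question; as MDCG 2025-6 confirms, the MDR/IVDR risk class itself is unaffected by the result.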


Key Deadline

The AI Act applies to AI embedded in medical devices under Annex I from August 2, 2027. The Digital Omnibus proposal (currently in EU Parliament as of March 2026) may extend this to August 2028, but it is not yet law.


MDR/IVDR, EU AI Act, and IEC 62304: Three Dimensions of Risk


Before examining the overlaps, it helps to understand what each framework is actually trying to regulate.


MDR / IVDR

The MDR and IVDR govern the safety and performance of the device as a product placed on the EU market. They require risk management (aligned with ISO 14971), clinical or performance evaluation, technical documentation, a quality management system, and post-market surveillance. They address patient safety risks arising from the device's intended use.


EU AI Act

The AI Act is a horizontal regulation — it applies across all sectors, not just medical devices. For high-risk AI systems, it imposes AI-specific obligations that MDR and IVDR do not explicitly address: data governance for training datasets, bias assessment, automatic operational event logging, mandatory human oversight design, transparency to deployers, and fundamental rights impact assessment. It is not a product safety law in the traditional sense; it is a technology governance law.


IEC 62304

IEC 62304 governs the software development lifecycle — planning, requirements, architecture, implementation, verification, maintenance, risk management for software hazards, configuration management, and problem resolution. It is a process standard, not a product standard. It tells you how to build software safely; it does not tell you what the software must disclose to users, how training data must be governed, or how the system must behave to allow human intervention.


"IEC 62304 is necessary but not sufficient for AI Act compliance. It addresses the engineering process. The AI Act addresses the ethics, governance, and transparency of the AI system itself."


Where the Frameworks Overlap — and Where They Don't


MDCG 2025-6 explicitly confirms that a single integrated technical documentation file is permitted, and encourages manufacturers to incorporate AI Act requirements into their existing MDR/IVDR QMS and documentation rather than building parallel systems. Article 11(2) of the AI Act provides the legal basis for this.

The practical question is which AI Act obligations your existing compliance genuinely addresses, and which require new work. Based on a systematic analysis of the AI Act's Articles 9–17, Annex IV, and the MDCG 2025-6 guidance, three categories emerge:

AI Act Requirement | MDR / IVDR | IEC 62304 | Net-New Work?
Risk management system (Art. 9) | Explicit | Partial | Extend for AI-specific risks
Training data provenance (Art. 10) | None | None | Yes — entirely new
Bias assessment of training data (Art. 10) | None | None | Yes — entirely new
Integrated technical documentation (Art. 11) | Explicit | Partial | Add AI-specific Annex IV content
AI operational event logging (Art. 12) | None | None | Yes — entirely new
Transparency and IFU (Art. 13) | Explicit | Partial | Add AI-specific content
Human oversight design (Art. 14) | Implicit | Partial | Override mechanisms are new
Cybersecurity (Art. 15) | Explicit | Partial | Adversarial AI testing is new
QMS — AI-specific aspects (Art. 17) | Explicit | Partial | 13 new AI QMS elements
Fundamental rights risk assessment (Art. 9) | None | None | Yes — entirely new
Post-market AI performance monitoring (Art. 72) | Explicit | Partial | AI drift monitoring is new

The pattern that emerges is consistent: MDR/IVDR provides an explicit or implicit foundation for most process-level requirements (risk management, technical documentation, QMS, post-market surveillance, cybersecurity, IFU). IEC 62304 provides the software development process foundation. But the AI Act's most novel obligations — data governance, bias assessment, operational logging, fundamental rights risk, and human oversight design — sit entirely outside both existing frameworks. These are the areas that require genuinely new compliance work.
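For teams running this mapping in practice, the three-tier coverage model lends itself to a simple machine-readable structure. The sketch below mirrors the table above; the requirement labels and tier values are illustrative shorthand, not an official taxonomy.

```python
# Gap map: requirement -> (MDR/IVDR coverage, IEC 62304 coverage),
# using the tiers from the table above (Explicit / Implicit / Partial / None).
gap_map = {
    "Risk management system (Art. 9)":        ("Explicit", "Partial"),
    "Training data provenance (Art. 10)":     ("None", "None"),
    "Bias assessment (Art. 10)":              ("None", "None"),
    "Technical documentation (Art. 11)":      ("Explicit", "Partial"),
    "Operational event logging (Art. 12)":    ("None", "None"),
    "Transparency and IFU (Art. 13)":         ("Explicit", "Partial"),
    "Human oversight design (Art. 14)":       ("Implicit", "Partial"),
    "Cybersecurity (Art. 15)":                ("Explicit", "Partial"),
    "QMS AI-specific aspects (Art. 17)":      ("Explicit", "Partial"),
    "Fundamental rights risk (Art. 9)":       ("None", "None"),
    "Post-market AI monitoring (Art. 72)":    ("Explicit", "Partial"),
}

# Requirements with no foundation in either framework need the most lead time.
net_new = [req for req, (mdr, iec) in gap_map.items()
           if mdr == "None" and iec == "None"]
# net_new: data provenance, bias assessment, event logging, fundamental rights
```

Filtering for the "None / None" rows surfaces exactly the net-new obligations the text identifies as the priority areas for lead time.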


The IEC 62304 Gap in Detail


It is tempting to assume that IEC 62304 compliance — which many manufacturers treat as the primary software quality standard — substantially addresses AI Act requirements. This assumption is worth examining carefully.

IEC 62304 covers software development planning (Cl. 5), risk-related activities for software hazards (Cl. 7), configuration management (Cl. 8), problem resolution (Cl. 9), and software maintenance (Cl. 6). For AI systems, it requires bias documentation as part of the development plan and risk analysis, and allows for AI algorithm-specific verification plans.

What IEC 62304 does not address:

  • Provenance, representativeness, and governance of training, validation, and testing datasets

  • Demographic bias assessment across age, sex, ethnicity, and clinical setting

  • Automatic operational event logging for AI inputs and outputs

  • Explicit human oversight mechanisms — override capability, stop/pause functionality

  • AI-specific transparency disclosures in the IFU (performance metrics, subgroup performance, AI system disclosure)

  • Fundamental rights impact assessment

  • Continuous learning validation and pre-determined change documentation

  • Notified Body designation under the AI Act

In our analysis of 56 AI Act compliance requirements, only one — document control and versioning — is fully addressed by IEC 62304. Approximately half are partially addressed, and just over a third represent genuinely new obligations with no IEC 62304 foundation at all. The conclusion is clear: IEC 62304 compliance is a starting point, not a solution.


The Conformity Assessment Question


One of the most practically important clarifications in MDCG 2025-6 is on conformity assessment. For AI-enabled medical devices classified as high-risk under Article 6(1) of the AI Act — i.e. those requiring Notified Body involvement under MDR/IVDR — the conformity assessment route is determined by MDR/IVDR, not the AI Act. A single integrated assessment is permitted, with the Notified Body evaluating compliance against both frameworks in one process.

The operational implication: your Notified Body must be designated under both MDR/IVDR and the AI Act. Not all currently designated Notified Bodies will hold this scope by August 2027. Confirming your Notified Body's AI Act designation status — and engaging them early on AI-specific technical documentation expectations — is one of the most time-sensitive actions on the compliance roadmap.


Key Deadlines


Feb 2025 & Aug 2025 — Prohibited AI practices and GPAI model obligations already in force.

May 28, 2026 — EUDAMED mandatory for EU market access. Separate from the AI Act — act now if not registered.

August 2, 2027 — Current deadline for AI Act compliance for AI embedded in medical devices (Annex I). Do not plan to the proposed 2028 long-stop — it is not yet law.


Where to Start: A Practical Approach


For most manufacturers, the right starting point is a structured gap assessment that maps current MDR/IVDR technical documentation and IEC 62304 compliance against AI Act Articles 9–17 and Annex IV, using the three-tier MDR/IVDR reference framework from MDCG 2025-6:

  • Inventory your AI portfolio. Identify every AI component across your EU device portfolio, confirm MDR/IVDR risk class, and determine which products meet the Notified Body trigger for AI Act high-risk classification. Class A non-sterile IVDs and self-certifying Class I MDR devices are out of scope.

  • Run a gap assessment against Articles 9–17 and Annex IV. For each requirement, determine whether your existing MDR/IVDR documentation is explicit, implicit, or absent. The net-new obligations — data governance, bias assessment, AI logging, human oversight design, fundamental rights risk — require the most lead time to address.

  • Extend your QMS, not replace it. Article 17 requires 13 AI-specific QMS elements. These should be integrated into your existing ISO 13485 QMS, not built in parallel. The same applies to technical documentation: expand your MDR/IVDR technical file with Annex IV AI content rather than creating a separate document.

  • Confirm your Notified Body's AI Act designation. Ask directly whether your NB is designated or actively seeking designation under the AI Act. If not, understand their timeline. NB capacity constraints are already well-documented in the MDR/IVDR space; AI Act designation adds another layer.

  • Address pre-determined changes early. If your AI model will be updated post-market, document the scope of anticipated changes and their validation approach at the time of initial conformity assessment. MDCG 2025-6 confirms that pre-determined changes documented in the technical documentation do not constitute a substantial modification — the EU equivalent of the FDA's PCCP concept.

  • Don't wait for the Digital Omnibus. The legislative proposal to delay AI Act high-risk obligations for medical devices to August 2028 has not been enacted as of March 2026. Compliance work you do now — QMS extensions, technical documentation, data governance frameworks — will not be wasted under any regulatory outcome.


Conclusion


The EU AI Act does not replace MDR, IVDR, or IEC 62304. It sits alongside them, adding a new layer of AI-specific governance obligations that existing frameworks were not designed to address. The good news is that the integration pathway is well-defined: MDCG 2025-6 provides a clear framework for incorporating AI Act requirements into existing MDR/IVDR systems, and a single integrated conformity assessment is available for devices where Notified Body involvement is already required.

The challenge is that the most novel AI Act obligations — data governance, bias assessment, operational logging, human oversight design — represent genuinely new work. Manufacturers who approach this as an extension of existing MDR/IVDR compliance, rather than a parallel project, will be better positioned to meet the August 2027 deadline without unnecessary duplication of effort.



Download the Free Gap Assessment Checklist

Quarem Consulting LLC has developed a 56-item gap assessment checklist mapping AI Act Articles 9–17 and Annex IV against MDR/IVDR and IEC 62304 — with explicit, implicit, and no-equivalent tiers for every requirement. It includes a live dashboard that updates as you work through each section.

