AIB 2025-1 MDCG 2025-6: A Guiding Document for AI-Powered Medical Devices

The AIB 2025-1 MDCG 2025-6 guidance document represents a major milestone for medical device manufacturers across the European Union. It clarifies the interplay between the Medical Device Regulation (MDR), the In Vitro Diagnostic Medical Device Regulation (IVDR), and the newly adopted Artificial Intelligence Act (AIA), offering comprehensive direction for compliance in this emerging landscape.

Scope of application and classification

Manufacturers of AI-enabled medical devices must first assess whether their products fall under both the MDR/IVDR and the AIA. If the device includes a safety component subject to third-party (Notified Body) assessment and serves a medical purpose, it is considered a high-risk AI system under the AIA. The AIA’s high-risk classification does not alter the MDR/IVDR risk class but imposes additional obligations.

| Classification | Notified Body involved? | AIA high-risk (Art. 6(1)) conditions fulfilled? |
| --- | --- | --- |
| MDR Class I (non-sterile, non-measuring, non-reusable surgical) | No | No |
| MDR Class I (sterile, measuring, reusable surgical) | Yes | Yes |
| MDR Class IIa, IIb, III | Yes | Yes |
| MDR Annex XVI | Yes | Yes |
| IVDR Class A (non-sterile) | No | No |
| IVDR Class A (sterile) | Yes | Yes |
| IVDR Class B, C, D | Yes | Yes |
| In-house device according to Art. 5(5) MDR/IVDR | No | No |

Dual Compliance Requirements: MDR/IVDR + AIA

For high-risk AI systems, manufacturers must ensure simultaneous compliance with both the MDR/IVDR and the AIA. These regulations are designed to function in a complementary manner. 

A. Quality Management System (QMS)

An existing QMS based on ISO 13485 should be updated to align with AIA requirements. In particular, it must address:

  • Data governance processes;
  • Human oversight mechanisms;
  • AI-specific elements of the software lifecycle. 

B. Risk Management

Risk management should not only address device malfunctions and patient safety but also:

  • Impacts on fundamental rights;
  • Systemic bias risks;
  • Uncertainties related to data quality. 

C. Data Governance

The AIA requires that data be relevant, representative, properly documented, and, to the best extent possible, free of errors and bias. Training, validation, and testing datasets must be clearly separated, and each must be individually justified.
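
The required separation of training, validation, and testing data can be illustrated with a minimal sketch. The hash-based grouped split below is one common technique, not a method prescribed by the guidance; the patient IDs and split fractions are invented for illustration.

```python
import hashlib

def assign_split(patient_id: str, train=0.7, val=0.15) -> str:
    """Deterministically assign a patient to train/validation/test.

    Hash-based assignment keeps every record from one patient in a
    single split, so test data cannot leak into training.
    """
    # Map the patient ID to a stable fraction in [0, 1].
    digest = hashlib.sha256(patient_id.encode()).hexdigest()
    fraction = int(digest[:8], 16) / 0xFFFFFFFF
    if fraction < train:
        return "train"
    if fraction < train + val:
        return "validation"
    return "test"

# Hypothetical patient records for demonstration only.
records = [{"patient": f"P{i:04d}", "label": i % 2} for i in range(1000)]
splits = {"train": [], "validation": [], "test": []}
for rec in records:
    splits[assign_split(rec["patient"])].append(rec)

# Document the actual split sizes for the technical file.
for name, rows in splits.items():
    print(name, len(rows))
```

Because the assignment is deterministic, the same patient always lands in the same split, which makes the separation auditable after the fact.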

D. Technical Documentation

The technical file must include software architecture, algorithmic logic, data sources, performance metrics, and system limitations. AIA also requires documentation of explainability, transparency, and user comprehension.

Performance Testing Requirements for AI-Powered Medical Devices

The EU Artificial Intelligence Act (AIA), in conjunction with the Medical Device Regulation (MDR) and In Vitro Diagnostic Regulation (IVDR), imposes enhanced requirements for performance testing of AI-based medical devices. The MDCG 2025-6 guidance offers manufacturers, Notified Bodies, and regulators a comprehensive roadmap for this process.

1. Scope and Obligation of Performance Testing

Devices containing Medical Device Artificial Intelligence (MDAI) fall under both MDR/IVDR and AIA:

·       MDR/IVDR: Requires testing for clinical performance, safety, and effectiveness.

·       AIA: Mandates evaluation of accuracy, robustness, transparency, and impact on fundamental rights.

Performance testing under AIA expands beyond traditional software testing to include ethical, social, and legal dimensions.

2. Key Performance Domains

AIA and MDR/IVDR collectively require testing across the following areas:

·       Algorithmic accuracy: Are AI outputs reliable?

·       Stability and robustness: Is performance consistent across varying conditions?

·       Generalizability: Does the model maintain accuracy across diverse datasets?

·       Response time and processing efficiency: Is the system viable for real-time clinical use?

·       Fundamental rights and ethical risk: Could outputs cause discrimination or harm?
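
As a minimal illustration of the generalizability and fairness checks above, per-subgroup accuracy can be computed and compared against an acceptance threshold. The subgroup names, toy data, and threshold here are illustrative assumptions, not values from the guidance.

```python
from collections import defaultdict

def subgroup_accuracy(cases):
    """Compute accuracy separately for each demographic subgroup."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for group, prediction, truth in cases:
        total[group] += 1
        correct[group] += int(prediction == truth)
    return {g: correct[g] / total[g] for g in total}

# Toy (group, prediction, ground truth) triples.
cases = [
    ("adult",   1, 1), ("adult",   0, 0), ("adult",   1, 0), ("adult",   1, 1),
    ("elderly", 1, 1), ("elderly", 0, 1), ("elderly", 0, 0), ("elderly", 0, 1),
]
scores = subgroup_accuracy(cases)
THRESHOLD = 0.70  # illustrative acceptance threshold
flagged = [g for g, acc in scores.items() if acc < THRESHOLD]
print(scores, flagged)
```

A subgroup falling below the threshold (here, the "elderly" group) would indicate a generalizability or bias problem that needs investigation before release.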

3. Quality of Testing Datasets

Datasets must meet the following criteria:

·       Representativeness: Reflect the intended patient population.

·       Impartiality: Avoid gender, ethnic, or age bias.

·       Accuracy: Minimize missing, noisy, or erroneous data. 

·       Transparency: Document the origin, context, and conditions of data collection.
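
Parts of these dataset criteria can be checked automatically. The sketch below counts missing values and summarizes one demographic balance; the field names and toy records are invented for illustration.

```python
def quality_report(rows, required_fields):
    """Count missing values per field and summarise sex balance."""
    missing = {f: 0 for f in required_fields}
    by_sex = {}
    for row in rows:
        for f in required_fields:
            if row.get(f) in (None, ""):
                missing[f] += 1
        by_sex[row.get("sex")] = by_sex.get(row.get("sex"), 0) + 1
    return missing, by_sex

# Hypothetical tabular dataset rows.
rows = [
    {"sex": "F", "age": 64, "result": 1},
    {"sex": "M", "age": None, "result": 0},
    {"sex": "F", "age": 71, "result": 1},
    {"sex": "M", "age": 58, "result": 0},
]
missing, by_sex = quality_report(rows, ["age", "result"])
print(missing, by_sex)
```

Reports like this, run on each dataset version, give the documented evidence of representativeness and completeness that the technical file requires.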

4. Verification and Validation Obligations

MDR/IVDR require verification and validation of software components. AIA further requires:

·       Transparent algorithm design;

·       Explainability of outputs;

·       Human oversight capabilities.

5. Documentation of Performance Testing

Performance-testing documentation should include:

·       Test plans and methodologies;

·       Dataset specifications;

·       Performance metrics and thresholds;

·       Identified issues and corrective actions.

These documents support both Notified Body assessments and post-market surveillance.
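
The metrics-and-thresholds items above can be made concrete with a small sketch. The example predictions and the acceptance thresholds below are illustrative assumptions; real acceptance criteria come from the device's own clinical evidence, not from the guidance text.

```python
def confusion_metrics(predictions, truths):
    """Sensitivity and specificity from binary predictions."""
    tp = sum(p == 1 and t == 1 for p, t in zip(predictions, truths))
    tn = sum(p == 0 and t == 0 for p, t in zip(predictions, truths))
    fp = sum(p == 1 and t == 0 for p, t in zip(predictions, truths))
    fn = sum(p == 0 and t == 1 for p, t in zip(predictions, truths))
    return {"sensitivity": tp / (tp + fn), "specificity": tn / (tn + fp)}

# Illustrative thresholds for a hypothetical diagnostic aid.
THRESHOLDS = {"sensitivity": 0.90, "specificity": 0.80}

preds  = [1, 1, 1, 0, 0, 0, 1, 0, 1, 1]
truths = [1, 1, 1, 0, 0, 1, 1, 0, 0, 1]
metrics = confusion_metrics(preds, truths)
# Any metric below its threshold is an identified issue requiring
# a corrective action, which must itself be documented.
failures = {m: v for m, v in metrics.items() if v < THRESHOLDS[m]}
```

Recording both the metric values and the thresholds they were tested against is what turns a test run into auditable evidence.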

Predetermined Change Control for Dynamic Systems

For adaptive or continuously learning systems, manufacturers must include a change control plan that outlines acceptable modifications. This affects whether re-assessment is needed under the AIA.
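
A predetermined change control plan can be thought of as a machine-checkable envelope of acceptable modifications. The plan fields and bounds in this sketch are invented for illustration and are not taken from MDCG 2025-6 or the AIA.

```python
# Hypothetical pre-approved change envelope.
CHANGE_PLAN = {
    "min_sensitivity": 0.92,          # updated model may not drop below this
    "max_training_data_shift": 0.20,  # max fraction of newly added training data
    "architecture_change_allowed": False,
}

def within_plan(update):
    """Return True if a proposed model update stays inside the
    pre-approved envelope; anything outside triggers re-assessment."""
    return (
        update["sensitivity"] >= CHANGE_PLAN["min_sensitivity"]
        and update["new_data_fraction"] <= CHANGE_PLAN["max_training_data_shift"]
        and (CHANGE_PLAN["architecture_change_allowed"]
             or not update["architecture_changed"])
    )

ok = within_plan({"sensitivity": 0.94, "new_data_fraction": 0.10,
                  "architecture_changed": False})
needs_review = not within_plan({"sensitivity": 0.94, "new_data_fraction": 0.35,
                                "architecture_changed": False})
```

An update that stays within the envelope can proceed under the approved plan; one that exceeds any bound falls outside it and prompts regulatory re-assessment.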

AI-based medical device manufacturers must recognize that performance testing is now a cornerstone of both technical and regulatory compliance.

Conformity Assessment Procedure

High-risk AI systems must undergo conformity assessment under both MDR/IVDR and AIA:

  • A single Notified Body may handle both assessments when possible.
  • While documentation may overlap, distinct criteria must be addressed.
  • Annex VII of the AIA outlines specific structure and content requirements for performance and risk reporting.

Clinical Evaluation (MDR) and Performance Evaluation (IVDR)

For AI-based medical devices, clinical evaluation must be conducted in accordance with Article 61 and Annex XIV of the MDR. This process is essential for demonstrating whether the device achieves its intended clinical benefit and maintains an acceptable level of safety for the patient.

For AI systems under the IVDR, a similar approach is taken through a performance evaluation process as defined in Annex XIII of the IVDR.

Key Considerations in Clinical Evaluation:

  • Clinical validity of algorithmic outputs: Are the AI-generated results consistent with real-world clinical decisions?
  • Quality and type of clinical data: Should be supported by observational studies, retrospective analyses, or prospective clinical investigations.
  • Impact of retraining or updates: When the AI model is updated, its clinical validity must be reassessed.
  • Interaction with human oversight: The degree to which human experts rely on or override the AI system must be clearly documented.

AIA’s Impact:

While the AIA does not directly regulate clinical evaluation, its requirements for accuracy, robustness, bias detection, and ethical compliance must be supported by clinical evidence. Therefore, AIA obligations should be integrated into the clinical evaluation framework to ensure full regulatory alignment.

Post-Market Surveillance and Updates

Both MDR/IVDR and AIA require continuous lifecycle monitoring:

  • Track and assess user feedback;
  • Evaluate impacts of software updates;
  • Monitor learning mechanisms in adaptive systems;
  • Apply predefined change plans to mitigate risk from unforeseen updates.
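
One way to picture this lifecycle monitoring is a rolling-window check of field performance against the pre-market baseline. The baseline figure, window size, and tolerance below are illustrative assumptions.

```python
from collections import deque

BASELINE_ACCURACY = 0.90   # assumed pre-market performance figure
ALERT_MARGIN = 0.05        # illustrative tolerance before escalation

window = deque(maxlen=100)  # most recent confirmed field outcomes

def record_outcome(correct: bool) -> bool:
    """Log one confirmed field outcome; return True if observed drift
    exceeds the tolerated margin and requires review."""
    window.append(correct)
    if len(window) < 20:    # wait for a minimal sample before judging
        return False
    accuracy = sum(window) / len(window)
    return accuracy < BASELINE_ACCURACY - ALERT_MARGIN

# Simulated feed where only 75% of outputs are correct.
alerts = [record_outcome(i % 4 != 0) for i in range(40)]
```

Once enough outcomes accumulate, sustained underperformance trips the alert, feeding the corrective-action and change-plan processes described above.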

Transparency, Ethical Use, and Training

AIA emphasizes user awareness and control:

  • Instructions for use must be clear and comprehensible;
  • System limitations should be transparently disclosed;
  • User training and decision-making explanations should be provided when necessary.

Timeline and Transitional Provisions

  • The AIA applies in phases: most obligations take effect on August 2, 2026, and the rules for high-risk AI systems embedded in products such as medical devices become fully applicable on August 2, 2027.
  • Manufacturers are advised to prepare their systems, documentation, and QMS for compliance well ahead of full enforcement.

Contact

Questions? Reach out!