Medical Devices and the EU AI Act: A Comprehensive Study

The intersection of medical devices and artificial intelligence (AI) is becoming increasingly significant in the healthcare sector. The EU AI Act provides a framework for how these technologies will coexist and be regulated. This study aims to explore how medical devices that integrate AI systems will be governed under the EU AI Act, highlighting key terms and implications for manufacturers and users.

When is a Medical Device Subject to the AI Act?

All medical devices that incorporate an AI system as defined in the AI Act are subject to its provisions when targeted at the EU market. The term “AI System” is broadly defined, potentially including systems that use big data for predictive analytics, even if they do not constitute what is traditionally considered artificial intelligence.

High-Risk and Limited-Risk AI Systems

Medical devices are categorized as either high-risk or limited-risk under the AI Act. High-risk devices face a stringent regulatory regime that requires notified body certification for both the AI system and the medical device itself.

There are two pathways for a medical device to qualify as a high-risk AI system (HRAIS). The first pathway applies where the device is subject to a notified body conformity assessment under the existing medical device regulations. In this case, the AI system must be either:

  • Intended as a safety component
  • A standalone product, such as software as a medical device (SaMD)

The second pathway includes devices defined as high-risk under Annex III of the AI Act, such as applications that monitor emotional states through facial recognition technology.

Where both pathways apply, a device may need to satisfy both the product-legislation HRAIS requirements and the Annex III obligations, although an Annex III system may be exempt if the provider documents an assessment that it does not pose a high risk.
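The two pathways above amount to a simple decision procedure. The sketch below is purely illustrative: the class, field names, and function are invented for this example and are informal shorthand, not AI Act terminology.

```python
from dataclasses import dataclass

@dataclass
class DeviceProfile:
    """Hypothetical summary of an AI-enabled device's regulatory status.
    Field names are illustrative labels, not AI Act definitions."""
    targets_eu_market: bool
    needs_notified_body_assessment: bool  # conformity assessment under MDR/IVDR
    ai_is_safety_component: bool
    ai_is_standalone_samd: bool           # software as a medical device
    annex_iii_use_case: bool              # e.g. emotion recognition via facial analysis

def qualifies_as_hrais(d: DeviceProfile) -> bool:
    """Sketch of the two high-risk pathways described above."""
    if not d.targets_eu_market:
        # The AI Act applies to devices targeted at the EU market
        return False
    # Pathway 1: notified-body assessment, with the AI system acting as a
    # safety component or constituting the product itself (SaMD)
    pathway_one = d.needs_notified_body_assessment and (
        d.ai_is_safety_component or d.ai_is_standalone_samd
    )
    # Pathway 2: the use case is listed in Annex III of the AI Act
    pathway_two = d.annex_iii_use_case
    return pathway_one or pathway_two
```

A device can qualify under either pathway independently, which is why a device exempt from notified body assessment can still be high-risk through its Annex III use case alone.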

Medical Device Regulation vs. AI Act

In the realm of medical devices, any software that has clinical applications directly impacting individuals is likely to be regulated under the EU medical device regulations. For high-risk AI systems, manufacturers must ensure compliance with both sets of legislation, requiring a unified declaration of conformity (DoC).

Compliance with the EU GDPR is also essential. Providers of HRAIS must declare their compliance with data protection regulations in their DoC, making non-compliance a potential risk to their device’s market validity.

Requirements for Limited-Risk Medical Devices

Manufacturers of limited-risk medical devices are exempt from the complexities of notified body regulation but must still adhere to additional provisions. Starting from February 2, 2025, manufacturers will need to implement training for staff regarding the operation and use of AI systems. They are also encouraged to comply with industry-specific codes of conduct that will be finalized by May 2, 2025.

Moreover, if a device interacts directly with individuals, the manufacturer must inform users that an AI system is involved unless it is evident. This information should be provided at the first point of interaction.

Obligations for Medical Devices Under Annex III

Devices classified as HRAIS under Annex III face specific obligations, including:

  • Informing individuals of the use of HRAIS when decisions affect them.
  • Reporting serious incidents, including breaches of EU-law obligations, to market surveillance authorities.
  • Documenting assessments if a device is deemed not high-risk and registering it in the EU database.
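Teams can track these duties as a simple checklist. The sketch below is illustrative only: the keys and descriptions are invented labels summarizing the bullets above, not AI Act article references.

```python
# Illustrative checklist of the Annex III duties listed above.
# Keys are informal labels, not AI Act article references.
ANNEX_III_DUTIES = {
    "inform_individuals": "Inform individuals when HRAIS use affects decisions about them",
    "report_incidents": "Report serious incidents to market surveillance authorities",
    "document_assessment": "Document any not-high-risk assessment and register in the EU database",
}

def outstanding_duties(completed: set[str]) -> list[str]:
    """Return the duties not yet marked complete, in checklist order."""
    return [key for key in ANNEX_III_DUTIES if key not in completed]
```

For example, a provider that has only set up incident reporting would still see the notification and documentation duties flagged as outstanding.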

Provider vs. Manufacturer

The AI Act’s “provider” broadly corresponds to the medical device regulations’ “manufacturer”: each term signifies the entity primarily responsible for compliance. However, the AI Act lacks an equivalent of the General Safety and Performance Requirements (GSPRs) found in the medical device regulations, creating a need for industry-specific guidance.

Quality Management Systems (QMS)

Medical device manufacturers are accustomed to maintaining a quality management system (QMS) that spans the entire lifecycle of their products. The AI Act allows QMS compliance for AI systems to be integrated with the existing medical device QMS, although the AI Act’s requirements are broader, extending to processes for regulatory compliance itself.

Manufacturers of HRAIS must consider detailed processes concerning data handling and system development as part of their QMS.

Clinical Investigations and Performance Studies

Under the EU medical device regulations, exceptions exist for devices used in authorized clinical investigations. However, the AI Act does not provide similar exceptions for HRAIS, limiting testing options to AI regulatory sandboxes established by competent authorities.

Preparing for Compliance

The combined regulatory burdens of the EU Medical Device Regulation (MDR) and In Vitro Diagnostic Regulation (IVDR) create additional complexities for manufacturers of HRAIS. A thorough mapping exercise is recommended to determine the applicability of the AI Act and to minimize additional compliance work.

As the regulatory landscape evolves, manufacturers should prepare for the additional costs and delays associated with bringing HRAIS to the EU market by the enforcement date of August 2, 2027.
