AI as a Medical Device: Regulatory Considerations
Artificial intelligence (AI) is revolutionizing the healthcare industry by introducing innovative solutions in diagnostics, treatment, and patient care. However, the rapid integration of AI into medical devices presents significant regulatory challenges.
Key Regulatory Frameworks
In Europe, the EU AI Act (Regulation (EU) 2024/1689) and the UK's Software and AI as a Medical Device Change Programme are key frameworks shaping the future of AI in healthcare. Additionally, the standard IEC 62304 ("Medical device software – Software life cycle processes") plays a crucial role in ensuring the safety and reliability of software used in medical devices.
The EU AI Act and Medical Devices
The AI Act is a legal framework created for the development, deployment, and use of AI within the European Union (EU). It provides legal certainty and ensures the protection of fundamental rights. The AI Act promotes the development and innovation of safe and trustworthy AI across both private and public sectors in the EU.
One innovative aspect of the AI Act is the encouragement of regulatory sandboxes that allow for a controlled environment for development, validation, and testing in real-world conditions.
Applicability of the AI Act
The AI Act applies to providers and deployers of AI systems within the EU, as well as to economic operators outside the EU that place AI systems on the EU market. It does not apply in the UK or US, so regulatory planning must check the specific AI legislation or rules of each target market.
Classification of AI Medical Devices
Under the AI Act, AI systems are classified into four risk levels:
- Unacceptable risk: prohibited (e.g., social scoring systems).
- High risk: most heavily regulated (e.g., biometrics, critical infrastructure, medical devices).
- Limited risk: subject to lighter transparency obligations (e.g., chatbots, deepfakes).
- Minimal risk: largely unregulated (e.g., AI-enabled video games).
The AI Act is horizontal legislation, meaning it applies in addition to sector-specific rules such as the EU Medical Device Regulation (MDR) 2017/745. Both regulations must be considered for medical devices that contain an AI function.
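As an illustration, the four-tier scheme above can be sketched as a first-pass triage helper. The tier names come from the AI Act itself; the device attributes and mapping logic here are simplified assumptions for illustration, not a substitute for legal assessment against Annex III of the Act and the MDR.

```python
from enum import Enum

class RiskTier(Enum):
    """Risk tiers defined by the EU AI Act."""
    UNACCEPTABLE = "prohibited"
    HIGH = "high risk"
    LIMITED = "limited risk"
    MINIMAL = "minimal risk"

def triage_ai_system(is_prohibited_practice: bool,
                     is_medical_device: bool,
                     interacts_with_humans: bool) -> RiskTier:
    """Hypothetical first-pass triage of an AI system's risk tier."""
    if is_prohibited_practice:      # e.g., social scoring
        return RiskTier.UNACCEPTABLE
    if is_medical_device:           # AI safety components of MDR devices
        return RiskTier.HIGH
    if interacts_with_humans:       # e.g., chatbots: transparency duties
        return RiskTier.LIMITED
    return RiskTier.MINIMAL         # e.g., AI-enabled video games

# A diagnostic AI function embedded in a medical device lands in the
# high-risk tier, triggering both AI Act and MDR obligations:
print(triage_ai_system(False, True, True).value)  # high risk
```

In practice the questions feeding such a triage are answered by a formal conformity assessment, but the ordering (prohibited practices first, then high-risk categories, then transparency cases) mirrors how the Act's tiers are checked.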
General Purpose AI (GPAI)
GPAI refers to AI systems based on general-purpose AI models that can serve a variety of purposes. The first General-Purpose AI Code of Practice will detail the AI Act rules for providers of these models, including documentation and compliance with the Copyright Directive.
UK Software and AI as a Medical Device Change Programme
The programme's roadmap, led by the MHRA, outlines the UK's strategy for fostering innovation while ensuring patient safety. Key elements include:
- Regulatory Sandboxes: Establishing environments for testing AI medical devices.
- Inclusive Innovation: Ensuring software functions effectively across diverse populations.
- Adaptive Regulatory Approach: Creating a flexible framework to adapt to rapid AI advancements.
- Collaboration with Industry: Working closely with stakeholders to develop guidelines.
IEC 62304 and Its Role
The IEC 62304 standard is critical for ensuring the safety and reliability of software in medical devices, including AI-driven systems. Key aspects include software safety classification (Classes A to C, based on the severity of harm a software failure could cause), lifecycle processes, and risk management. The standard is undergoing updates to improve its applicability to new health technologies.
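The safety classification in IEC 62304 can be sketched as a simple decision: Class A means no injury or damage to health is possible, Class B means non-serious injury is possible, and Class C means death or serious injury is possible. The class definitions below come from the standard; the helper function itself is an illustrative simplification (the standard also lets external risk control measures reduce the class).

```python
def iec62304_safety_class(failure_can_cause_harm: bool,
                          harm_can_be_serious: bool) -> str:
    """Illustrative mapping to IEC 62304 software safety classes:
    Class A - no injury or damage to health is possible,
    Class B - non-serious injury is possible,
    Class C - death or serious injury is possible."""
    if not failure_can_cause_harm:
        return "A"
    return "C" if harm_can_be_serious else "B"

# A hypothetical AI-driven dose-calculation module whose failure
# could cause serious injury is Class C, the most rigorous tier:
print(iec62304_safety_class(True, True))  # C
```

The class drives how much process rigor the standard demands: Class C requires the fullest set of lifecycle activities, including detailed design and unit-level verification.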
Linking Regulations and Standards
The regulations and standards above are interlinked, together addressing the regulatory and technical challenges of AI in medical devices. The landscape is evolving rapidly, and staying current with emerging standards is crucial for both compliance and innovation.
Conclusion
There is an opportunity and responsibility to shape the future of AI in medical devices. By prioritizing ethical practices, collaboration, and a focus on patient safety, stakeholders can ensure that AI technologies advance healthcare and earn the trust of those who depend on them.