With the introduction of the EU AI Act and the Medical Device Regulation (MDR), companies in the MedTech sector face new rules specifically targeting AI-enabled medical devices. Together, these regulations significantly affect how medical technologies are developed, classified, and brought to market. It is essential for companies to understand how the AI Act and the MDR interact and what this means for their product development and compliance.
The Risk-based Framework of the MDR and the AI Act
The MDR focuses on the safety and performance of medical devices and determines the type of conformity assessment required based on the risk class of the device. AI-enabled medical devices can range from simple software to complex systems that directly influence patient care. The AI Act follows a similar risk-based approach but places additional emphasis on monitoring the performance of AI systems and protecting fundamental rights, such as preventing bias and ensuring transparency. The MDR uses a classification system ranging from Class I (low risk) through Classes IIa and IIb to Class III (highest risk), while the AI Act employs four risk categories: unacceptable, high, limited, and minimal risk. In both cases, the higher the risk, the stricter the requirements and oversight.
Classification and Conformity Assessment: Dual Rules for AI-enabled Medical Devices
When an AI system is integrated into a medical device, such as software for medical imaging or diagnostic tools, companies must ensure that their product complies with the requirements of both the MDR and the AI Act. They must carefully determine whether the product is classified as a medical device under the MDR and how the AI system is assessed and classified under the AI Act. Depending on its intended use and risk assessment, each product may therefore follow a different route for conformity assessment, leading to potential dual obligations and additional administrative burden. In addition, it must be determined whether the product is a ‘high-risk’ AI system, which brings extra requirements for monitoring, transparency, and post-market surveillance. Importantly, under the AI Act’s classification rules for high-risk AI systems, an AI system that is (or is a safety component of) a device covered by the MDR and subject to third-party conformity assessment is classified as a ‘high-risk’ AI system.
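To make this interaction concrete, the sketch below encodes the decision rule in a deliberately simplified form. It is a hypothetical illustration only: the `DeviceProfile` fields, the notified-body shortcut, and the class handling are assumptions for readability, not a legal decision tool, and real classification always requires a full regulatory assessment.

```python
from dataclasses import dataclass

# Hypothetical, simplified sketch of the dual-classification logic described
# above. Field names and rules are illustrative assumptions only.

@dataclass
class DeviceProfile:
    mdr_class: str                 # MDR risk class: "I", "IIa", "IIb" or "III"
    is_sterile_or_measuring: bool  # Class I special cases needing a notified body
    ai_is_safety_component: bool   # the AI performs or supports a medical function

def needs_notified_body(profile: DeviceProfile) -> bool:
    """MDR conformity assessment with notified-body involvement (simplified)."""
    if profile.mdr_class != "I":
        return True
    return profile.is_sterile_or_measuring

def is_high_risk_ai(profile: DeviceProfile) -> bool:
    """Simplified reading of AI Act Art. 6(1): an AI system that is (part of)
    an MDR-covered device requiring third-party conformity assessment is
    classified as high-risk."""
    return profile.ai_is_safety_component and needs_notified_body(profile)

# Example: AI-based diagnostic imaging software, typically Class IIa or higher
imaging_ai = DeviceProfile(mdr_class="IIa", is_sterile_or_measuring=False,
                           ai_is_safety_component=True)
print(is_high_risk_ai(imaging_ai))  # True -> dual MDR + AI Act high-risk obligations
```

Even in this toy form, the sketch shows why the two frameworks cannot be handled in isolation: the MDR classification directly feeds the AI Act classification.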
Ethical and Legal Implications of AI in Medical Devices
Companies must not only ensure technical compliance with both regulations but also address the ethical and legal implications of using AI in medical devices. Specific considerations include:
- Data Integrity: AI systems must be developed with data that is representative, accurate, and free of bias (a minimal representativeness check is sketched after this list).
- Transparency: It is crucial that AI systems are transparent, allowing healthcare providers and patients to understand how decisions are being made.
- Post-market Surveillance: After the launch of an AI system, it is essential to continue monitoring and adjusting the system to ensure safety and effectiveness.
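The data-integrity point lends itself to simple automated checks. The sketch below, using hypothetical attribute names and an assumed threshold, illustrates one way to flag underrepresented subgroups in a training set before model development; it is an assumption-laden example, not a prescribed method.

```python
from collections import Counter

# Hypothetical check of dataset representativeness across a demographic
# attribute (e.g. sex or age group). Attribute names and the 10% threshold
# are illustrative assumptions, not regulatory requirements.

def subgroup_shares(records: list[dict], attribute: str) -> dict[str, float]:
    """Return the share of each subgroup for the given attribute."""
    counts = Counter(r[attribute] for r in records)
    total = sum(counts.values())
    return {group: n / total for group, n in counts.items()}

def flag_underrepresented(shares: dict[str, float], min_share: float = 0.10) -> list[str]:
    """Flag subgroups whose share falls below an agreed minimum."""
    return [group for group, share in shares.items() if share < min_share]

# Example with a toy training set
training_data = [{"sex": "F"}] * 120 + [{"sex": "M"}] * 870 + [{"sex": "other"}] * 10
shares = subgroup_shares(training_data, "sex")
print(shares)                         # {'F': 0.12, 'M': 0.87, 'other': 0.01}
print(flag_underrepresented(shares))  # ['other'] -> investigate before training
```

In practice, such checks would be embedded in the data management procedures of the quality system, with thresholds and attributes defined per intended use.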
Integrating the AI Act and MDR into Quality Management Systems
The overlap between the AI Act and the MDR means that companies must adapt their existing quality management systems to integrate the requirements of both regulations. This provides an opportunity to streamline compliance processes. Key components of this process include:
- Risk Management: Integrating both MDR and AI Act requirements into the risk management system.
- Documentation: Developing robust technical documentation that meets both regulations.
- Performance Monitoring: Implementing systematic controls and updates in the post-market process (see the monitoring sketch after this list).
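As an illustration of the performance-monitoring point, the sketch below compares a rolling field metric against a documented baseline and raises an alert on degradation. The metric name, baseline value, and tolerance are assumed for illustration; an actual post-market surveillance plan defines these in the technical documentation.

```python
from statistics import mean

# Hypothetical post-market performance monitor: compares a rolling metric
# (e.g. sensitivity on confirmed cases) against the level documented at
# release. Metric names, baseline and tolerance are illustrative assumptions.

def rolling_metric(observations: list[float], window: int = 50) -> float:
    """Average of the most recent `window` observations (1.0 = correct)."""
    recent = observations[-window:]
    return mean(recent) if recent else float("nan")

def performance_alert(current: float, baseline: float, tolerance: float = 0.05) -> bool:
    """True when performance drops more than `tolerance` below the baseline."""
    return current < baseline - tolerance

baseline_sensitivity = 0.92              # value recorded in the technical documentation
field_results = [1.0] * 40 + [0.0] * 10  # 40 correct, 10 missed in the last window
current = rolling_metric(field_results)
if performance_alert(current, baseline_sensitivity):
    print(f"PMS alert: sensitivity {current:.2f} below baseline {baseline_sensitivity:.2f}")
```

An alert like this would typically trigger the existing CAPA and vigilance processes in the quality management system, which is exactly where the AI Act and MDR requirements can be handled together.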
Holland Innovative: Navigating the Complex Regulations
At Holland Innovative, we understand that complying with both the MDR and the AI Act can be challenging for MedTech companies, especially when AI systems play a central role in product development. We offer support in:
- Guiding companies in classifying their AI-enabled medical devices and determining the correct risk classification.
- Helping integrate AI Act requirements into existing MDR compliance processes.
- Providing advice on adhering to ethical standards and effectively managing data in AI systems.
With the right knowledge and guidance, companies can ensure they meet the complex requirements of both the MDR and the AI Act, while continuing to innovate in medical technology. Holland Innovative is ready to guide companies through these regulations, helping them not only comply with the law but also contribute to the reliability and ethics of AI-enabled medical devices.