European Doctors call on the EU to maintain medical devices within the scope of the AI Act
CPME President Dr Ole Johan Bakke said “Essential safeguards included in the AI Act are in danger of being removed, such as human oversight of high-risk AI systems used in medical devices. This could erode the trust of doctors, which may hinder the uptake of AI in healthcare.
“We must also ensure that certified medical devices address AI-specific risks, such as those related to machine learning, AI agents, bias, and model performance degradation, among many others. To trust the products used when providing care to patients, we need the AI Act to continue to be applied as a complement to the Medical Devices Regulation (MDR).”
Prof. Dr Christian Lovis, CPME rapporteur on AI, said “AI tools, such as large language models, can be constantly trained, tuned and influenced by prompting. When acting as autonomous agents, they can influence care processes on their own. This is different from the production of a drug or the industrial manufacture of a medical device, which is deterministic and reproducible. That is why the requirements for high-risk AI systems must continue to apply to medical devices and in vitro diagnostic medical devices.”
CPME Vice President Dr Jacqueline Rossant-Lumbroso added “The AI Act improves clarity and consistency of AI-specific expectations. It explicitly delineates what notified bodies should look at to certify a high-risk AI device, ensuring a harmonised and safer approach. The AI Act brings legal certainty not only for manufacturers, but also for deployers, such as healthcare providers, who bear responsibilities for the use of high-risk AI applications.
“The AI Act will ensure consistent application across the EU, avoiding fragmented national legislation in this area. The requirements for high-risk AI integrate into the conformity assessment of medical devices, and guidance[1] should continue to be published on the interplay between the AI Act and the Medical Devices Regulation, supporting implementation while maintaining safety standards.”
CPME co-signs a joint open letter with HOPE calling for a safe, transparent, and accountable AI for Medical Devices and In Vitro Medical Devices.
The Standing Committee of European Doctors (CPME) represents national medical associations across Europe. We are committed to contributing the medical profession’s point of view to EU and European policy-making through proactive cooperation on a wide range of health and healthcare-related issues.
[1] See in this sense the guidance endorsed by the Artificial Intelligence Board (AIB) and the Medical Device Coordination Group (MDCG) here: https://health.ec.europa.eu/document/download/b78a17d7-e3cd-4943-851d-e02a2f22bbb4_en.