EU AI Act: Key Implications for Healthcare and MedTech

A new paper titled “The AI Act: responsibilities and obligations for healthcare professionals and organizations,” published in Diagnostic and Interventional Radiology, offers a detailed analysis of the European Union’s Artificial Intelligence Act (AI Act), which entered into force on 1 August 2024. The paper is co-authored by Leon Doorn, co-founder and CEO of MedQAIR, along with Kicky van Leeuwen (Romion Health) and Erik Gelderblom (Radboudumc).

The paper explains that the AI Act establishes the world’s first comprehensive set of rules for AI, aimed at ensuring the safe and ethical use of AI systems across sectors, with significant implications for healthcare. In practice, most AI-enabled medical devices will be classed as “high-risk”, triggering strict new requirements for design, testing, transparency, and oversight.

Unlike existing regulations such as the Medical Device Regulation (MDR) and the In Vitro Diagnostic Regulation (IVDR), which focus primarily on manufacturers, the AI Act extends responsibilities beyond manufacturers (“providers”) to deployers of AI systems: all healthcare organisations and professionals, such as hospitals and clinics, that put AI-enabled medical devices into use.

Responsibilities for Healthcare Deployers

A distinct focus of the paper is the novel obligations placed on healthcare deployers: the organisations and professionals who implement AI systems in clinical settings.

Key responsibilities include:

  • AI Literacy and Training: Healthcare staff must be educated about AI capabilities, limitations, and risks to interpret AI outputs correctly and identify potential malfunctions or biases. (Article 4)
  • Logging and Record-Keeping: Healthcare organisations are responsible for maintaining detailed logs of AI system activity for at least six months to facilitate audits, traceability, and incident investigations; see the illustrative sketch after this list. (Article 26(6))
  • Human Oversight: The Act requires clear procedures enabling healthcare professionals to maintain control over AI decisions, including the ability to override or disregard AI outputs when necessary to protect patient safety. (Articles 26(2), 26(5))
  • Data Quality Assurance: Deployers must ensure that input data meets the AI system’s specifications, as poor data quality can lead to inaccurate or unsafe AI recommendations. (Article 26(4))
  • Transparency to Users: Both clinicians and patients must be informed when AI is involved in diagnosis or treatment decisions, with clear communication about the system’s role and limitations. (Article 50)
  • Management of In-House AI Systems: For AI tools developed and used exclusively within healthcare organisations, the paper notes possible exemptions from third-party conformity assessments, but stresses that such systems must still adhere to high-risk AI principles, including risk and quality management.
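
To make the logging obligation above more concrete, here is a minimal sketch of how a deployer might structure its own usage records alongside the logs the AI system generates automatically. It is purely illustrative and not drawn from the paper or the Act: the schema, field names, file name, and system name are assumptions, since the Act does not prescribe a log format, only that logs under the deployer’s control be retained for at least six months (Article 26(6)).

    # Illustrative sketch only: the AI Act does not mandate this schema.
    # Field names and values are hypothetical examples of information a
    # hospital might keep to support audits, traceability, and incident
    # investigations.
    import json
    from dataclasses import dataclass, asdict
    from datetime import datetime, timezone


    @dataclass
    class AIUsageRecord:
        """One deployer-side record of an AI-assisted decision (hypothetical schema)."""
        timestamp_utc: str          # when the AI output was produced
        system_name: str            # deployed AI system
        system_version: str         # version in clinical use
        case_reference: str         # internal (pseudonymised) case identifier
        input_check: str            # note on input data quality checks performed
        ai_output: str              # what the system recommended or reported
        reviewing_clinician: str    # who exercised human oversight
        output_overridden: bool     # whether the clinician overrode the AI output


    def log_ai_usage(record: AIUsageRecord, path: str = "ai_usage_log.jsonl") -> None:
        """Append the record as one JSON line; files would be retained for at least six months."""
        with open(path, "a", encoding="utf-8") as f:
            f.write(json.dumps(asdict(record)) + "\n")


    if __name__ == "__main__":
        log_ai_usage(AIUsageRecord(
            timestamp_utc=datetime.now(timezone.utc).isoformat(),
            system_name="ExampleChestCADe",
            system_version="2.1.0",
            case_reference="CASE-2024-00042",
            input_check="DICOM study met vendor input specifications",
            ai_output="Nodule flagged in right upper lobe",
            reviewing_clinician="radiologist-017",
            output_overridden=False,
        ))

A JSON-lines file is used here only because it is simple to append to and review; in practice, a hospital would more likely rely on the logging and retention facilities of its existing clinical IT systems.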

What About Administrative AI and General-Purpose Models?

AI tools used purely for administrative tasks such as note-taking, summarisation, or report generation are treated differently. If they have no direct medical purpose, they are generally considered minimal-risk under the AI Act, and deployers such as hospitals face no specific obligations for them.

However, providers of general-purpose AI systems (such as those built on large language models) must still meet requirements related to effectiveness, reliability, transparency, and model evaluation, as set out in Article 50 of the AI Act.

Conclusion

This insightful commentary provides a timely and practical guide for healthcare professionals and organisations navigating the new regulatory landscape under the EU AI Act.

The paper emphasises that the AI Act reshapes accountability by extending obligations to both AI providers and healthcare deployers. It calls for collaboration between these parties to ensure safe, ethical, and effective AI use in clinical practice, balancing innovation with patient protection.

Adapting to the AI Act need not be done alone. MedQAIR’s regulatory team (including experts like Leon Doorn) can help interpret these requirements and update your processes. We welcome you to connect with us for guidance on AI Act preparation and other evolving EU regulations.
