
The EU’s AI Act And Medical Devices – What We Know So Far
AI is reshaping the compliance burden for medical device companies in the EU. Thankfully, for makers of medical devices containing AI, the Medical Device Coordination Group (MDCG) has released a crucial piece of guidance material.
MDCG 2025-6 outlines the interplay between the EU’s Medical Device Regulation (MDR), In Vitro Diagnostic Medical Device Regulation (IVDR), and the Artificial Intelligence Act (AI Act). It comes just in time: the full application of the AI Act to high-risk systems, including medical devices, takes effect on August 2, 2026.
We sat down with Dawn Technology (formerly Peercode) regulatory consultant Marijn Maas to unpack the implications of this landmark guidance and what vendors of medical device software can do to prepare.
The current state of play for AI and medtech
With less than a year to go, manufacturers of medical devices were in need of further clarity around AI and a clear path to compliance. Now they’ve got it. As Marijn notes, “One of the larger medtech stakeholder groups recently called for greater alignment between MDR and the AI Act. Many of the questions they raised are addressed by MDCG 2025-6.”
The FAQ on Interplay between the Medical Devices Regulation and In vitro Diagnostic Medical Devices Regulation and the Artificial Intelligence Act (June 2025) is the first joint guidance from the MDCG and the newly established Artificial Intelligence Board (AIB). It sets out foundational answers on how the AI Act and the MDR intersect.
When does the AI Act apply to medical devices?
The AI Act is partially in force already, applying to general-purpose AI (GPAI) platforms such as ChatGPT and other foundation models. But for high-risk AI systems, like those used in medical devices, the real regulatory impact begins from August 2, 2026. “From that date,” Marijn explains, “Software as a medical device and medical devices with an AI (safety) component will need to comply with both the MDR and the AI Act.”
But here’s the thing: at the time of writing, there are no accredited Notified Bodies to certify AI systems under the AI Act. On top of that, existing Notified Bodies may not be seeking AI designation. This looks likely to create an application bottleneck similar to the one seen in the early days of the MDR.
5 key takeaways from MDCG 2025-6
So, what does the new guidance actually say? Marijn outlines five key points:
1. Clarification of scope
The AI Act applies only to AI systems used in medical devices that fall under Notified Body surveillance (i.e. Class IIa and above). Class I SaMDs with AI components are excluded. Also exempt are tools developed in-house and used solely within a healthcare institution.
2. Quality and data management
The AI Act imposes obligations around dataset completeness, representativeness, and bias mitigation. “One of the biggest messages here is that a poor dataset isn’t just a performance issue – it’s a compliance risk,” says Marijn.
Manufacturers must integrate new processes for training, validating, and testing AI models, ensuring robust data governance and traceability. These can be embedded into the existing ISO 13485 Quality Management System (QMS) rather than requiring a separate AI management system.
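To make the data-governance point concrete, here is a minimal, illustrative sketch of the kind of subgroup representativeness check a manufacturer might fold into an existing validation step. The age bands, reference shares, and 20% tolerance are assumptions chosen for illustration; MDCG 2025-6 prescribes no specific metrics or thresholds.

```python
# Illustrative sketch only: checking subgroup representativeness of a
# training dataset against the intended patient population. All column
# names, subgroups, and thresholds here are hypothetical.
import pandas as pd

# Hypothetical training records, one demographic attribute per record.
train = pd.DataFrame({
    "age_group": ["18-40"] * 120 + ["41-65"] * 300 + ["65+"] * 80,
})

# Assumed reference distribution for the device's intended population.
reference = {"18-40": 0.25, "41-65": 0.45, "65+": 0.30}

TOLERANCE = 0.20  # flag subgroups more than 20% off their expected share

observed = train["age_group"].value_counts(normalize=True)
for group, expected in reference.items():
    share = observed.get(group, 0.0)
    deviation = abs(share - expected) / expected
    status = "OK" if deviation <= TOLERANCE else "REVIEW: skewed"
    print(f"{group}: observed {share:.1%}, expected {expected:.1%} -> {status}")
```

The point is less the arithmetic than the paper trail: a check like this, versioned alongside the dataset, gives the QMS the traceability the AI Act asks for.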
3. Transparency and human oversight
Manufacturers must ensure that the end users (e.g. clinicians and patients) are fully aware they’re interacting with AI, understand how it works, and know its limitations. “Transparency is no longer optional,” Marijn emphasizes. “It’s a core requirement.”
Manufacturers must also test for interpretability, usability, and AI literacy, educating users on the AI model’s capabilities and risks.
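What might that look like in practice? Below is a hypothetical sketch of a machine-readable transparency summary that could sit alongside a device’s instructions for use. Neither the AI Act nor MDCG 2025-6 prescribes this format; every field name and value here is an assumption.

```python
# Hypothetical transparency summary for an AI feature. The structure and
# wording are illustrative assumptions, not a prescribed format.
transparency_summary = {
    "ai_disclosure": "This feature uses a machine-learning model.",
    "intended_purpose": "Flags chest X-rays for radiologist review; it does "
                        "not make autonomous diagnoses.",
    "how_it_works": "Model trained on labelled X-rays; outputs a suspicion "
                    "score between 0 and 1.",
    "known_limitations": [
        "Not validated for patients under 18.",
        "Performance degrades on portable/bedside acquisitions.",
    ],
    "human_oversight": "A qualified clinician reviews every flagged case.",
}

for field, value in transparency_summary.items():
    print(f"{field}: {value}")
```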
4. Postmarket monitoring of AI behavior
Dedicated postmarket monitoring guidance, potentially including a template, is expected in early 2026. What we currently know is that the AI Act expects you to detect ‘learning drift’ and log AI behavior in detail, so you can investigate and adjust as needed.
Manufacturers will need to implement a robust monitoring system to ensure their AI systems continue to operate within predefined boundaries. If the AI attempts to move beyond those boundaries, a decision must be made.
Marijn explains it using an analogy: “Your AI is like a farm animal within a defined fenced field. It might notice a food source just outside the fence and want to go there. You need an adequate system for detecting when this happens. You then have to make a decision: keep the fence in place or change the boundary?”
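For those wondering what ‘detecting when this happens’ could involve technically, here is a minimal sketch of one common drift check: comparing the live distribution of model outputs against a validation-time baseline using the Population Stability Index (PSI). The bin count, the simulated score distributions, and the 0.1/0.25 thresholds are industry rules of thumb and assumptions, not figures from the AI Act or MDCG 2025-6.

```python
# Illustrative 'learning drift' monitor: PSI between baseline and live
# model-output distributions. Thresholds and data here are assumptions.
import numpy as np

def psi(baseline: np.ndarray, live: np.ndarray, bins: int = 10) -> float:
    """Population Stability Index between two score distributions."""
    edges = np.histogram_bin_edges(baseline, bins=bins)
    base_counts, _ = np.histogram(baseline, bins=edges)
    live_counts, _ = np.histogram(live, bins=edges)
    # Proportions per bin, floored to avoid division by zero / log(0).
    base_p = np.clip(base_counts / base_counts.sum(), 1e-6, None)
    live_p = np.clip(live_counts / live_counts.sum(), 1e-6, None)
    return float(np.sum((live_p - base_p) * np.log(live_p / base_p)))

rng = np.random.default_rng(0)
baseline_scores = rng.beta(2.0, 5.0, 5000)  # scores seen at validation
live_scores = rng.beta(2.6, 4.4, 5000)      # scores observed in the field

score = psi(baseline_scores, live_scores)
if score > 0.25:   # rule-of-thumb threshold, not a regulatory limit
    print(f"PSI={score:.3f}: significant drift - log it, investigate, decide")
elif score > 0.10:
    print(f"PSI={score:.3f}: moderate drift - investigate")
else:
    print(f"PSI={score:.3f}: stable - still inside the fence")
```

In Marijn’s terms, a check like this only tells you the animal is nosing at the fence; deciding whether to hold the boundary or move it remains a documented, human call.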
5. A single QMS for both the MDR and AI Act
Some of the best news to come from the new guidance? A welcome simplification: you won’t need two separate quality systems. “You’ll be able to add AI-specific requirements into your existing ISO 13485 QMS,” Marijn confirms, “which will make life that bit easier.”
Notified Body scarcity and certification timing
Despite delivering much-needed clarity, the new guidance leaves some practical challenges unresolved. Marijn warns of a ‘chicken-and-egg’ scenario: even if you’re fully compliant by 2026, certification may be held up by the lack of Notified Bodies available to certify you under the AI Act.
He also identifies another potential hurdle: “If you’re already working with a Notified Body as part of MDR compliance, there’s no guarantee they’ll get accredited for AI certification.”
The short-term answer? Proactively engage with your current Notified Body to determine their plans for AI Act designation, rather than risking the added cost and complexity of working with two separate Notified Bodies.
The compliance window is tight – act now!
While 2026 might seem a long way off, Marijn makes an important point: “In regulatory terms, one year is not a lot of time.” Preparing your documentation, updating your QMS, conducting risk analyses, and ensuring traceability of your AI system can be a lengthy process.
Something to note: AI-powered medical devices on the market before August 2, 2026, are not required to retrofit AI Act compliance unless they undergo significant changes. But any new units placed on the market from that date must meet the AI Act’s standards.
How can Dawn Technology help?
Here at Dawn Technology, Marijn and the rest of our team offer hands-on, expert support to get you ready for the MDR, IVDR, and the AI Act:
- Gap assessments to identify missing elements in your QMS
- Development of SOPs and AI-specific risk analysis
- Guidance on AI dataset governance, bias mitigation, and model validation
- Strategic support for postmarket monitoring plans
- Ongoing regulatory intelligence tracking, so you stay aligned with future updates
“We're all learning as we go,” says Marijn. “But early engagement means you’re not scrambling when the deadlines hit. Delegating your AI compliance workload lets you focus on innovation.”
Do prepare and don’t panic
The AI Act represents a turning point for AI in the medical device space, introducing challenges, yes, but also much-needed structure and safety guardrails. By understanding the requirements now and adapting early, medical device manufacturers can gain a strategic advantage. Dawn Technology Regulatory Consultancy is here to help you stay ahead of the curve. Book a consultation with one of our experts today.
Read the full guidance from the Medical Device Coordination Group (MDCG) and the AI Board here