
How Will AI And Machine Learning Impact Medical Device Compliance?

07-06-2023

Regulations are always playing catch-up, especially when it comes to new technology. Take Artificial Intelligence (AI) and Machine Learning (ML), for example. It’s a rapidly evolving field, one that’s moving so fast that regulators face a real challenge to keep up. In fact, an EU committee has only recently adopted a draft negotiating mandate for the new AI Act – and that’s just one step in the process. There’s still a long way to go.

For the medical device industry, AI presents two key challenges. First, how to use AI writing tools to create regulatory documents. Second, how to implement AI safely and effectively in ‘software as a medical device’ (SaMD) or as part of a physical device.

Here’s our stance on the current state of play for AI in medical device compliance, and how to use it ethically, safely, and productively. 

AI assistance for regulatory documentation 

AI writing tools have been on the market for some time now – largely designed for correcting grammar and improving clarity. Across many industries, AI chatbots are used to assist human writers, from producing the first draft of a document to summarizing complex articles.

A Notified Body needs the documentation and technical dossier required under the EU’s Medical Device Regulation (MDR) to be clear and understandable. If you’re not careful, that documentation can easily become overly technical and difficult to read. It must strike the right balance between technical accuracy and clarity.

An AI chatbot such as OpenAI’s ChatGPT can support the writing process, helping to summarize and simplify key information. It can act as an editor – checking your text to ensure it’s readable. And it can provide inspiration – offering suggestions to help you overcome writer’s block. It also has value as a research tool – like a search engine – helping to explain unfamiliar terminology or concepts. 
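
To make that concrete, here’s a minimal sketch of what such a summarization step might look like, using OpenAI’s Python library (the pre-1.0 interface available at the time of writing). The model name, prompt, and passage are illustrative assumptions – and the output is a draft only, which a human reviewer must verify against the source material.

```python
# A minimal sketch of using OpenAI's Python library to summarize a passage of
# technical documentation. Model, prompt, and passage are illustrative; a human
# must always review the output before it goes near a technical dossier.
import openai

openai.api_key = "YOUR_API_KEY"  # never hard-code real keys in source

passage = (
    "The device employs a convolutional neural network to flag regions of "
    "interest in dermoscopic images for review by a qualified clinician."
)

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": "You simplify technical text for regulatory reviewers."},
        {"role": "user", "content": f"Summarize in plain language:\n{passage}"},
    ],
)

print(response.choices[0].message.content)  # draft only -- a human must verify it
```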

AI-generated documentation is possible, but the output should always be edited by a human. That’s especially true for the complex regulatory documentation needed for a medical device. An AI chatbot can serve up plausible-sounding copy, but it may not be factually accurate and may miss important details relevant to your device. It might also overlook subtle nuances that vary from project to project and person to person, such as the definition of ‘clinical relevance’.

The bottom line is this: don’t simply follow AI and abandon your processes. It’s a bit like using satellite navigation. Use the information to help guide you but rely on your own common sense and intelligence for key decision-making. And don’t forget, for some key information elements, you will need to be able to identify the actual data source!

Implementation of AI in software as a medical device

In the EU, when an AI or ML component inside a piece of software has a medical purpose, it becomes subject to the MDR. So a crossover between the MDR and the newly endorsed transparency and risk-management rules for AI systems (EU AI Act) is likely.

Implementing AI within software as a medical device (SaMD) is no easy feat. Under the EU’s MDR, any update to software must be properly documented to ensure there are no new risks, and that new functionality works as intended. 

AI is classified under several subsets, including ML – where a computer learns from a dataset – and continuously learning systems, where the AI keeps updating itself. Any AI inside software as a medical device that learns from the behavior of users will be constantly updating, learning, and refining its algorithms. Following existing guidance, that would require constant updates to the documentation – which is not feasible. But left unchecked, this level of freedom could allow the AI to rapidly get out of control, and your device out of compliance.

So there’s a discrepancy between the EU MDR and how AI and ML function (and add value to a product). One solution is to lock down the algorithm for a specific release. That means the algorithm stays the same – it doesn’t learn new things – and the vendor remains in control of its output. However, that defeats part of the purpose of having AI in a marketed product: the possibility of continuously improving the healthcare provided to the patient. Such improvements can then only be introduced via formal changes, which are often subject to notification and review by the overseeing agencies.
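
For illustration, here’s a minimal sketch of one common engineering pattern for such a lock-down: record a cryptographic hash of the serialized model at the validated release and refuse to load anything that doesn’t match. The file name and hash are placeholders – the MDR doesn’t prescribe this particular mechanism.

```python
# A minimal sketch of one way to 'lock' a model for a specific release:
# record a cryptographic hash of the serialized model at release time and
# refuse to load anything that does not match. Names and hash are illustrative.
import hashlib
from pathlib import Path

RELEASE_MODEL = Path("model_v1.2.0.bin")
# Hash recorded in the release documentation during verification/validation.
EXPECTED_SHA256 = "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b"

def load_locked_model(path: Path) -> bytes:
    """Return the model bytes only if they match the validated release."""
    data = path.read_bytes()
    digest = hashlib.sha256(data).hexdigest()
    if digest != EXPECTED_SHA256:
        raise RuntimeError(
            f"Model file {path} does not match the validated release "
            f"(expected {EXPECTED_SHA256}, got {digest})."
        )
    return data

model_bytes = load_locked_model(RELEASE_MODEL)  # deserialize with your ML framework
```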

There are already a handful of devices on the EU market with an ML component. By comparison, the US Food and Drug Administration (FDA) has approved, authorized, or cleared over 500 AI/ML medical devices. This covers both standalone software and software as a component of medical hardware.

What your medical device company can do right now

The EU’s incoming AI Act is still some way off. So what can your medical device company do to prepare?  

First, you need to keep up to date with the latest guidance on AI in medical devices. It’s highly likely that the AI Act will emphasize clear user controls and user awareness of how to operate the software properly. This overlaps with the MDR, which places a strong emphasis on user instructions and guidance documents.

Second, compliance with the MDR already covers certain aspects of AI. Think of your risk management systems, verification and validation, clinical evaluation, and/or post-market surveillance activities. Use the MDR to your advantage: getting your documentation right under the MDR will almost certainly pay off when the AI Act takes effect.

Third, post-market surveillance can help you schedule updates and performance checks on the AI, as you would with any other software component. This helps you stay in control of how it behaves. 
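
As a sketch of what such a scheduled check might look like, the snippet below compares a model’s accuracy on recent, labelled field data against the level established during validation and raises a flag when performance drifts. The threshold, margin, and alerting behavior are illustrative assumptions.

```python
# A minimal sketch of a scheduled post-market performance check: compare the
# model's accuracy on recent, labelled field data against the threshold
# established during validation, and flag any degradation for review.
# Thresholds and alerting are illustrative placeholders.
from dataclasses import dataclass

VALIDATED_ACCURACY = 0.92   # performance claim from the technical documentation
ALERT_MARGIN = 0.02         # how much drift triggers an investigation

@dataclass
class CheckResult:
    accuracy: float
    within_spec: bool

def periodic_performance_check(predictions: list[int], labels: list[int]) -> CheckResult:
    correct = sum(p == y for p, y in zip(predictions, labels))
    accuracy = correct / len(labels)
    within_spec = accuracy >= VALIDATED_ACCURACY - ALERT_MARGIN
    if not within_spec:
        # In a real system this would open an investigation/CAPA record.
        print(f"ALERT: accuracy {accuracy:.3f} below validated level {VALIDATED_ACCURACY}")
    return CheckResult(accuracy, within_spec)
```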

Finally, because AI relies on large amounts of data, the new AI Act is also likely to overlap with the General Data Protection Regulation (GDPR). So, get up to speed on data protection and know where your gaps are.

The US FDA leads the way

As mentioned, the US FDA is already ahead of the game when it comes to accepting and regulating AI/ML in medical devices. One key takeaway from its current guidance on Good Machine Learning Practice (GMLP) is to keep your training dataset and your testing dataset separate for validation purposes, and to document how GMLP is followed during development. Also, if you let an AI keep learning, you need to control the boundaries, keep validating the risks, and ensure your dataset represents the intended user population.
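
To illustrate the dataset-separation point, here’s a minimal sketch using scikit-learn’s train_test_split. The file and column names are assumptions; the key ideas are a fixed, documented split and stratification, so the held-out test set stays untouched and reflects the intended user population.

```python
# A minimal sketch of the GMLP point above: keep training and test data
# strictly separated, and stratify the split so the held-out set reflects
# the intended user population (the 'subgroup' column is an assumption).
import pandas as pd
from sklearn.model_selection import train_test_split

data = pd.read_csv("clinical_dataset.csv")  # illustrative file name

train_df, test_df = train_test_split(
    data,
    test_size=0.2,
    stratify=data["subgroup"],  # e.g. age band, sex, or skin type
    random_state=42,            # fixed seed so the split is reproducible/documentable
)

# The test set is written out once and never used for training or tuning.
test_df.to_csv("holdout_test_set.csv", index=False)
```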

The US FDA has also proposed a new regulatory framework for the modification of AI/ML-based SaMD. It proposes that manufacturers deliver a ‘predetermined change control plan’ prior to release to the market. This plan pre-defines the types of future device modifications and describes the methodology and validation that will be implemented to ensure continued safety and effectiveness. It also requires an impact assessment for such planned modifications. Right now, this is the strongest indication of what to expect from upcoming AI regulations, including for the EU market.
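
As a thought experiment, a single entry in such a plan could be captured in a structured, machine-readable form, as sketched below. The field names loosely mirror the FDA proposal (pre-specified modification types, validation methodology, and an impact assessment); the concrete values are purely illustrative.

```python
# A minimal sketch of how one 'predetermined change control plan' entry might
# be captured in structured form. Fields loosely mirror the FDA proposal;
# the concrete values (protocol IDs, thresholds) are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class PlannedModification:
    change_type: str          # what may change after release
    method: str               # how the change will be made and verified
    acceptance_criteria: str  # pre-defined pass/fail threshold
    impact_assessment: str    # summary of the risk/impact analysis

plan = [
    PlannedModification(
        change_type="Retrain classifier on newly collected, labelled field data",
        method="Retrain with locked architecture; re-run full validation protocol VP-014",
        acceptance_criteria="Sensitivity >= 0.95 and specificity >= 0.90 on the holdout set",
        impact_assessment="No change to intended use; risk analysis RA-007 reviewed and updated",
    ),
]
```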

Need expert help with regulatory compliance for your software as a medical device? Speak to a specialist.