Dive Brief:
- Industry appears broadly pleased with FDA's proposed regulatory framework for allowing ongoing artificial intelligence and machine learning (AI/ML) algorithm changes to software as a medical device (SaMD) based on real-world learning.
- In comments on FDA's April discussion paper on AI/ML, AdvaMed and the Combination Products Coalition (CPC) praised the agency for developing a regulatory approach for the emerging technology, but offered a slew of recommendations for the agency to consider as it works toward a potential draft guidance.
- Both groups note that the proposed framework focuses on changes to AI/ML SaMD cleared through the 510(k) pathway. AdvaMed and CPC both argue that Class III AI/ML devices should also be able to take advantage of the proposed framework.
Dive Insight:
FDA's proposed framework, released shortly before former FDA Commissioner Scott Gottlieb resigned, aims to determine what type of AI/ML-based SaMD modifications could potentially be exempted from premarket submission requirements to enable software algorithms to evolve over time.
"The goal of the framework is to assure that ongoing algorithm changes follow pre-specified performance objectives and change control plans, use a validation process that ensures improvements to the performance, safety and effectiveness of the artificial intelligence software, and includes real-world monitoring of performance once the device is on the market to ensure safety and effectiveness are maintained," Gottlieb said in an April statement announcing the discussion paper.
CPC praised the approach as "well-constructed" but said FDA needs to provide more clarity about when a continuously adapting AI/ML algorithm would require a premarket submission. For example, FDA's previous guidance on when to submit a 510(k) for software changes to existing devices does not directly align with the discussion paper, the group said.
"The proposed framework currently addresses only incremental learning algorithms with gating mechanisms for updates," the group wrote. "Given the rapid advances in AI/ML in recent years, and its great potential applications to medicine, the CPC believes a framework for such software will be necessary in the near future."
AdvaMed said FDA is quickly creating a patchwork of policy documents with relevant definitions strewn across multiple guidances, which could lead to confusion.
"To promote global harmonization of definitions and terms, FDA should define AI and ML in accordance with international standards," AdvaMed wrote.
In the discussion paper, FDA said it plans to take a total product life cycle (TPLC) approach to monitoring AI/ML-based SaMD.
"We're considering how an approach that enables the evaluation and monitoring of a software product from its premarket development to post-market performance could provide reasonable assurance of safety and effectiveness and allow the FDA's regulatory oversight to embrace the iterative nature of these artificial intelligence products while ensuring that our standards for safety and effectiveness are maintained," Gottlieb said in April.
AdvaMed suggested user feedback, ratings and usage patterns could help inform real-world performance monitoring of AI/ML-based SaMD but noted that such use should "consider the user's privacy and consent for specific data use."
CPC also urged FDA to clarify whether its TPLC approach to real-world performance monitoring of AI/ML-based SaMD would differ from how it plans to monitor products moving through its Software Precertification Program.
"The CPC recommends piloting a Real-World Data Collection Program to demonstrate its benefits relative to the increased regulatory burden borne by both FDA and the SaMD sponsor for mandatory post-market surveillance reporting for all SaMD products," CPC wrote.