Dive Brief:
- FDA on Tuesday released an action plan for establishing a regulatory approach to the fast-developing field of artificial intelligence- and machine learning-based Software as a Medical Device (SaMD).
- The action plan comes in response to substantial stakeholder feedback, including hundreds of public comments, on an April 2019 discussion paper that proposed a framework for regulating modifications to AI/ML-based SaMD.
- FDA's latest publication on the subject is the next step on a path to draft guidance for a "predetermined change control plan" that would include the types of modifications covered and the methodology used to implement the changes in a way that manages risks to patients, the agency said.
Dive Insight:
The action plan comes amid calls for regulatory clarity from AdvaMed and others on machine learning algorithms that continually evolve without the need for manual updates. FDA so far has approved or cleared only devices that use "locked" algorithms that do not change in this way.
GE Healthcare, Medtronic and Philips are among the companies investing in the technology and working to incorporate AI capabilities into their devices. Medtronic, for example, is focused on AI-aided technologies that would support robotics, navigation, imaging and pre-operative planning for spine surgery. The medtech giant in November acquired French spinal surgery company Medicrea, gaining access to an AI database of more than 5,000 surgical cases.
And last month, Philips announced a $2.8 billion deal to buy BioTelemetry, which specializes in remote cardiac diagnostics and monitoring, including wearable heart monitors and AI-based data analytics.
Since the release of FDA's discussion paper nearly two years ago, industry has been waiting for an update on the agency's oversight plan. Given that FDA has proposed a total product lifecycle approach to AI/ML that could allow improvements to devices after they are in use, finalized guidance or regulations would help developers better understand requirements and potential liabilities, a GAO/National Academy report said in November.
In the new action plan document, FDA touted the ability to learn from real-world experience to improve device performance as one of the greatest benefits of AI/ML in software. It suggested a "predetermined change control plan" could enable evaluation and monitoring of software from premarket development through postmarket performance.
The approach has attracted strong interest since it was described in the 2019 discussion paper, FDA said. The agency noted that Caption Health, which received marketing authorization in February for the first cardiac ultrasound software with AI to guide users, used a predetermined change control plan.
"This framework would enable FDA to provide a reasonable assurance of safety and effectiveness while embracing the iterative improvement power of artificial intelligence and machine learning-based software as a medical device," the regulator said.
FDA said it will issue draft guidance on this concept in response to stakeholder suggestions. The draft will propose what should be included in SaMD pre-specifications and the algorithm change protocol to support the safety and effectiveness of AI/ML SaMD algorithms.
The agency also addressed stakeholder concerns about algorithmic bias, saying it will support regulatory science to develop methodology to evaluate and improve machine learning algorithms. Such methodology would target identification and elimination of bias, and evaluation and promotion of algorithm robustness.
"Healthcare delivery is known to vary by factors such as race, ethnicity, and socio-economic status; therefore, it is possible that biases present in our health care system may be inadvertently introduced into the algorithms," the agency said. FDA "recognizes the crucial importance for medical devices to be well suited for a racially and ethnically diverse intended patient population and the need for improved methodologies for the identification and improvement of machine learning algorithms," the action plan stated.
FDA pledged to encourage harmonization of good machine learning practices and to hold a public workshop on labeling to support transparency regarding AI/ML-based devices.
The agency also said it will work with stakeholders on a voluntary basis to pilot the collection and monitoring of real-world performance data, supporting the total product lifecycle approach to oversight of AI/ML-based SaMD.