
US: FDA officials say different communication methods needed to ensure AI/ML transparency

2024/02/07  RAPS

More than two years after holding a public workshop on transparency in the artificial intelligence/machine learning (AI/ML) space, US Food and Drug Administration (FDA) officials have published an article in Nature magazine calling for new communication approaches to ensure transparency for such products.

On 26 January, officials at the Center for Devices and Radiological Health (CDRH), including those from its Digital Health Center of Excellence (DHCoE), published their key takeaways from an October 2021 workshop where a broad range of stakeholders discussed the importance of transparency for AI/ML products.

The officials noted that the workshop participants generally agreed with the definition of transparency as the degree to which key aspects of an AI/ML device are clearly communicated to stakeholders. That includes factors such as the product’s intended use, development, performance, and logic.

While CDRH tries to ensure transparency about products by providing safety and effectiveness information in online databases, letters to healthcare providers, safety communications, and guidance documents, that information is often tailored to industry and hard for other stakeholders to understand, according to the officials.

“The substantial amount of information available in these resources has the potential to better inform users on how a device might impact patients,” wrote the authors. “However, workshop participants suggested that the delivery of this information, as well as the level of detail available, may not be sufficient to enhance stakeholder knowledge or their ability to make informed decisions.”

“For these FDA communications and documents, one challenge is that much of the device information available on the CDRH website is developed by or geared toward manufacturers,” they added. “Use of a complementary approach targeted to nonmanufacturers to share information (e.g., graphics, plain language summaries) could allow the information to be more accessible for some stakeholders.”

The officials said that FDA had granted marketing authorization to almost 700 AI/ML devices as of October 2023, and the number of premarket applications involving the technology continues to grow. They also noted that AI/ML products raise unique considerations that need to be addressed, such as usability, equity of access, potential for performance bias, continuous learning, and stakeholder accountability. To address these considerations, the agency published an action plan in 2021 to promote AI/ML transparency and convened the workshop to bring stakeholders around the table to discuss how to achieve it.

“Workshop participants voiced that promoting and incorporating transparency is especially important for AI/ML devices as they are heavily data-driven, may incorporate algorithms exhibiting a degree of opacity, and can potentially learn and change over time,” said the authors. “Transparency can support the proper use of an AI/ML device, allowing stakeholders to understand the role of the device within a clinical workflow (for example, knowing if the device is intended to inform, augment, or replace judgment of the user) and make informed decisions.”

The officials noted that, given the technical complexity of AI/ML devices, their development, and concepts such as training data and locked versus continuously learning algorithms, the medium used to disseminate this information is important for engendering user trust.

“This could include using language appropriate for differing literacy, technical literacy, and health literacy levels, as well as accommodating those with different learning styles and delivery preferences,” said the authors. “Workshop attendees identified that improving the transparency of AI/ML devices, especially concerning the communication of training, validation, and real-world performance, continues to be an area in need of further growth.”

To continue reading this article, please go to RAPS.