FDA Announces AI Councils Amid Calls for Greater Agency Transparency

Akin Gump Strauss Hauer & Feld LLP

The U.S. Food and Drug Administration (FDA) is reportedly launching two cross-agency artificial intelligence (AI) councils. One council will be tasked with addressing how the agency uses AI internally; the other will focus on policy governing AI’s use in FDA-regulated products. Pre-existing AI councils in various FDA divisions will reportedly continue to operate (Politico Pro).

The agency’s internal use of AI has drawn particular interest in recent months. In May, the agency announced that its first AI-assisted scientific review pilot was successful and directed all FDA centers to begin integrating certain AI-generated capabilities into FDA’s internal data platforms by the end of the following month. Then, in June, FDA launched Elsa, a generative AI tool designed to help employees work more efficiently. According to FDA, Elsa is designed to prepare information so that FDA staff can make decisions more efficiently, with a human remaining in the decision-making loop. FDA reports that Elsa models do not train on data submitted by regulated industry, a measure intended to safeguard research and data handled by FDA staff.

Regarding Elsa’s capabilities, several weeks after the platform’s initial launch, FDA’s chief AI officer, Jeremy Walsh, noted that Elsa is unlikely to be connected to the internet, which would prevent it from accessing real-time information (Regulatory Focus). While this approach was framed as a necessary security precaution, it could also limit Elsa’s ability to produce up-to-date responses. In the days following these announcements, there were reports that the model, which is currently trained only on information through April 2024, provided inaccurate or incomplete information during its first week in use (NBC News).

FDA is actively updating and improving Elsa, but questions and concerns persist among industry that FDA may be using a tool that would not meet the agency’s own expectations for validation, governance and transparency when such tools are used for FDA-regulated functions. Presumably, the internally focused AI council will be tasked with creating internal policies and procedures to ensure effective use of Elsa and other AI tools. However, the timeline for those policies and procedures, and the extent to which the agency will be transparent about them, remains unclear.

DISCLAIMER: Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations. Attorney Advertising.

© Akin Gump Strauss Hauer & Feld LLP
