The epidemic of out-of-control generative artificial intelligence in litigation filings has metastasized to a False Claims Act (FCA) lawsuit against a group of Utah anesthesiologists. On July 25, Mountain West Anesthesia, LLC and individual defendants in the case moved to bar the testimony of a medical billing expert whose report was riddled with AI-generated errors, including fabricated testimony from a government representative, fake citations to Medicare manuals, and nonexistent industry publications.
The suit alleged that Mountain West Anesthesia improperly billed Medicare and Medicaid for continuous anesthesia services supposedly rendered by physicians who were actually using their smartphones at the time. The relator (whistleblower) in the case, a vascular surgeon, said he personally witnessed this activity. Thomas J. Dawson III, an attorney whom the relator designated as an expert, admitted in a deposition that he used ChatGPT to help him write the report.
In its motion, Mountain West said, “Courts have increasingly chastised experts and attorneys for blindly relying on AI to generate unreliable reports and other documents, and yet Mr. Dawson’s report stands out, even among those cases, for the remarkable number and extraordinary nature of the fabrications and errors.”