Software Gains New Status as a Product Under Strict Liability Law

Morrison & Foerster LLP
A recent lawsuit involving an AI chatbot is another indication of a possible shift in how courts will approach software under traditional strict products liability principles. Courts have historically been hesitant to treat software and other intangible consumer goods as a “product” for purposes of strict liability claims. However, a recent decision from the U.S. District Court for the Middle District of Florida demonstrates courts’ growing willingness to treat AI software and applications as products under strict products liability principles, a development that could significantly reshape the landscape of products liability law.

Character A.I. is an app featuring AI chatbots known as “Characters.” Users have created millions of “Characters” on the app that mimic parents, girlfriends, characters from TV shows, and even concepts like “unrequited love.” In October 2024, a mother sued Character A.I. after her 14-year-old son died by suicide, allegedly after becoming obsessed with the chatbot. The complaint alleges that the teen used Character A.I. to generate and interact with various AI chatbots prior to his death and sent the bots numerous messages expressing suicidal ideation.

The decedent’s mother asserted a strict liability design defect claim, arguing the company should be held strictly liable for her son’s death because the chatbot is a defectively designed consumer product and inherently dangerous to users. Character A.I. moved to dismiss the complaint by arguing, among other things, that the chatbot is not subject to strict products liability claims because it is software rather than a tangible good.

On May 21, 2025, the court issued an order rejecting Character A.I.’s motion to dismiss the design defect claim, explaining, “Plaintiff’s complaint contains allegations related to the content and related to the design choices of Character A.I. For example, Plaintiff complains about the sexual nature of [the minor’s] conversations with some Characters and remarks the Characters made about suicide.” The court also noted that even though the minor “may have been ultimately harmed by interactions with Character A.I. Characters, these harmful interactions were only possible because of the alleged design defects in the Character A.I. app. Accordingly, Character A.I. is a product for the purposes of plaintiff’s strict products liability claims so far as plaintiff’s claims arise from defects in the Character A.I. app rather than ideas or expressions within the app.”

A similar case against Character A.I. was filed in Texas federal court in December 2024. Character A.I. has filed a motion to dismiss, but the court has not yet ruled.

The plaintiffs’ bar will likely continue pushing to extend strict products liability claims to other software-based cases. In light of courts’ increased willingness to treat AI applications as products for strict liability purposes, companies developing and deploying AI software should proactively address these risks and ensure compliance with emerging legal standards. We will continue to monitor these developments and provide updates.

Brian Buckley, a summer associate in our San Diego office, contributed to the writing of this article.


DISCLAIMER: Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations. Attorney Advertising.

© Morrison & Foerster LLP
