On June 30, 2025, the Honorable Paul Goetz of the New York Supreme Court held that plaintiff Norma Nazario’s lawsuit over the death of her son in a 2023 “subway surfing” incident could proceed against social media companies Meta Platforms, Inc., TikTok, Inc., and ByteDance, Inc.
Background
On February 20, 2023, 15-year-old Zackery Nazario climbed atop a moving J train in New York City and was struck by a low beam as the train crossed the Williamsburg Bridge, an incident that ultimately resulted in his death. Plaintiff Norma Nazario alleges that by late 2022 Zackery had become addicted to Instagram and TikTok, and that although he did not seek out content depicting dangerous behavior, the platforms’ targeted algorithms constantly pushed such content into his feed. In the pending lawsuit, she advances several theories of product liability, claiming that the unreasonably dangerous design of the social media platforms “targeted, goaded and encouraged” her son to “subway surf.”
The Parties’ Arguments
In their recent Motions to Dismiss, Defendants argued that Plaintiff failed to state a cause of action, centering their arguments predominantly on Section 230 of the Communications Decency Act (“CDA”) and the First Amendment.
The pertinent provision of the CDA, Section 230, states: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” Defendants argued that they are protected by this provision because they satisfy the criteria of the CDA’s “immunity test”: 1) they are interactive computer service providers; 2) Plaintiff’s claims are based on subway surfing content provided entirely by third-party users; and 3) Plaintiff is improperly treating them as the publishers of users’ content.
Plaintiff responded that her claims do not treat Defendants as publishers based on their content moderation or editorial decisions, but rather as product manufacturers. She argued that Instagram and TikTok are “products” under New York law because the platforms use algorithms to direct unsolicited content to users to keep them engaged. Plaintiff further argued that Defendants should be liable for failing to design a reasonably safe product and for failing to furnish adequate warnings of foreseeable dangers arising from the products because, in instances like Zackery’s, the algorithm promoted extreme and dangerous content.
The Court’s Reasoning
The Court ultimately determined that Defendants had not made any editorial decisions or material contributions to the subway surfing content, so they may not be considered co-creators of that content or receive the protections provided by the CDA.
In previous case law, claims based on algorithms were held non-actionable because the algorithms at issue were content-neutral or driven by user inputs. Here, however, the Court distinguished this matter by highlighting that Plaintiff’s claims were based on Defendants’ active choice to target Zackery and inundate him with content depicting dangerous “challenges.” Plaintiff alleges that this content was intentionally fed to Zackery because of his young age and Defendants’ desire to keep young users engaged with their applications for longer.
The Court held that some of Plaintiff’s claims should proceed to discovery because it was not clear whether Defendants exceeded the role of neutral assistance in promoting content. Plaintiff will be permitted to seek discovery in an effort to prove that the social media platforms identify and target the users most affected by specific content. The Court noted that while Defendants may ultimately prove that they are entitled to the protection of the CDA and that their applications are not subject to theories of product liability, that determination cannot be made without further discovery into how Zackery encountered the subway surfing content online.
The Court further held that whether Defendants engaged in tortious conduct or, alternatively, in speech protected by the First Amendment would be determined through the discovery process.
Conclusion
In its decision, the Court allowed the product liability and negligence claims to proceed and granted the motions to dismiss as to the remaining allegations. This case highlights how plaintiffs are actively seeking new ways to apply traditional theories of law, such as product liability, when suing social media platforms. Plaintiffs’ lawyers will likely incorporate this strategy into future filings, framing claims as product liability or design defect claims, to survive motions to dismiss and reach discovery. The final outcome of this matter has the potential for widespread impact on claims involving technological algorithms with targeting capabilities.