Take It Down Act: U.S. enacts law targeting sexually explicit deepfakes and “revenge porn”

Hogan Lovells (co-authors: Rafal Fryc, Chloe Suzman)

On May 19, 2025, President Trump signed a new federal law targeting sexually explicit deepfakes and other intimate visual content posted online without consent. The law imposes time-sensitive takedown obligations on certain companies and establishes criminal penalties for publishing “nonconsensual intimate visual depictions.”

President Trump has signed into law the Take It Down Act (TIDA), which criminalizes the non-consensual publication of “intimate visual depictions” of identifiable individuals, including machine-generated imagery such as deepfakes. TIDA also requires covered platforms to have a mechanism to remove these depictions within 48 hours of receiving a valid takedown request.

First proposed by Senator Ted Cruz (R-TX) in June 2024, TIDA was co-sponsored by Senator Amy Klobuchar (D-MN) and passed unanimously in the Senate and by a vote of 409-2 in the House. TIDA also comes on the heels of a slew of state legislation regulating non-consensual intimate imagery (NCII). Some civil liberties organizations have criticized TIDA, arguing it could suppress speech and impair privacy.

“Covered Platforms” and Takedown Obligations

TIDA imposes new obligations on covered platforms to rapidly remove nonconsensual intimate visual depictions upon an identifiable individual’s request. These requirements take effect on May 19, 2026. Key components of the takedown regime include:

  • Covered Platforms: The takedown requirements apply to “covered platforms,” i.e., services that “primarily provide a forum for user-generated content” or that regularly host NCII. Internet service providers, email providers, and services that primarily publish non-user-generated content are excluded.
  • Removal: In response to a valid takedown request, covered platforms must remove the requested content within 48 hours. Businesses may consider whether and how to adapt existing processes for takedown requests under other legal regimes (e.g., the Digital Millennium Copyright Act) to facilitate TIDA requests.
  • Copies: Covered platforms are also required to make a reasonable search for any copies of the NCII and remove those as well. This may include an obligation to search for and remove hashed matches.
  • Liability Shield: Covered platforms that remove content based on reasonable evidence are shielded from liability for disabling access to content, even if the content is later determined to be lawful.
  • Notice to Consumers: Covered platforms must clearly notify users of their new removal procedure in “plain language” that is “easy to read.” Notice may be provided through a link, similar to existing privacy policies.
  • Requests: Requests for takedowns must identify the problematic content, provide a brief attestation that the depiction was published without consent, and include the contact information and signature of the identifiable individual or someone acting on their behalf.
  • Enforcement: The FTC is authorized to enforce the takedown provisions of TIDA. Non-compliance with the takedown provisions is subject to the FTC’s financial penalties for unfair or deceptive acts or practices, which exceed $50,000 per violation.
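For platform engineers, the takedown regime described above translates into a fairly concrete intake-and-matching workflow: validate that a request contains the required elements, remove the identified content within the 48-hour window, and make a reasonable search for copies. The sketch below illustrates one way this could be modeled. All names, field choices, and the use of an exact SHA-256 fingerprint are illustrative assumptions, not requirements drawn from the statute; production systems typically also use perceptual hashing to catch re-encoded copies.

```python
import hashlib
from dataclasses import dataclass

@dataclass
class TakedownRequest:
    """Hypothetical record of a TIDA-style takedown request.

    TIDA requires the request to identify the content, include a brief
    attestation of non-consent, and carry the requester's contact
    information and signature; the field names here are assumptions.
    """
    content_url: str   # identifies the problematic content
    attestation: str   # brief statement that publication was non-consensual
    contact_info: str  # contact details for the individual or their agent
    signature: str     # signature of the individual or someone acting for them

    def is_valid(self) -> bool:
        # A request is actionable only if all four elements are present.
        return all([self.content_url, self.attestation,
                    self.contact_info, self.signature])

def content_hash(data: bytes) -> str:
    """Exact-duplicate fingerprint for a hosted file."""
    return hashlib.sha256(data).hexdigest()

def find_copies(target: bytes, hosted_items: dict[str, bytes]) -> list[str]:
    """Return IDs of hosted items that are byte-for-byte copies of the target.

    A 'reasonable search' for copies would likely pair this with
    perceptual hashes, since exact hashes miss resized or re-encoded files.
    """
    fingerprint = content_hash(target)
    return [item_id for item_id, data in hosted_items.items()
            if content_hash(data) == fingerprint]
```

For example, a request missing its signature would fail `is_valid()` and could be rejected before the 48-hour removal clock is treated as running, while `find_copies` would flag any exact duplicates of the reported file elsewhere on the service.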

“Intimate Visual Depiction”

TIDA’s requirements hinge on whether online content constitutes an “intimate visual depiction” of a real person. Intimate visual depictions must display an “identifiable individual” that “appears in whole or in part” and “whose face, likeness, or other distinguishing characteristic (including a unique birthmark or other recognizable feature) is displayed.” The law diverges from other privacy regimes by stating that an individual’s disclosure of their own intimate visual depiction to another does not establish consent for publication.

TIDA also covers deepfakes, defined in the Act as “digital forgeries.” To qualify as a digital forgery, the intimate visual depiction must be generated by “software, machine learning, artificial intelligence, or any other computer generated or technological means,” and must be “indistinguishable” from authentic images according to a reasonable person.

Criminal Liability

Effective immediately, TIDA criminalizes knowingly publishing non-consensual intimate visual depictions, with more stringent standards for intimate visual depictions involving minors. However, the law provides several notable exemptions from its criminal liability provisions, including “good faith” disclosures related to government investigations, law enforcement and courts, and medical or scientific purposes. Additionally, the law excludes intimate visual depictions covered by existing criminal statutes (e.g., for child sexual abuse). TIDA’s criminal liability provisions target those who publish prohibited content, not platforms that host content published by third parties.


DISCLAIMER: Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations. Attorney Advertising.

© Hogan Lovells
