Last year, the U.S. Copyright Office commenced a far-reaching policy study concerning copyright and related issues raised by the widespread availability and use of artificial intelligence (AI). This week, the Office released the first in what is expected to be a series of reports arising from the study – this one focused on digital replicas, or as they are often called, “deepfakes.”[1] The report concludes that “new federal legislation is urgently needed” to address the potential harms of digital replicas.
Digital replicas are videos, images, and audio recordings that have been “digitally created or manipulated to realistically but falsely depict an individual.”[2] New AI technologies have made it fast, easy, and inexpensive to produce high-quality digital replicas, raising concerns about effects on human creativity, markets for original creative works, fraud, election interference, and the proliferation of false sexually explicit imagery.[3]
In its study, the Office sought public comments on a host of AI-related issues, including whether existing laws are sufficient to protect against unauthorized digital replicas. After receiving and reviewing nearly one thousand comments concerning digital replicas (and about ten thousand comments in all), the Office recommended enactment of a new federal law providing all individuals nationwide with protection against unauthorized digital replicas (though not against other unauthorized uses of an individual’s persona). Two bills pending in Congress (one introduced this week) would provide such protection, although on somewhat different terms than those recommended by the Office.[4]
Below, we highlight key takeaways from the Office’s report.
Limitations of Existing Protections
The report describes the patchwork of existing state and federal laws that prohibit the unauthorized use of an individual’s persona. The Office found these laws insufficient to address the full range of risks presented by digital replicas.
State Law. Most states recognize some form of the right of privacy or publicity, either by statute or under common law, and those rights can protect against unauthorized digital replicas. However, such protection is not available everywhere, and is limited in important respects. For example, depending on the state, protection may not extend to use of a simulated voice, may not be available to all individuals, or may not reach noncommercial uses such as deepfake pornography. The report also noted variations in how state law accommodates First Amendment concerns,[5] which has led to what “has been described by scholars as ‘a confusing morass’” of judicial decisions.[6]
The right to privacy is often “described as protecting against unreasonable intrusions into individuals’ private lives, safeguarding their autonomy, dignity, and personal integrity.”[7] It encompasses the torts of false light and appropriation of an individual’s name and likeness. Liability for the former requires the false light to be “highly offensive to a reasonable person,” a limitation the report notes will lessen “its applicability to other uses of unauthorized digital replicas, such as depictions that are merely untruthful.”[8] The utility of an appropriation claim is similarly limited, according to the report, including by courts’ disagreement over its applicability to noncommercial uses and to individuals who are not well-known.[9]
The report describes the right of publicity as perhaps “the most apt state law remedy for unauthorized digital replicas.”[10] At a high level, it protects against unauthorized use of an individual’s persona in commercial contexts. In some states, that protection sweeps broadly. However, like the privacy-based appropriation tort, “the contours of the [publicity] right differ considerably from state to state,” and are sometimes “written too narrowly to cover all types of digital replica uses,” such as noncommercial uses.[11]
The report also notes that several states have enacted statutes specifically targeting AI-generated digital replicas, including Tennessee, Louisiana, and New York,[12] although it criticizes the New York and Louisiana laws for exempting simulations of the voice of a performer.[13]
Federal Law. At the federal level, the Copyright Act, FTC Act, Lanham Act, and Communications Act all play a role in protecting individuals from digital replicas, but incompletely:
- The Copyright Act is potentially implicated when digital replicas are created by ingesting or altering copyrighted works, but it does not “protect an individual’s identity in itself, even when incorporated into a work of authorship.”[14]
- The FTC Act prohibits various unfair methods of competition and unfair or deceptive practices. The FTC commented to the Office that use of a digital replica might violate the Act.[15] The FTC is also currently engaged in rulemaking directed at digital replicas.[16] However, its authority would be confined to “cases where digital replicas are used in commercially misleading ways.”[17]
- The report finds that causes of action under the Lanham Act for false endorsement and trademark infringement are limited by the need to show “commercial use and a likelihood of consumer confusion.”[18]
- Earlier this year, the FCC issued a rule prohibiting the use of voice cloning technology in robocall scams targeting consumers.[19] However, the report notes that uses such as “websites featuring user-generated content” would likely fall outside “the FCC’s enforcement purview.”[20]
The Office’s Proposed Federal Legislation
Given these limitations of current law, the report emphasizes the urgent need for comprehensive federal legislation establishing “a new digital replica right.”[21] The contours of that right, as recommended by the Office, are described below.
Subject Matter. The report recommends that legislation target digital replicas in a manner more focused than state rights of publicity. Specifically, it proposes targeting “replicas that convincingly appear to be the actual individual being replicated.”[22]
Persons Protected. The report advocates that the right extend to “all individuals regardless of their level of fame or the commercial value of their identities.”[23]
Term of Protection. Noting a debate as to whether protection should continue postmortem or terminate at death, the report concludes that federal legislation “should prioritize the protection of the livelihoods of working artists, the dignity of living persons, and the security of the public from fraud and misinformation regarding current events,” making postmortem protections “not necessary.”[24] To the extent such protections are incorporated into legislation, the report recommends that they be limited to an initial 20-year term, with the possible option for extension.[25]
Infringing Acts. The report recommends “proscribing activities that involve dissemination to the public,” but not the mere “creation of a digital replica,” at least absent some connection to dissemination.[26] It proposes that the right encompass both commercial and non-commercial uses.[27] However, it proposes that direct liability require “actual knowledge both that the representation in question was a digital replica of a real person, and that it was unauthorized.”[28]
Recognizing that digital replicas are “generally distributed and displayed online through the services of various intermediaries,” the report suggests that ordinary principles of secondary liability should apply.[29] It recommends treating the digital replica right as an intellectual property right excluded from Section 230 of the Communications Decency Act, meaning that online service providers would not be exempt from liability under the new right.[30] Instead, the report endorses “a notice and takedown system, combined with an appropriate safe harbor” for service providers that expeditiously remove digital replicas after acquiring “actual knowledge” or a “sufficiently reliable notification that the replica is infringing.”[31]
Licensing and Assignment. Responding to concerns about the potential for abuse if individuals could assign and license the right to create digital replicas of them, the report recommends “a ban on outright assignments” and licensing subject to “appropriate guardrails.”[32] Specifically, the report suggests limiting licenses for creation of new digital replicas to five- or ten-year terms.[33] The report proposes additional requirements to ensure informed consent and protect the rights of minors.[34]
First Amendment Concerns. Because digital replicas may be used in the context of speech protected by the First Amendment, there has been considerable debate about the best way to ensure that a new law accommodates protected activities. Some commenters advocated for specific statutory carve-outs, such as for expressive works, comment and criticism, or parody, while others argued for a case-by-case balancing approach.
The report found that “[e]ach of these approaches has advantages and disadvantages.”[35] While the Office acknowledged the predictability of categorical exemptions, which are common in many state right of publicity statutes, the report characterized that approach as being potentially “over- and under-inclusive depending on the facts.”[36] Ultimately, it recommended a balancing approach that would take into account factors such as “the purpose of the use, including whether it is commercial; its expressive or political nature; the relevance of the digital replica to the purpose of the use; whether the use is intentionally deceptive; whether the replica was labeled; the extent of the harm caused; and the good faith of the user.”[37] The report contended that such an approach would be more tailored and “permit[] greater flexibility.”[38]
Remedies. The report advocates that any federal law include both monetary and injunctive relief. It contemplates not only actual damages, including “loss of income, damage to reputation, or emotional distress,” but also special damages to enable fair recovery opportunities for “those who may not be able to show economic harm or afford the cost of an attorney.”[39] The report proposes criminal liability for sexually explicit content and “particularly harmful or abusive imagery.”[40]
Relationship to State Laws. Finally, the report recommends against preempting state laws, given settled expectations, a desire to avoid reducing protection, and a desire to preserve states’ flexibility to respond to a rapidly changing technological landscape.[41]
The report specifically addresses the relationship between state law and section 114(b) of the Copyright Act, a provision enabling the creation of so-called “soundalike” recordings without infringement of sound recording copyrights. The report finds that “section 114(b) does not preempt state laws prohibiting unauthorized voice replicas,” noting that “[c]opyright and digital replica rights serve different policy goals.”[42] To confirm that result, the Office recommends that Congress clarify that section 114(b) does not preempt state laws or affect the proposed new federal digital replica right.
Looking Ahead
Various bills are pending in Congress to address specific types of unauthorized digital replicas, including intimate depictions and political advertisements. The report highlights two bills that would address the unauthorized use of digital replicas more broadly: the No Artificial Intelligence Fake Replicas And Unauthorized Duplications (“No AI FRAUD”) Act and the Nurture Originals, Foster Art, and Keep Entertainment Safe (“NO FAKES”) Act, which had been under discussion for some time and was introduced the day the Office released its report.[43] These bills differ from the Office’s recommendations in important respects, including by providing certain postmortem protections and in their treatment of First Amendment concerns. For example, the NO FAKES Act includes specific exclusions for certain news reporting, documentary, and other expressive uses, yet has the support of commenters on both sides of the First Amendment debate before the Office. This presidential election year is a challenging time for enactment of any legislation, but the Office’s report could smooth the progress of digital replica legislation later this year or in the next Congress.
Footnotes
[1] U.S. Copyright Office, Copyright and Artificial Intelligence Part 1: Digital Replicas at 22 (July 2024) [hereinafter “Report”], available at https://www.copyright.gov/ai/Copyright-and-Artificial-Intelligence-Part-1-Digital-Replicas-Report.pdf.
[2] Report at 2.
[3] Prominent examples include the recording “Heart on My Sleeve,” with unauthorized simulated voices of the artists Drake and The Weeknd, and advertisements with unauthorized simulated likenesses of Tom Hanks and Gayle King. See Joe Coscarelli, An A.I. Hit of Fake ‘Drake’ and ‘The Weeknd’ Rattles the Music World, N.Y. Times (Apr. 24, 2023), https://www.nytimes.com/2023/04/19/arts/music/ai-drake-the-weeknd-fake.html; Derrick Bryson Taylor, Tom Hanks Warns of Dental Ad Using A.I. Version of Him, N.Y. Times (Oct. 2, 2023), https://www.nytimes.com/2023/10/02/technology/tom-hanks-ai-dental-video.html.
[4] NO FAKES Act, S. ___, 118th Cong. (2024); No AI FRAUD Act, H.R. 6943, 118th Cong. (2024).
[5] Report at 14.
[6] Report at 43-44 (quoting Gloria Franke, The Right of Publicity vs. the First Amendment: Will One Test Ever Capture the Starring Role?, 79 S. Cal. L. Rev. 945, 946 (2006)).
[7] Report at 8.
[8] Report at 9.
[9] Report at 10.
[10] Report at 11.
[11] Report at 11-12.
[12] Report at 15-16.
[13] Report at 51-52.
[14] Report at 17.
[15] Report at 18.
[16] Report at 18-19.
[17] Report at 24.
[18] Report at 20.
[19] Report at 20-21.
[20] Report at 24.
[21] Report at 28.
[22] Report at 29.
[23] Id.
[24] Report at 32.
[25] Report at 32-33.
[26] Report at 33.
[27] Report at 34-35.
[28] Report at 35.
[29] Report at 36.
[30] Report at 36-38.
[31] Report at 39.
[32] Report at 41.
[33] Report at 41-42.
[34] Report at 42.
[35] Report at 46.
[36] Id.
[37] Report at 46-47.
[38] Report at 46.
[39] Report at 47.
[40] Report at 48.
[41] Report at 50.
[42] Report at 52.
[43] Report at 26.