[co-author: Stephanie Kozol]*
On September 9, Colorado Attorney General (AG) Phil Weiser issued a public advisory warning voters about the dangers of election misinformation and disinformation in the form of realistic-looking images, videos, and audio created using artificial intelligence (AI), known as “deepfakes.” The advisory follows the May 2024 enactment of HB24-1147, an act designed to prevent a broad range of actors from using deepfakes depicting candidates in political communications without properly disclosing the untruthful nature of the communication to voters.
Overview of HB24-1147
Earlier this year, Colorado lawmakers passed, and Governor Jared Polis signed into law, HB24-1147. The new law prohibits distributing, disseminating, publishing, broadcasting, transmitting, or displaying a communication that concerns a candidate for elective office and includes a deepfake, when the audience includes members of the electorate the candidate seeks to represent. The act defines a “deepfake” as AI-generated image, video, audio, or multimedia content that falsely appears to be authentic or truthful and depicts an individual appearing to say or do something the individual did not say or do.
The act’s prohibition, however, does not apply to a communication that includes a “clear and conspicuous” disclosure stating that “the [multimedia] has been edited and depicts speech or conduct that falsely appears to be authentic or truthful.” In a visual communication, the disclosure is “clear and conspicuous” only if its text appears in a font size equal to the largest font size of any other text in the communication. In an audio communication, the disclosure must be read clearly and at the same pitch, speed, and volume, and in the same language, as the majority of the communication.
Importantly, the act’s prohibition applies only within 60 days before a primary election and within 90 days before a general election.
Weiser’s Public Advisory
In the public advisory, Weiser outlined the critical components of the act that voters, candidates, and campaigns need to know. First, the advisory warns that no “person” (which includes natural persons, partnerships, committees, associations, corporations, labor organizations, political parties, and other groups of persons) may publish deepfakes regarding candidates without proper disclosures. Second, the advisory explains that the required disclosures must be clear and conspicuous, must be displayed or otherwise appear in the communication, and must comply with the act’s requirements for exact font sizes, among other things.
Weiser also reminded the public that violations can result in legal action to block the dissemination of the deepfake in question, and that violators could face severe financial liability or even criminal penalties.
Why It Matters
While HB24-1147 contains exemptions for, among others, paid radio and television broadcasts and satire, the act and Weiser’s public advisory highlight the need for heightened diligence when distributing or sharing content concerning political candidates. As the 2024 presidential election nears, all actors who disseminate political multimedia should ensure that their standard operating procedures include reviews for deepfakes and the capability to make compliant disclosures when a deepfake is discovered.
*Senior Government Relations Manager