On Monday, Colorado Attorney General Phil Weiser issued a warning for voters to be vigilant for election disinformation and misinformation taking the form of “deep-fakes.” Weiser’s office accompanied this warning with a public advisory providing more information to Coloradans on a new law aimed at preventing political campaigns from using AI-generated deep-fakes without clear disclosure of the content’s origins.
The legislation referenced is House Bill 24-1147, which was signed into law by Gov. Jared Polis earlier this year. The law defines deep-fakes as “an image, video, or multimedia AI-generated content that falsely appears to be authentic or truthful and which features a depiction of an individual appearing to say or do something the individual did not say or do.”
The law requires anyone using artificial intelligence to create communications to voters featuring images, video, or audio of candidates for office to include a clear disclaimer indicating that the content is not real. Those who fail to provide a disclaimer can face penalties such as fines.
“Because images, videos, and audio created with artificial intelligence are becoming difficult to distinguish from the real thing, you should be cautious when forming opinions based on what you see and hear online, on TV, and receive in the mail,” Weiser said in a press release. “The sad reality is that even AI-powered tools designed to detect these deepfakes have difficulty catching them. I encourage voters to do your research, get your news and information from trusted sources, and be mindful that the sophistication of AI means you can’t always believe what you see and hear anymore.”
The public advisory put out by Weiser lays out what it says voters, candidates, and campaigns ought to know about the implementation of the new law:
- Required disclosures must be clear and conspicuous. A disclaimer notifying voters that the content “has been edited and depicts speech and conduct that falsely appears to be authentic or truthful” must be displayed or otherwise appear in the communication, and the law specifies exact font sizes and other requirements.
- There are exemptions under the law that allow news outlets to discuss deep-fake content in news coverage, provided that the coverage makes clear the content includes a deep-fake. Radio and television broadcast stations are also exempt if they air political ads containing deep-fakes that are not properly identified. Satires and parodies are exempt as well.
- Violating the law can result in legal action to “prevent dissemination” of the deep-fake content in question, and violators could face financial liabilities or criminal penalties.
Finally, the law applies to communications to voters within 60 days of a primary election and 90 days of a general election. With 54 days until the general election on November 5, the protections of the law are currently in effect.