H.R. 5272
119th Congress · September 10, 2025
Protect Elections from Deceptive AI Act
Status: In Committee

This Act prohibits the knowing distribution of materially deceptive AI-generated audio or visual media of federal candidates near an election, with exceptions for legitimate news coverage and satire.

Sponsor: Rep. Julie Johnson (D-TX-32)

Bill Would Ban Deepfakes in Federal Elections: Candidates Could Sue Distributors Over Deceptive AI Media

The “Protect Elections from Deceptive AI Act” aims to tackle one of the biggest digital threats to democracy: deepfakes. Simply put, this bill makes it illegal to knowingly distribute materially deceptive, AI-generated audio or visual media—think hyper-realistic fake images or videos—of a federal candidate if the goal is to sway an election or raise money. The core idea is to stop people from using sophisticated AI to make it look like a candidate said or did something they absolutely did not, especially right before an election.

The AI Test: When Is a Deepfake Illegal?

The bill targets media created using AI or machine learning that is convincing enough that a reasonable person would fundamentally misunderstand what the candidate actually looked like, said, or did. This is the key legal test: would a reasonable person believe the content is real and accurate? If you knowingly share this kind of fake media to influence a vote, you're in violation. The bill isn't targeting all AI-generated content, only content that is materially deceptive and shared with intent to influence a federal race. This focus on intent and the "knowingly distribute" clause means the average person who accidentally shares a deepfake they found on social media likely won't be targeted, but organized political groups or individuals trying to weaponize this tech certainly could be.

Exceptions: The Free Speech Safety Net

Recognizing the importance of legitimate news and commentary, the bill includes crucial exceptions. If you're a broadcast station, streaming service, or news publisher, you can show the deceptive media as part of a news report, interview, or commentary, but with a huge caveat: you must clearly tell your audience that the content's authenticity is in question. This is the bill's attempt to balance election integrity with the First Amendment. Crucially, the bill also exempts satire and parody: political comedians and meme-makers whose content is obviously a joke won't have to worry about a lawsuit.

What Happens When a Candidate Gets Deepfaked?

This is where the law gives candidates some serious muscle. If a candidate's voice or image is used in violation of this act, they can immediately sue the distributor. They can ask a court for an injunction to stop the distribution right away, a critical feature since deepfakes can go viral in hours. They can also sue for damages, and a violation is treated as defamation per se, meaning harm is presumed and the candidate doesn't have to prove actual damage to their reputation. However, there's a catch for the candidate: they must prove their case using "clear and convincing evidence," a higher bar than the "preponderance of the evidence" standard in a typical civil lawsuit. That protects distributors by ensuring the law isn't used to silence legitimate, albeit critical, political speech, but it could make it tough for a candidate to quickly shut down a harmful deepfake.

Real-World Impact and Implementation Hurdles

For voters, this bill is a welcome layer of protection against highly sophisticated misinformation. For political campaigns and committees, it's a clear warning: don't use AI to lie about your opponent. The biggest challenge in implementation will be defining that "materially deceptive" line. What one person finds obviously fake, another might believe is real. This subjectivity, combined with the high burden of proof required in court, means we're likely to see legal battles early on as judges work out exactly what the "reasonable person" test demands in the age of viral content.