PolicyBrief
S. 1367
119th Congress · April 9, 2025
NO FAKES Act of 2025
IN COMMITTEE

The NO FAKES Act of 2025 establishes federal protections for individuals' voice and visual likenesses against unauthorized digital replicas, creating a digital replication right that allows individuals and their estates to control and monetize their digital representations.

Sponsor: Sen. Christopher Coons (D-DE)

NO FAKES Act Proposes New Rights Over Your Digital Voice and Likeness, Sets Fines Starting at $5,000

Ever worry about someone creating a deepfake video or an AI song using your voice without permission? The proposed NO FAKES Act of 2025 aims to tackle that head-on. Essentially, this bill creates a new federal property right – think of it like copyright, but for you – covering your digital voice and visual likeness. It establishes what's called a "digital replication right," giving individuals (and their heirs) control over how AI-generated versions of themselves are used commercially.

Your Face, Your Voice, Your Rights?

So, what does this "digital replication right" actually mean? It grants you, the individual, exclusive authority to approve the use of your voice or image in a "digital replica." The right lasts your entire lifetime and continues for up to 70 years after death, managed by your estate. If someone wants to use an AI version of you in an ad, a product, or another service, they'll generally need a written license signed by you (or your estate). For minors, licenses have shorter terms and require court approval. The idea is to give people, especially performers whose livelihoods depend on their unique likeness, a legal tool against unauthorized digital impersonation. The right applies even to individuals who died before the law existed, though liability attaches only to misuse occurring after enactment.

Faking It Can Cost You

The bill doesn't just create the right; it puts teeth into enforcing it. Anyone who creates or distributes an unauthorized digital replica for commercial purposes affecting interstate commerce could be held liable, including companies making software specifically designed to churn out unauthorized replicas. There's a catch, though: liability generally requires knowledge. For platforms like YouTube or TikTok, that means receiving a formal takedown notice; for others, it means knowing, or deliberately ignoring, that the replica is unauthorized. Penalties aren't small change: statutory damages start at $5,000 per violation and can reach $750,000 in some cases. Alternatively, rights holders can sue for actual damages plus the violator's profits, and potentially recover punitive damages and attorney fees if the infringement was willful. Importantly, simply slapping a "this is fake" disclaimer on the content won't get infringers off the hook.

Platforms, Parody, and Lingering Questions

What about online platforms where most of this content might appear? The bill includes "safe harbor" provisions, similar to copyright law. If platforms have a system to handle repeat infringers and promptly remove unauthorized replicas when properly notified, they generally won't be liable for user-uploaded fakes. There are specific rules for what constitutes a valid takedown notice. But be warned: filing a bogus notice could cost you $25,000 or actual damages.

The bill tries to balance protection with free expression. It explicitly carves out exceptions for news reporting, documentaries, historical works, commentary, criticism, satire, and parody – unless the replica depicts sexually explicit conduct. That's crucial for comedians, critics, and meme creators. However, the definition of "digital replica" itself isn't perfectly precise, which could invite disputes over whether a stylized caricature or a brief impersonation counts – an ambiguity that may create headaches for creators trying to figure out where the line is. The bill also largely preempts state laws on this topic, except for specific existing rules on sexually explicit fakes and election interference. If enacted, its provisions take effect 180 days after passage.