H.R. 2794 (119th Congress)
NO FAKES Act of 2025
Introduced April 9, 2025
Status: In Committee

The NO FAKES Act of 2025 establishes a new federal property right protecting an individual's voice and visual likeness from unauthorized creation and distribution of digital replicas, including provisions for licensing, post-death rights, and liability for online services.

Sponsor: Rep. Maria Salazar (R-FL-27)

NO FAKES Act Creates New Federal Property Right for Digital Likenesses, Lasting Up to 70 Years Post-Mortem

The NO FAKES Act of 2025 (officially the Nurture Originals, Foster Art, and Keep Entertainment Safe Act) is the federal government's big move to tackle AI deepfakes. It creates a brand-new federal property right: the exclusive authority of an individual, or their estate, to approve the use of their voice or visual likeness in a "Digital Replica." Think of a Digital Replica as a super realistic, computer-generated copy of you that performs actions you never actually did. This right is a huge deal because it gives individuals, from professional performers to regular folks, federal legal standing to fight back against unauthorized AI recreations. The law takes effect 180 days after enactment.

Your Face, Your Property: The New Digital Right

Under Section 2, the right to use your voice and visual likeness in a Digital Replica becomes a property right that you can license but can't permanently sell off while you're alive. This is a massive shift from relying on a patchwork of state laws. The bill sets high stakes for unauthorized use, imposing minimum damages of $5,000 per violation ($25,000 for entities), plus any profits made from the violation. If you're a software developer whose product is designed to create unauthorized replicas, the penalty starts at $5,000 per product. The goal is to make commercial deepfaking a very expensive proposition.

The Long Shelf Life of a Likeness

One of the most complex parts of this bill is how it handles death. The new right doesn't disappear; it can be transferred or licensed by heirs. It lasts for 10 years after death, but here's the catch: the right holder can keep renewing it in 5-year increments, up to a maximum of 70 years after death, as long as they can show they actively and publicly used the likeness during the two years before each renewal period. Imagine being the executor for a deceased relative who was a minor celebrity; you now have to actively market their image every few years just to keep the legal right alive. This "active use" requirement is going to keep IP lawyers busy, since it forces estates to monetize or promote the likeness continuously or lose the protection.

The Platform Shield: Safe Harbors and Takedowns

For platforms like YouTube, Spotify, or even smaller hosting sites (defined broadly as an "Online Service"), the bill offers a safe harbor similar to the notice-and-takedown system in existing copyright law. If a right holder sends a proper notice about an infringing replica, the platform generally isn't liable as long as it promptly removes the content and has a policy for terminating repeat offenders. Crucially, once a platform receives a notice, it must also try to remove any later-uploaded copies of the same material that match its "Digital Fingerprint." Platforms get a clear compliance process, but the burden is on them to effectively police content once they've been alerted.

What About Existing State Laws?

This new federal law generally "preempts," or overrides, existing state laws regarding digital replicas. While that creates a uniform federal standard, it potentially wipes out stronger protections some states had already put in place. There are exceptions: state laws enacted before January 2, 2025, laws targeting sexually explicit digital replicas, and laws concerning election-related replicas are generally spared. But for the average person, protection against an ordinary deepfake is now defined by the federal government, not their state legislature.

On the flip side, anyone who abuses the system by sending a false takedown notice faces serious consequences: they can be sued for the greater of $25,000 per false notice or the actual damages caused. This provision aims to keep the new law from being weaponized to silence legitimate commentary or criticism.