The NO FAKES Act of 2025 establishes exclusive rights for individuals and their heirs to control the licensing and use of their digital voice and visual likeness replicas.
Sponsor: Senator Christopher Coons (DE)
The NO FAKES Act of 2025 establishes exclusive property rights for individuals over the creation and use of their digital voice and visual likenesses (Digital Replicas). The bill grants the right holder the authority to license or control the use of these replicas, both during the individual's life and for a set period after death. The Act outlines specific licensing rules, liability for unauthorized use in commerce, and safe harbor provisions for online services that comply with takedown notices and use digital fingerprinting to remove matching copies.
The NO FAKES Act of 2025 aims to establish a federal right that gives every individual, living or dead, exclusive control over the use of their "Digital Replica." Think of a Digital Replica as any highly realistic, computer-generated copy of your voice or image used in a recording where you didn't actually perform. The bill is the federal government's first major attempt to draw a clear legal line against deepfakes and AI impersonation, treating your digital likeness as a property right you can control and sue over.
Under this bill, your voice and image, when digitally replicated, become a piece of property. This is a huge deal because it means that if someone uses AI to generate a copy of your voice for a commercial ad, or creates a realistic video of your likeness without permission, they’re infringing on a property right. For the average person, this means a new layer of protection against having your identity co-opted by AI. For artists and creators, it provides a powerful legal tool to manage their brand and livelihood in the face of increasingly sophisticated digital mimicry.
The bill sets strict rules for licensing: any agreement to use your digital replica must be in writing, signed, and can't last longer than 10 years. If you're under 18, the license is capped at 5 years and needs court approval, adding significant protection for young actors and influencers. But while that protection is great for the individual, the bill creates a potential headache for small creators: if you use someone's likeness without permission, even as a hobbyist making a viral video, you could face minimum statutory damages of $5,000 per unauthorized work if sued.
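Whether a given replica license is enforceable turns on a handful of checks. Here is a minimal Python sketch of how the formality and duration rules compose; the data model and field names are hypothetical, invented for illustration, not taken from the bill text:

```python
from dataclasses import dataclass

@dataclass
class ReplicaLicense:
    # Hypothetical fields modeling the bill's formal requirements
    in_writing: bool
    signed_by_individual: bool
    term_years: int
    individual_is_minor: bool
    court_approved: bool = False

def license_is_valid(lic: ReplicaLicense) -> bool:
    """Return True if the license clears the bill's formality and duration caps."""
    if not (lic.in_writing and lic.signed_by_individual):
        return False  # must be a signed, written agreement
    if lic.individual_is_minor:
        # Minors: term capped at 5 years, and a court must approve the deal
        return lic.term_years <= 5 and lic.court_approved
    # Adults: term capped at 10 years
    return lic.term_years <= 10

# A 7-year deal signed by a 16-year-old fails even with a signature:
print(license_is_valid(ReplicaLicense(True, True, 7, individual_is_minor=True)))  # False
```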
One of the most complex parts of the NO FAKES Act deals with what happens after you die. The right doesn't vanish with you; it passes to your estate or heirs (the Right Holders). They get exclusive control for 10 years, and here's where it gets interesting: they can renew that control for subsequent 5-year periods, as long as they can prove they are actively and publicly using the likeness. There is a hard stop, though: the right ends 70 years after death at the latest, and it lapses sooner if the heirs stop renewing or stop using the likeness.
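The renewal math is easier to see in code. This is a toy model of the term structure described above (10-year initial term, 5-year renewals contingent on active use, 70-year ceiling); it illustrates the arithmetic only and is not legal advice:

```python
def rights_active(years_since_death: int, renewals: list[bool]) -> bool:
    """
    Toy model: are post-mortem replica rights still in force?

    renewals[i] is True if the heirs filed renewal i AND showed active,
    authorized public use of the likeness for that 5-year period.
    """
    HARD_CAP = 70      # absolute maximum, years after death
    INITIAL_TERM = 10  # automatic initial term for the heirs
    RENEWAL_TERM = 5   # each successful renewal extends the term

    if years_since_death >= HARD_CAP:
        return False   # the right ends at 70 years no matter what
    if years_since_death < INITIAL_TERM:
        return True    # within the automatic 10-year window

    # Which 5-year block after the initial term are we in?
    block = (years_since_death - INITIAL_TERM) // RENEWAL_TERM
    # Rights survive only if every block up to and including this one was renewed.
    return block < len(renewals) and all(renewals[: block + 1])

print(rights_active(12, [True]))   # True: first renewal covers years 10-15
print(rights_active(16, [True]))   # False: year 16 needed a second renewal
print(rights_active(70, [True] * 12))  # False: hard cap, regardless of renewals
```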
For regular folks, this seems straightforward, but for cultural history, it's a big shift. It means the estate of a long-deceased celebrity could maintain control over their image for up to 70 years after death, far longer than many state right-of-publicity laws currently allow, potentially restricting the public's ability to use historical figures in new, transformative ways, like in documentaries or historical fiction. It raises the question: when does a famous person's image stop being private property and start belonging to history?
For social media platforms and other Online Service Providers (OSPs)—think TikTok, YouTube, or Spotify—the bill offers a "safe harbor" similar to existing copyright law. As long as they adopt a policy to deal with repeat infringers and act quickly when they receive a proper takedown notice, they are generally not liable for what users upload.
This safe harbor comes with a technical challenge: when a platform is notified about an unauthorized replica, it must take that copy down and then use a "digital fingerprint" of it to find and remove any matching copies uploaded later. That is a powerful mandate, and it could push platforms to over-remove content rather than risk statutory damages of $25,000 per work. On the flip side, the bill includes a tough penalty for anyone who files a knowingly false takedown notice, making them liable for the greater of $25,000 or actual damages, which is designed to deter bad actors from weaponizing the system to silence critics or competitors.
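To see what the fingerprint mandate asks of a platform, here is a toy sketch of the takedown flow. Assume, purely for illustration, an exact content hash as the "fingerprint"; the bill does not specify a technology, and real systems would need robust perceptual fingerprinting of audio and video:

```python
import hashlib

class TakedownRegistry:
    """Toy model of the notice-and-fingerprint obligation."""

    def __init__(self) -> None:
        self.blocked: set[str] = set()

    @staticmethod
    def fingerprint(content: bytes) -> str:
        # Stand-in fingerprint: an exact hash. A real deployment would use
        # perceptual fingerprints that survive re-encoding and trimming.
        return hashlib.sha256(content).hexdigest()

    def process_notice(self, content: bytes) -> None:
        # A valid notice obligates removal of the flagged work...
        self.blocked.add(self.fingerprint(content))

    def allow_upload(self, content: bytes) -> bool:
        # ...and blocking of every matching copy uploaded afterward.
        return self.fingerprint(content) not in self.blocked

registry = TakedownRegistry()
clip = b"fake-voice-recording"
registry.process_notice(clip)
print(registry.allow_upload(clip))            # False: matching copy is blocked
print(registry.allow_upload(b"other-video"))  # True: no fingerprint match
```

Note that an exact hash misses a copy that has been re-encoded or trimmed by a single frame, so compliant platforms would lean on fuzzy matching, and fuzzy matching is precisely what feeds the over-removal worry.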