The "NO FAKES Act" establishes federal protections for individuals' voice and visual likenesses against unauthorized digital replicas, creating rights and remedies for misuse while providing safe harbors for online service providers.
Sponsor: Rep. Maria Salazar (FL-27)
The NO FAKES Act of 2025 establishes federal protections for an individual's voice and visual likeness, creating a digital replication right that requires consent for the creation and use of digital replicas. This grants individuals and their estates control over their likeness, sets rules for licensing and transfer of these rights, and provides remedies for unauthorized use, while also creating safe harbors for online service providers. The bill balances protecting intellectual property with allowing for certain uses such as news, commentary, and parody. It also preempts state laws, with some exceptions, and applies to all individuals regardless of their date of death.
Congress is stepping into the rapidly evolving world of AI-generated content with the "NO FAKES Act of 2025." This proposed legislation aims to give individuals, both living and deceased, control over computer-generated, realistic copies of their voice or appearance – what the bill calls "digital replicas." Essentially, it establishes a new intellectual property right, meaning someone would need your permission (or your heirs' permission, if applicable) to create and use a highly realistic digital version of you that you didn't actually perform or approve.
The core idea is creating a "digital replication right." This right belongs to the individual during their lifetime and can be licensed out, but not permanently sold or given away. Licenses have limits: up to 10 years for adults, 5 years for minors (with court approval), unless covered by a collective bargaining agreement. After an individual passes away, this right doesn't just disappear. It becomes transferable property, managed by heirs or designated representatives. This post-mortem control lasts for an initial 10 years, potentially renewable up to 70 years after death if the likeness is still being used commercially. Think about the estates of famous actors or musicians – this gives them a legal framework to control how those likenesses are used in new digital forms long after they're gone.
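To make those term limits concrete, here is a minimal, purely illustrative Python sketch that encodes the durations described above. The function names, parameters, and the "renewed_through" bookkeeping are hypothetical simplifications for this example, not anything defined in the bill itself.

```python
# Purely illustrative: a hypothetical helper encoding the license-term limits
# described above (10-year cap for adults, 5-year cap for minors with court
# approval, no statutory cap under a collective bargaining agreement).
def max_license_term_years(is_minor: bool, under_cba: bool) -> int | None:
    """Return the maximum license length in years, or None if uncapped."""
    if under_cba:
        return None              # collective bargaining agreement: no fixed cap
    return 5 if is_minor else 10

# Post-mortem control, as summarized above: an initial 10-year term after
# death, renewable while the likeness stays in commercial use, with a hard
# outer limit of 70 years after death. "renewed_through" is a made-up stand-in
# for whatever renewal records an estate would actually keep.
def post_mortem_right_active(death_year: int, year: int, renewed_through: int) -> bool:
    if year > death_year + 70:      # outer limit: 70 years after death
        return False
    if year <= death_year + 10:     # initial 10-year term
        return True
    return year <= renewed_through  # otherwise the estate must have renewed

print(max_license_term_years(is_minor=False, under_cba=False))     # 10
print(post_mortem_right_active(2030, 2045, renewed_through=2050))  # True
print(post_mortem_right_active(2030, 2105, renewed_through=2110))  # False
```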
So, who's responsible if an unauthorized digital replica pops up online? The bill casts a wide net. Anyone involved in distributing or publicly displaying an unauthorized replica, or even distributing tools designed to make them without permission, could face liability. This includes online platforms hosting user-uploaded content. However, much like copyright law, the bill gives platforms a "safe harbor": they generally won't be liable for user uploads if they have a system for handling takedown notices from right holders and remove infringing content promptly. They need to register an agent (as they do for copyright) to receive these official complaints. File a bogus takedown notice, though, and you could be on the hook for damages ($25,000 per misrepresentation, or actual damages).
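For readers who think in code, the notice-and-takedown flow can be sketched roughly as follows. This is an illustrative model only: every class, field, and message is invented for this example, and the bill, not this code, defines what a valid notice must contain and what counts as prompt removal.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class TakedownNotice:
    right_holder: str
    content_url: str
    received_at: datetime
    good_faith_statement: bool  # a knowingly false statement risks sender liability

@dataclass
class Platform:
    registered_agent: str | None = None          # agent registered to receive notices
    hosted_urls: set[str] = field(default_factory=set)
    removed_urls: set[str] = field(default_factory=set)

    def handle_notice(self, notice: TakedownNotice, now: datetime) -> str:
        if self.registered_agent is None:
            return "no registered agent: safe harbor likely unavailable"
        if not notice.good_faith_statement:
            # Defective or bogus notices expose the sender, not the platform,
            # to damages ($25,000 per misrepresentation, or actual damages).
            return "rejected: defective notice"
        if notice.content_url in self.hosted_urls:
            self.hosted_urls.remove(notice.content_url)
            self.removed_urls.add(notice.content_url)
            return f"removed after {now - notice.received_at} (prompt removal preserves safe harbor)"
        return "content not found"

# Example with made-up data.
platform = Platform(registered_agent="agent@example.com",
                    hosted_urls={"https://example.com/clip123"})
notice = TakedownNotice("Estate of J. Doe", "https://example.com/clip123",
                        datetime(2026, 1, 1, 9, 0), good_faith_statement=True)
print(platform.handle_notice(notice, datetime(2026, 1, 1, 17, 0)))
```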
Not every use of a likeness is restricted. The bill includes specific exemptions for news, public affairs, sports broadcasts, documentaries, historical or biographical works, commentary, criticism, satire, and parody. These exemptions don't apply, however, if the use involves sexually explicit content (as defined in existing law).

If someone violates these new rights, the right holder (or their representative) can sue in federal court within three years of discovering the violation. Potential remedies include court orders to stop the use (injunctions), actual financial damages, or statutory damages ranging from $5,000 to $25,000 per violation, and punitive damages may be available for willful infringement. The bill also allows right holders to obtain court subpoenas to force online services to identify alleged infringers.

Finally, this federal law would largely override existing state laws on the same topic, except for specific state rules already in place by January 2, 2025, concerning sexually explicit or election-related digital fakes. The whole system is slated to kick in 180 days after the bill becomes law.
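Circling back to the remedies above, here is a small, hypothetical Python sketch of the arithmetic: the helper names and the clamping behavior are assumptions made for illustration, since a court, not a formula, sets the actual award within the statutory range.

```python
from datetime import date

# Illustrative arithmetic only: statutory damages of $5,000-$25,000 per
# violation and a three-year window, from discovery, to file suit.
STATUTORY_MIN = 5_000
STATUTORY_MAX = 25_000

def statutory_damages(violations: int, per_violation: int) -> int:
    """Total statutory damages, keeping each award inside the statutory range."""
    per_violation = max(STATUTORY_MIN, min(STATUTORY_MAX, per_violation))
    return violations * per_violation

def within_limitations_period(discovered: date, filed: date) -> bool:
    """True if suit is filed within three years of discovering the violation
    (simplified: ignores leap-day edge cases)."""
    return filed <= date(discovered.year + 3, discovered.month, discovered.day)

print(statutory_damages(violations=3, per_violation=10_000))           # 30000
print(within_limitations_period(date(2026, 6, 1), date(2028, 5, 31)))  # True
```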