Sponsor: Rep. Alexandria Ocasio-Cortez (NY-14)
The DEFIANCE Act of 2025 establishes a federal civil cause of action for victims of the creation, possession with intent to share, or sharing of nonconsensual intimate digital forgeries, often known as deepfakes. This legislation updates existing law to specifically address image-based sexual abuse created through AI or digital manipulation, regardless of whether the image is labeled as fake. Victims can sue for significant statutory damages, attorney's fees, and injunctive relief to stop the spread of the harmful content. The Act ensures that this new federal standard does not preempt or limit stronger existing State or Tribal laws.
The DEFIANCE Act of 2025, officially the Disrupt Explicit Forged Images And Non-Consensual Edits Act, is a major federal response to the explosion of AI-generated intimate images, better known as deepfakes. Simply put, the bill creates a powerful new civil lawsuit option for people whose intimate images, whether real or digitally forged, are created or shared without their consent. It's about giving victims a direct federal tool to fight back against image-based sexual abuse, provided the activity touches interstate commerce, which most online sharing does.
One of the most important things this bill does is update the legal definitions to explicitly cover nonconsensual digital forgeries. Until now, many laws have struggled to keep up with AI that can paste your face onto someone else's body so convincingly that even a reasonable person can't tell it's fake. The Act defines an "intimate digital forgery" as exactly that: an intimate image, created or altered with AI or other software, that a reasonable person would find indistinguishable from an authentic depiction. Crucially, even if the image carries a disclaimer saying it's fake, it still counts if it meets that visual standard. This means defenses like "it was just a joke" or "I labeled it AI" likely won't hold up in court if the image is being used to harm someone.
If you are the victim of a nonconsensual intimate image or deepfake, the DEFIANCE Act allows you to sue the person who knowingly produced it, possessed it with intent to share it, or shared it. And the financial penalties are serious. If you win, you can recover actual damages (like lost wages or therapy costs) plus statutory damages set at a minimum of $150,000. That amount jumps to $250,000 if the conduct was tied to sexual assault, stalking, or harassment against you. The court can also order the defendant to pay your attorney's fees and to delete or destroy the image. This isn't pocket change; it's designed to be a massive deterrent against creating or sharing this harmful content.
Lawsuits involving intimate images can be incredibly traumatic, often forcing victims to relive the abuse in public court filings. The Act addresses this by giving courts tools to protect the victim’s privacy. A judge can allow you to use a pseudonym (a fake name) throughout the case, redact your personal information from public records, or even seal documents. This means you can pursue justice and financial recovery without having your identity or the traumatic images themselves plastered across public court dockets. For busy professionals who rely on their reputation, this privacy protection is huge.
For the average person, this bill provides a necessary shield in the age of generative AI. If you’re a teacher, a small business owner, or anyone who has a public-facing role, the threat of a malicious deepfake ruining your career is real. This law gives you a clear path to hold the perpetrator accountable, backed by federal authority and substantial financial penalties. It also sets a long statute of limitations: you have 10 years from the time you reasonably discover the violation to file a complaint, recognizing that these images can surface years after they were created.
One important detail is that this federal law doesn't replace state or tribal laws; it sets a floor. States are free to pass even stronger protections. Also, the bill clarifies that if you're suing the producer of the forgery, you must prove they knew, or recklessly disregarded, that you didn't consent to its creation and that you would be harmed by it. While the high statutory damages are a strong deterrent, proving that level of knowing or reckless disregard can sometimes be challenging in a digital world where content spreads anonymously and rapidly. However, the overall message is clear: the federal government is finally providing victims with serious legal firepower to fight deepfake image abuse.