The DEFIANCE Act of 2025 establishes a federal civil cause of action against the non-consensual production, possession, or sharing of intimate digital forgeries, allowing victims to seek substantial damages and injunctive relief.
Senator Richard Durbin (IL)
The DEFIANCE Act of 2025 establishes a federal civil right of action against the non-consensual creation, possession with intent to share, or sharing of sexually explicit "intimate digital forgeries," such as deepfakes. The legislation defines these forgeries and allows victims to sue producers or sharers for significant liquidated damages, attorney's fees, and injunctive relief. The Act aims to give victims a clear legal remedy against image-based sexual abuse facilitated by modern technology while preserving existing state laws.
The DEFIANCE Act of 2025 (short for the Disrupt Explicit Forged Images And Non-Consensual Edits Act) is a major federal move against the explosion of AI-generated intimate images, often called "deepfakes." The bill creates a powerful new federal civil cause of action that allows victims to sue anyone who knowingly creates, possesses with the intent to share, or actually shares these non-consensual fake intimate images. Essentially, if someone uses AI or other software to put your face or likeness onto a sexually explicit image without your permission, you now have a direct path to federal court to seek serious damages.
Congress is pretty clear about why this is necessary: creating realistic fake images is now incredibly easy, and the harm caused by sharing them without consent is devastating. The bill introduces the term "intimate digital forgery," defined as a fake intimate depiction of an identifiable person, created using software or AI, that a reasonable person would find indistinguishable from an authentic image. It specifically states that a label or disclaimer identifying the image as fake makes no difference; if it looks real, it counts. This is a crucial detail because it cuts through the argument that simply labeling a deepfake as "fake" removes the harm. The core of the bill expands existing federal law to cover the production and possession of these forgeries, not just the sharing, provided the conduct affects interstate commerce (a broad standard that covers most internet activity).
For victims, the biggest change is the financial leverage this bill provides. If you win a lawsuit under the DEFIANCE Act, you can recover litigation costs, attorney's fees, and damages. The damages are substantial: you can elect either actual damages (like lost wages or therapy costs) or liquidated damages of $150,000. That figure jumps to $250,000 if the conduct was linked to actual or attempted sexual assault, stalking, or harassment. This is significant because victims don't have to prove massive financial loss to hold the perpetrator accountable; the law acknowledges the inherent damage done.
For the person creating or sharing the deepfake, the exposure is steep. Someone using AI tools to create and distribute non-consensual intimate images now faces a potential quarter-million-dollar judgment, plus the victim's legal bills. The court can also issue injunctions forcing the defendant to delete or destroy the forgery, which gives victims a crucial tool to stop the spread.
The bill also recognizes the inherent privacy risk in bringing these lawsuits and includes specific provisions to protect the victim during the legal process. For instance, the court can allow the victim to proceed under a pseudonym (a fake name) in public filings, redact personal information, or even seal documents. The court can also issue protective orders ensuring that the intimate image or forgery stays in the court's custody during discovery. This is smart because it prevents the legal process itself from becoming a secondary source of trauma or further disclosure. Finally, victims get a long window to sue: up to 10 years after they reasonably discover the violation, or after they turn 18, whichever is later. This is a practical recognition that victims often find these images long after they are first created.