The TAKE IT DOWN Act criminalizes the intentional sharing of nonconsensual intimate images and deepfakes while requiring online platforms to establish clear procedures for their rapid removal upon request.
Ted Cruz
Senator
TX
The TAKE IT DOWN Act establishes new federal crimes for the intentional sharing of nonconsensual intimate images, including realistic AI-generated deepfakes, with enhanced penalties for content involving minors. It mandates that online platforms create clear reporting systems and remove reported nonconsensual intimate content within 48 hours. The Federal Trade Commission (FTC) is empowered to enforce these notice and removal requirements against covered platforms.
| Party | Total Votes | Yes | No | Did Not Vote |
|---|---|---|---|---|
| Republican | 220 | 207 | 2 | 11 |
| Democrat | 213 | 202 | 0 | 11 |
The new TAKE IT DOWN Act (short for "Tools to Address Known Exploitation by Immobilizing Technological Deepfakes On Websites and Networks") creates serious federal crimes for knowingly sharing nonconsensual intimate images, whether they are real photos or realistic digital forgeries (deepfakes). Crucially, the bill also requires online platforms to set up a reporting system and pull down reported content within 48 hours. This legislation is a significant response to the rise of deepfakes and the persistent problem of so-called 'revenge porn,' establishing a clear legal path for victims to fight back.
Section 2 of this Act updates federal law to make the nonconsensual sharing of intimate visual depictions across state or international lines a crime. The penalties are harsh, including potential prison time and mandatory forfeiture of any property gained from the crime, plus restitution to the victim. For adults, sharing a real intimate image is illegal if the poster intended to cause harm or the sharing actually causes harm (like reputational damage), provided the image wasn't voluntarily exposed in public and isn't a matter of public concern. In practice, if you share an ex-partner's private photo intending to ruin their career, you're now looking at federal time. For minors, the rules are stricter: posting any intimate image (real or fake) is illegal if done with the intent to abuse or humiliate, or for sexual gratification.
Critically, the law spells out a strict definition of consent: agreeing to have a picture taken, or even sharing it with one person, does not equal consent to publish it widely online. This closes a loophole often exploited by offenders. Furthermore, the bill directly tackles deepfakes, making it illegal to knowingly post a realistic digital forgery of an adult without consent if it causes or intends to cause harm. This is a huge deal for anyone whose image could be weaponized by AI technology, offering a specific legal shield against this emerging threat.
Section 3 puts a massive compliance burden on what it calls “covered platforms”—essentially any public website or app that primarily hosts user-generated content or regularly profits from sharing nonconsensual intimate visuals. Within one year, these platforms must implement a clear, plain-language system for identifiable individuals to report and request the removal of their nonconsensual intimate images. Once a valid request is received, the platform must remove the content within a tight 48-hour window and make a "reasonable effort" to scrub identical copies from their service.
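To make those compliance mechanics concrete, here is a minimal sketch of what a platform's takedown pipeline might have to track. Everything in it is hypothetical: the names (`TakedownRequest`, `ContentStore`, `remove_with_duplicates`) are invented for illustration, the statute does not prescribe any particular implementation, and exact-hash matching is only one simple reading of "reasonable effort."

```python
# Illustrative sketch only: a hypothetical model of the Act's notice-and-removal
# workflow. Class and function names are invented for this example and are not
# part of the statute or any real platform's API.
from dataclasses import dataclass, field
from datetime import datetime, timedelta, timezone
import hashlib

REMOVAL_WINDOW = timedelta(hours=48)  # statutory removal deadline

@dataclass
class TakedownRequest:
    content_id: str
    received_at: datetime

    @property
    def deadline(self) -> datetime:
        # Reported content must come down within 48 hours of a valid request.
        return self.received_at + REMOVAL_WINDOW

@dataclass
class ContentStore:
    # Maps content_id -> raw bytes; a stand-in for real object storage.
    items: dict = field(default_factory=dict)

    def remove_with_duplicates(self, content_id: str) -> list[str]:
        """Remove the reported item, then make a 'reasonable effort' to
        scrub identical copies by matching exact content hashes."""
        target = self.items.pop(content_id, None)
        if target is None:
            return []
        fingerprint = hashlib.sha256(target).hexdigest()
        duplicates = [
            cid for cid, blob in self.items.items()
            if hashlib.sha256(blob).hexdigest() == fingerprint
        ]
        for cid in duplicates:
            del self.items[cid]
        return [content_id, *duplicates]

# Example: a request received now must be resolved within 48 hours.
request = TakedownRequest("img-123", datetime.now(timezone.utc))
print("Remove by:", request.deadline.isoformat())
```

Note that an exact SHA-256 match only catches byte-identical copies; a re-encoded or cropped repost would slip through, which is why real platforms would more plausibly reach for perceptual hashing when interpreting "identical copies."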
This rapid response time is great for victims, who often see their images spread exponentially in hours. However, it creates a tight squeeze for platforms, which now have a strong incentive to remove content quickly to avoid liability. The bill grants platforms liability protection if they remove content in good faith, even if it later turns out the image wasn't actually nonconsensual. This protection could lead to platforms over-censoring or removing borderline content just to be safe, potentially affecting content that might fall under vague exceptions like 'matter of public concern.'
If a covered platform fails to follow these notice and removal procedures, the Federal Trade Commission (FTC) is empowered to step in. The FTC can treat non-compliance as an unfair or deceptive business practice, and the Act extends this authority even to entities the FTC doesn't normally regulate, such as non-profit organizations. This grants the FTC a serious new role in online content moderation.
For everyday users, the impact is twofold. On the one hand, this bill offers powerful protection against a devastating form of online abuse, providing a clear path to get content taken down fast. On the other hand, the combination of vague terms (like 'matter of public concern') and the strong incentive for platforms to over-remove content could potentially lead to content being taken down based on ambiguous claims, impacting online expression. This law represents a major step forward for digital privacy, but its implementation will hinge on how platforms interpret those tight deadlines and vague exceptions.