Sponsor: Rep. Maria Salazar (FL-27)
The TAKE IT DOWN Act establishes federal criminal penalties for the intentional sharing of both real and digitally forged nonconsensual intimate images, with stricter penalties for images involving minors. It mandates that covered online platforms create a clear process for victims to request the removal of such content within 48 hours. Failure by platforms to reasonably comply with these removal requests will be treated as an unfair or deceptive practice enforced by the Federal Trade Commission (FTC).
The newly proposed TAKE IT DOWN Act—short for the Tools to Address Known Exploitation by Immobilizing Technological Deepfakes On Websites and Networks Act—is a major federal push to combat the sharing of nonconsensual intimate images, often called “revenge porn,” and, critically, those created using AI, known as deepfakes.
This bill doesn't just expand existing law; it creates new federal crimes. If you knowingly post an intimate image—whether a real photo or a digitally forged one—of an adult without their consent, and you do it with the intent to cause harm (psychological, financial, or reputational), you could face fines and up to two years in federal prison. If the image involves a minor, the maximum rises to three years. This is a game-changer because it directly addresses realistic AI-generated images, which previously sat in a murky legal area. The bill is also clear on a key point: consenting to a photo being taken is not the same as consenting to its publication.
For most people, the biggest impact is the new pressure this puts on major online services, or “covered platforms”—think social media sites, forums, and any service that primarily hosts user-generated content. Under Section 3, these platforms must establish a clear, easy-to-use system for victims to report nonconsensual intimate images of themselves. Once a platform receives a valid takedown request, it faces a hard deadline: the image must come down within 48 hours. The platform must also make reasonable efforts to remove any identical copies of that image it is hosting.
This is a huge win for victims, who often face a slow, frustrating process trying to get harmful content removed. The law gives them leverage and a concrete timeline. For the platforms, Section 3 provides a safety net: if they remove content in good faith, genuinely believing it was nonconsensual, they are generally protected from lawsuits, even if the content wasn't actually illegal. This provision is designed to encourage swift action, but it carries a tradeoff: platforms may be incentivized to remove first and ask questions later, risking the wrongful takedown of legitimate content.
If a covered platform fails to follow these notice and takedown rules reasonably, the Federal Trade Commission (FTC) steps in. The bill states that such a failure will be treated as an unfair or deceptive business practice, giving the FTC the power to enforce compliance. This is important because it means the FTC can hold platforms accountable for poor or slow response times. Interestingly, the definition of a “covered platform” specifically excludes basic internet service providers, email services, and sites that primarily feature pre-selected content, unless those sites regularly host nonconsensual images. So, your ISP or Netflix isn't sweating this, but sites where users post their own stuff definitely are.
For the average person, this bill offers significant protection against a growing digital threat. If you are a victim of image-based abuse, whether from a former partner or a malicious actor using AI to create a fake image of you, you now have a direct, federally mandated path for getting that content taken down quickly. The penalties for those who threaten to post these images are also clearly defined, carrying their own prison terms (up to 30 months for threats involving minors).
In short, the TAKE IT DOWN Act draws a clear line in the sand: your privacy over intimate images, real or fake, is protected by federal criminal law. It forces the digital world to respond quickly to this type of abuse, giving victims a powerful tool to reclaim their digital space.