The TAKE IT DOWN Act combats the spread of nonconsensual intimate visual depictions, including deepfakes, by establishing criminal penalties for their intentional disclosure and requiring online platforms to implement notice and removal processes.
Maria Salazar
Representative
FL-27
The TAKE IT DOWN Act aims to combat the spread of nonconsensual intimate visual depictions, including deepfakes, online. It establishes criminal penalties for intentionally sharing such images without consent, with increased penalties for depictions involving minors. The Act also requires covered online platforms to create a process through which users can report nonconsensual intimate visual depictions and request their removal, and to remove reported depictions within 48 hours; the Federal Trade Commission enforces these platform requirements. It defines key terms related to consent, digital forgeries, and covered platforms, and includes a severability clause so that the rest of the law remains valid if any part is deemed unenforceable.
The TAKE IT DOWN Act aims to tackle the serious issue of intimate photos and videos, including AI-generated deepfakes, being shared online without permission. It proposes making it a federal crime to knowingly publish these kinds of 'intimate visual depictions'—both real and digitally forged—on websites or apps if the person depicted didn't consent and had a reasonable expectation of privacy. The bill sets out specific penalties, including fines and prison time, with potentially stricter consequences if the victim is a minor.
This bill adds teeth to the fight against what's often called 'revenge porn' and the growing problem of deepfake exploitation. It amends the Communications Act of 1934 to establish clear offenses. If someone shares an actual intimate photo or video of an adult without their consent, intending to cause harm or knowing it is likely to, they could face up to 2 years in prison and fines (Sec. 2). The same goes for sharing digitally forged intimate images of an adult. If the victim is a minor, sharing either real or fake intimate depictions with harmful intent could lead to up to 3 years in prison.
Importantly, the bill clarifies that just because someone consented to an image being taken, or shared it privately before, doesn't mean they consented to it being published publicly online (Sec. 2, Rules of Construction). Even threatening to share such images could carry penalties.
Beyond punishing individuals, the Act puts responsibility on the platforms where this content appears. Within one year of enactment, 'covered platforms' (essentially websites, apps, or online services that host user-generated content or regularly feature nonconsensual intimate images) must set up a clear process for victims to report and request removal (Sec. 3(a)). This generally doesn't include your internet provider or email service (Sec. 4).
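For a service trying to work out whether the rules apply to it, the definition boils down to a simple test. The sketch below is purely illustrative: the function name `is_covered_platform` and the boolean criteria are assumptions made for this example, not language from the bill.

```python
def is_covered_platform(
    hosts_user_generated_content: bool,
    regularly_features_nonconsensual_images: bool,
    is_internet_service_provider: bool,
    is_email_service: bool,
) -> bool:
    """Hypothetical sketch of the 'covered platform' test (Sec. 4).

    A website, app, or online service is covered if it hosts
    user-generated content or regularly features nonconsensual
    intimate images; internet providers and email services are
    generally excluded.
    """
    if is_internet_service_provider or is_email_service:
        return False
    return hosts_user_generated_content or regularly_features_nonconsensual_images
```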
How does it work? A victim (or someone authorized to act on their behalf) submits a request identifying the image, stating a good-faith belief that it's nonconsensual, and providing contact information. Once a valid request is received, the platform has 48 hours to remove the specified image and must make reasonable efforts to remove identical copies (Sec. 3(a)(4)). To encourage compliance, platforms get liability protection for removing material in good faith, even if it's later determined not to fit the bill's definition perfectly (Sec. 3(a)(5)).
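For platform engineers, this process maps naturally onto a small data model: a removal request carries the three elements the bill names (identification of the image, a good-faith statement, contact info), and receipt of a valid request starts a 48-hour clock. The sketch below is a hypothetical illustration under those assumptions; the `RemovalRequest` class and its field names are inventions for this example, not anything defined in the statute.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta, timezone

# Hypothetical model of a removal request under the TAKE IT DOWN Act.
# The bill names the required elements (Sec. 3(a)); the class, field
# names, and deadline logic here are illustrative assumptions.

@dataclass
class RemovalRequest:
    image_url: str              # identifies the specific depiction
    good_faith_statement: str   # requester's good-faith belief it's nonconsensual
    contact_info: str           # how the platform can reach the requester
    received_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )

    def is_valid(self) -> bool:
        """A valid request includes all three required elements."""
        return all([self.image_url, self.good_faith_statement, self.contact_info])

    def removal_deadline(self) -> datetime:
        """The 48-hour removal clock starts when a valid request is received."""
        return self.received_at + timedelta(hours=48)


# Example: a moderation queue checking when a reported image must be gone.
req = RemovalRequest(
    image_url="https://example.com/image/123",
    good_faith_statement="I am the person depicted and did not consent.",
    contact_info="victim@example.com",
)
if req.is_valid():
    print("Remove by:", req.removal_deadline().isoformat())
```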
The Federal Trade Commission (FTC) gets the job of enforcing these platform rules (Sec. 3(b)). If a platform fails to create the process or remove content as required, the FTC can treat it as an 'unfair or deceptive practice,' similar to false advertising or other consumer protection violations. This means the FTC can investigate and potentially issue penalties, using its existing powers under the FTC Act.
In practical terms, if you find a nonconsensual intimate photo or deepfake of yourself online, this law would give you a specific federal pathway to demand that platforms hosting user content remove it within two days. It also exposes the person who posted it to federal criminal charges. While the 48-hour window aims for speed, effectiveness will depend on platforms building robust reporting systems and on the FTC actively enforcing the rules. Definitions like 'reasonable expectation of privacy' may also face real-world tests in court.