The TERMS Act mandates that online service providers clearly disclose their account restriction policies beforehand and provide advance notice before suspending or terminating user accounts, subject to FTC enforcement.
Sponsor: Sen. Ted Cruz (TX)
The TERMS Act mandates that online service providers clearly disclose their rules and processes for account termination or suspension *before* users sign up. This legislation requires providers to give advance written notice of most account restrictions, detailing the violation and any appeal options. Furthermore, providers must publish annual transparency reports on the enforcement actions they take against users. The Federal Trade Commission (FTC) is tasked with enforcing these new standards.
The Transparency in Enforcement, Restricting, and Monitoring of Services Act, or the TERMS Act, is a direct response to the frustration users feel when their online accounts—think social media, cloud storage, or even that niche forum—get suspended or terminated without warning or explanation. This bill aims to bring some much-needed daylight and due process to the murky world of online content moderation and account restriction.
What the TERMS Act does is simple but powerful: it forces online service providers to spell out, before you ever sign up, the rules that can get you kicked off the platform. If you’ve ever scrolled past the Terms of Service, this bill makes those terms matter. Specifically, Section 4 requires providers to publicly post an "acceptable use policy" that details exactly which actions can get you restricted, how they enforce those rules (including naming any third-party contractors they use), and whether you have any right to appeal the decision.
This is huge for anyone running a small business on Instagram or relying on a cloud service for work. No more guessing games about what constitutes a violation. For example, the policy must explicitly state whether things you do off their platform—like a controversial post on a competitor's site—can be used as grounds to restrict your account with them. If you’re a photographer relying on a specific hosting site, you now get to see the risks upfront and choose your provider accordingly.
Perhaps the most significant change for the average user is the introduction of a mandatory warning period. Under Section 5, if a provider decides to restrict your account for violating their rules, they generally must give you a written warning at least seven days before the restriction takes effect. That notice must explain the specific behavior that caused the issue and exactly how that behavior violated the acceptable use policy. Crucially, it must also outline the appeals process, if one exists.
Think of this as your digital Miranda warning. That seven-day window gives you a chance to appeal, fix the issue, or, at the very least, download your data before you lose access. This advance notice requirement applies to any online service provider—defined broadly as any company that requires a unique account or profile and operates across state lines. However, the provider can bypass this seven-day notice if they need to act immediately due to a court order, federal law, or an "imminent risk of death, serious physical injury, or a serious health risk." While those exceptions make sense, the "imminent risk" clause is broad enough that the FTC will need to keep a close eye on how platforms use it to ensure it doesn't become a loophole.
To keep providers honest, the TERMS Act (Section 6) mandates annual transparency reports. No later than one year after the law passes, every service provider must publish a public, machine-readable report detailing all enforcement actions taken. This report must break down how many accounts were permanently banned, temporarily suspended, or limited, categorized by the specific rule broken and who reported the violation (users, automated systems, government agencies, etc.).
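The bill, at least as summarized here, mandates machine-readability without prescribing a specific schema, so what follows is only an illustrative sketch of what a compliant report could look like. Every type and field name (`TransparencyReport`, `EnforcementEntry`, and so on) is hypothetical, not language from the Act:

```typescript
// Hypothetical shape for a TERMS Act transparency report.
// The Act requires a public, machine-readable report broken down by
// rule violated and report source; the field names below are illustrative.

type RestrictionType = "permanent_ban" | "temporary_suspension" | "feature_limit";
type ReportSource = "user_report" | "automated_system" | "government_agency" | "internal_review";

interface EnforcementEntry {
  policyProvision: string;      // the specific acceptable-use rule invoked
  restrictionType: RestrictionType;
  reportedBy: ReportSource;     // who flagged the violation
  actionCount: number;          // accounts restricted under this category
  reversedOnAppeal: number;     // decisions later overturned on appeal
}

interface TransparencyReport {
  provider: string;
  reportingYear: number;
  entries: EnforcementEntry[];
}

// Example: 1,200 temporary suspensions under a spam rule, flagged by
// automated systems, 85 of which were reversed on appeal.
const sample: TransparencyReport = {
  provider: "example-service.com",
  reportingYear: 2025,
  entries: [
    {
      policyProvision: "spam",
      restrictionType: "temporary_suspension",
      reportedBy: "automated_system",
      actionCount: 1200,
      reversedOnAppeal: 85,
    },
  ],
};
```

Whatever format providers actually adopt, the point of the machine-readable requirement is that data like this can be aggregated and compared across platforms rather than buried in PDF press releases.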
This reporting requirement is a massive win for public accountability. It means researchers, journalists, and even competing businesses can finally see the statistics behind platform moderation. If a platform claims to be cracking down on a specific type of abuse, the numbers will now show if those actions are actually being taken, and how often those decisions are later reversed on appeal. It’s like getting to look under the hood of the moderation engine.
Enforcement of all these new rules falls to the Federal Trade Commission (FTC) under Section 7. Violating the TERMS Act will be treated the same way as an unfair or deceptive business practice under the FTC Act. Interestingly, the bill explicitly brings nonprofit organizations that offer online services under the FTC’s enforcement umbrella for this law, overriding some standard limitations. For online service providers, this means new compliance costs and reporting burdens, but for users, it means there’s a powerful federal agency tasked with ensuring these transparency rules are actually followed.