PolicyBrief
H.R. 6484
119th Congress · December 5, 2025
Kids Online Safety Act
IN COMMITTEE

The Kids Online Safety Act mandates that covered online platforms implement reasonable policies and safeguards to protect minors from specific online harms, subject to independent audits and FTC enforcement.

Sponsor: Gus Bilirakis (R), Representative, FL-12

LEGISLATION

Online Safety Bill Mandates Default Safety Settings for Minors and Annual Platform Audits, Grants FTC New Powers

The Kids Online Safety Act (KOSA) is essentially a massive rulebook for how social media and similar sites—what the bill calls "Covered Platforms"—have to treat users they know are under 17. If you’re a parent, or just someone who uses a big social media site, this bill is going to change how those platforms operate. The core requirement is that platforms must create, implement, and enforce policies to address specific harms to minors, including threats of physical violence, sexual exploitation, the distribution of narcotic drugs, and financial harm caused by deceptive practices (Sec. 3).

The New Default: Safety First

For any user the platform knows is a minor (under 17), the platform must now apply a set of safeguards, and here’s the kicker: the default setting must be the most protective level of privacy and safety the platform offers (Sec. 4). A 15-year-old who signs up no longer has to wade through menus to lock things down. These default safeguards must limit the ability of strangers to communicate with the minor and restrict "design features that cause compulsive usage": think infinite scrolling, auto-play videos, and constant notifications, all the features designed to keep eyes glued to the screen (Sec. 2). If the platform knows you’re a minor, it has to dial those features back by default. Platforms also have to provide an easy-to-use option for minors to limit their own time on the platform.

Parental Controls Get Serious

If the platform knows the user is a child (under 13), the bill mandates a whole new level of parental control. Parents must be given tools to view and change the child’s privacy and account settings, restrict purchases, and view and limit total time spent on the platform (Sec. 4). For users under 13, these parental tools must be enabled by default. This is a major shift in control, moving the power to protect kids out of the child’s device settings and into the parent’s hands. Critically, the platform must give the minor clear notice when these parental tools are active, so kids always know when they’re being monitored.

The Annual Report Card

To ensure platforms are actually doing what they say, the bill requires an annual, independent, third-party audit (Sec. 6). This isn't just a check-the-box exercise. The auditor has to assess how likely minors are to access the platform and then report specific metrics to the FTC, including the number of U.S.-based minor users, the median time minors spent on the platform, and how many times the new safeguards and parental tools were used. For platforms, this means a massive new compliance cost and a permanent spotlight on their safety efforts. For the rest of us, it means the FTC will start getting real data on how much time kids are actually spending on these sites.

Who’s Policing the Internet?

Enforcement is split between the feds and the states. A violation of the Act is treated as an unfair or deceptive act or practice, which gives the Federal Trade Commission (FTC) the muscle to prosecute violations (Sec. 7). State Attorneys General are also empowered to file civil lawsuits to stop violations and seek damages for their residents. This dual enforcement structure means platforms will face pressure from multiple jurisdictions, which significantly raises the stakes for non-compliance.

The Trade-Off: Federal Preemption

Here’s a detail that might fly under the radar but has massive implications: Section 10 explicitly prohibits states and local governments from creating or enforcing any laws that relate to the provisions of this Act. This is called federal preemption. While the bill sets a new nationwide standard for safety, it also means that if your state was working on a stronger law covering these same online safety provisions, that law could now be preempted. This centralizes regulatory power with the FTC and could limit the ability of states to innovate or respond to local concerns. For users, it means the protections in this federal bill are likely the only ones you'll get on these specific issues.

The Privacy Question

One thing the bill doesn’t require is mandatory age verification for all users. Section 9 clarifies that the law doesn't require a platform to start collecting personal information about a user's age if it doesn't already do so in its normal business operations. However, since the most stringent requirements only kick in when a platform knows a user is a minor, platforms will be heavily incentivized to figure out who is who. This could lead to platforms adopting more aggressive, and potentially privacy-invasive, age estimation or verification methods to manage their legal risk, which could affect all users, not just minors.