Policy Brief
S. 4159
119th Congress · March 20, 2026
Sammy’s Law
IN COMMITTEE

Sammy’s Law requires large social media platforms to provide parents and guardians with tools and API access to third-party safety software to monitor and manage their children's online accounts.

Sponsor: Sen. Jon Husted (R-OH)

LEGISLATION

Sammy’s Law Mandates Parental Access to Kids' Private Messages on Major Social Platforms

Large social media companies with over 100 million users or $1 billion in revenue would be required to hand over the keys to children's digital lives. Under a new proposal called Sammy's Law, platforms must provide parents and guardians with free tools to monitor any user under age 17. This isn't just a basic 'screen time' limit: the bill specifically mandates that parents be able to read all private messages, view every photo or video sent or received, and even change account privacy settings remotely. These features must be available within 180 days of the law taking effect, fundamentally shifting the boundary between a teenager's private digital space and parental oversight.

The Digital Skeleton Key

To make this happen, the bill requires tech giants to build specialized 'APIs' (basically digital bridges) that allow third-party safety software to plug directly into a child's account. If you're a parent using one of these apps, the platform must sync your child's data at least once every hour. This means if a 16-year-old is messaging a friend or posting a video, that content is funneled to the monitoring tool in near real time. While the goal is to catch red flags like cyberbullying or exploitation, the bill is explicit: parents get access to all multimedia and messages (Section 1), creating a level of transparency that leaves zero room for digital privacy for minors.
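For readers curious what the hourly-sync mandate means in practice, it can be pictured as a simple polling loop on the safety-software side. The sketch below is purely illustrative: no such platform API exists yet, and every function, endpoint, and field name here is an assumption for the sake of the example, not anything specified in the bill.

```python
from datetime import datetime, timedelta

# Illustrative sketch only: a hypothetical third-party safety tool polling a
# platform's (not-yet-existing) parental-access API. All names are invented.

SYNC_INTERVAL = timedelta(hours=1)  # the bill requires syncing at least hourly


def fetch_updates(account_id, since):
    """Stand-in for an authenticated platform API call returning new
    messages and media for a minor's account since the last sync."""
    # A real client would make an HTTPS request here; this returns dummy data.
    return [{"account": account_id, "type": "message", "since": str(since)}]


def detect_harm(item):
    """Stand-in for the provider's harm-detection logic (cyberbullying,
    exploitation, and other categories the bill names)."""
    return "bullying" in str(item).lower()


def sync_once(account_id, last_sync):
    """One sync pass: fetch everything new, flag anything harmful."""
    alerts = [item for item in fetch_updates(account_id, last_sync)
              if detect_harm(item)]
    # Alerts would trigger a notification to the parent's device.
    return datetime.now(), alerts


# A single pass for a hypothetical minor's account:
now, alerts = sync_once("child-account-123", datetime.now() - SYNC_INTERVAL)
```

In a real deployment this loop would run continuously on the provider's servers, which is exactly why the bill's registration and annual-audit requirements for those providers matter: the polling service sees everything the parent can see.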

Outsourcing Oversight

Because most parents don't want to spend all day refreshing a portal, the bill leans heavily on third-party 'safety software providers.' These companies must register with the FTC and undergo annual audits to prove they aren't selling your kid's data or working for 'covered nations' (foreign adversaries). For a family managing a busy schedule, this might look like an app that pings your phone only when it detects 'harm,' which the bill defines as anything from drug talk to signs of depression. However, this creates a massive new honeypot of sensitive data. If one of these third-party providers gets hacked, the 'intimate' data of millions of kids, including messages they thought were private, could be exposed.

One Rule to Rule Them All

Perhaps the most significant 'fine print' is the national standard clause in Section 5. This provision stops states or cities from passing their own, potentially tougher, social media safety laws. If a state like California or Florida wanted to require even stricter data protections or different safety features, they’d be blocked by this federal ceiling. While this makes life easier for the tech companies—who only have to follow one set of rules—it means local voters lose the ability to pressure their state reps for faster or more specific changes to how their kids are protected online.