Sammy's Law mandates that large social media platforms must provide secure, real-time access to third-party safety software chosen by parents to monitor and manage their children's accounts.
Debbie Wasserman Schultz
Representative
FL-25
Sammy's Law aims to empower parents by requiring large social media platforms to provide secure access for third-party safety software to monitor and manage their children's accounts. This legislation establishes a national standard for this specific data access, while granting enforcement authority to the Federal Trade Commission (FTC). The law mandates strict data handling and deletion requirements for the third-party safety providers who register with the FTC.
This new legislation, dubbed Sammy’s Law, is a direct response to growing concerns about kids’ safety online, specifically on major social media platforms. The bill’s core purpose is clear: empower parents with external tools to monitor and manage their children’s accounts. It does this by forcing large social media companies to open a dedicated, documented access channel—what the bill calls an Application Programming Interface, or API—to approved third-party safety software.
First, let’s define the playing field. A “child” is anyone under 17 with an account. A “large social media platform” isn’t every app out there; it’s only the giants—those with over 100 million global monthly users or annual revenue exceeding $1 billion. Think of the platforms where kids share photos and videos with people they’ve met only through that service. Within 30 days of this law taking effect (which happens only after the FTC issues its guidance), these platforms must create and maintain APIs that allow registered safety software to securely transfer a child’s user data. Crucially, this data transfer must happen at least once every hour, in a computer-readable format (SEC. 4).
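The bill mandates the cadence and machine-readability of these transfers but does not specify a data schema. As a purely illustrative sketch, the structure and names below (`SafetyTransferBatch`, `child_user_id`, `transfer_is_timely`) are assumptions, not anything drawn from the bill text; the only grounded constraint is the at-least-hourly interval:

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta, timezone

@dataclass
class SafetyTransferBatch:
    """Hypothetical shape of one hourly batch; field names are illustrative."""
    child_user_id: str
    generated_at: datetime
    items: list = field(default_factory=list)  # content created or received since the prior batch

# SEC. 4: transfers must occur at least once every hour.
MAX_INTERVAL = timedelta(hours=1)

def transfer_is_timely(previous: datetime, current: datetime) -> bool:
    """True when consecutive batches satisfy the hourly cadence."""
    return current - previous <= MAX_INTERVAL

t0 = datetime(2025, 1, 1, 12, 0, tzinfo=timezone.utc)
print(transfer_is_timely(t0, t0 + timedelta(minutes=45)))  # True: within the hour
print(transfer_is_timely(t0, t0 + timedelta(hours=2)))     # False: cadence violated
```

How platforms actually serialize batches (JSON, protobuf, etc.) would be shaped by the FTC’s guidance, not this sketch.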
For a parent, this means if you install an approved safety app, that app gets near real-time access to the content being created or sent to your child’s account. This allows the safety software to manage settings and flag potential issues like cyberbullying, self-harm indicators, or exposure to illegal content—the kind of real-world dangers Congress explicitly cited (SEC. 2).
While the goal is safety, the mechanism involves significant data access. The third-party safety software providers are subject to strict rules governing this sensitive access. They must register with the FTC, commit to using the data only for child protection, and process and store all user data exclusively on hardware located within the United States. Furthermore, they have a tight 14-day window to delete the data they collect, unless they are sharing it with the parent/guardian or law enforcement under specific exceptions (SEC. 4).
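The retention rule above reduces to a simple test a provider's compliance tooling might run: has a record aged past 14 days without a statutory exception applying? This is a minimal sketch; the function name and the `shared_with_parent_or_le` flag are hypothetical, and real systems would track the specific exception invoked:

```python
from datetime import datetime, timedelta, timezone

# SEC. 4: providers must delete collected data within 14 days,
# absent the parent/guardian or law-enforcement sharing exceptions.
RETENTION_LIMIT = timedelta(days=14)

def must_delete(collected_at: datetime, now: datetime,
                shared_with_parent_or_le: bool = False) -> bool:
    """True when a record exceeds the 14-day window and no exception applies."""
    if shared_with_parent_or_le:
        return False  # record falls under a statutory sharing exception
    return now - collected_at > RETENTION_LIMIT

now = datetime.now(timezone.utc)
print(must_delete(now - timedelta(days=15), now))  # True: past the window
print(must_delete(now - timedelta(days=5), now))   # False: still within 14 days
```

A production deletion pipeline would also need to cover backups and derived data, which the sketch ignores.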
This is a massive undertaking for these safety companies, requiring annual independent security audits to prove compliance. If they mess up, the FTC can suspend or permanently de-register them. For parents, this framework is intended to ensure that the tools they use are secure and accountable. For the child, however, it means any content they create or receive is subject to near-constant, hour-by-hour monitoring by an external entity authorized by their parent.
In a move that centralizes power, Sammy’s Law establishes one national standard, overriding any state or local laws that might try to mandate similar API access for safety software (SEC. 6). This simplifies compliance for platforms but removes the ability of states—like California or New York—to innovate or create stronger local rules regarding this specific type of mandated access. States can still enforce existing laws on fraud, contracts, and general consumer protection, but they are barred from setting their own rules on this API requirement.
Enforcement falls squarely on the FTC. Any violation by a platform or a safety provider will be treated as an unfair or deceptive business practice (SEC. 5). The FTC must issue clear guidance within 180 days of the law’s enactment, and the entire law only takes effect after that guidance is released (SEC. 7). This means the FTC’s initial instructions will determine how smoothly this system rolls out and how effectively it balances child safety with privacy concerns.