PolicyBrief
H.R. 6257
119th Congress · Nov 21, 2025
Safe Messaging for Kids Act of 2025
IN COMMITTEE

This Act establishes federal requirements for social media platforms to implement parental controls for minors' direct messaging, prohibits ephemeral messaging for minors, and mandates app store warnings, all while preserving end-to-end encryption.

Sponsor: Rep. Neal Dunn (R-FL-2)


Social Media Bill Mandates Parental Control Over Teen DMs, Bans Ephemeral Messaging for Users Under 17

The newly proposed Safe Messaging for Kids Act of 2025 (SMK Act) would drastically reshape how minors, defined here as anyone under 17, use social media direct messaging (DM) features. Essentially, the bill would force social media platforms to hand over significant control of a minor's private conversations to their parents, while simultaneously pulling the plug on disappearing messages for that age group.

The End of Disappearing Messages for Teens

Let’s start with the feature that’s getting axed: ephemeral messaging. Section 3 of the SMK Act flat-out prohibits social media platforms from offering any messaging feature that automatically deletes a message or makes it inaccessible after a set time or number of views. If a platform knows or should know a user is under 17, disappearing messages are off the table. For platforms, this means reworking features like “view once” photos or timed chats for their younger audience. For a 16-year-old, it means nothing they send in a direct message vanishes automatically anymore, which significantly changes the dynamic of online communication.
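To make that gating concrete, here is a minimal sketch of how a platform might implement Section 3's "knows or should know" standard. Everything here, the `User` fields, the inference signal, the function names, is a hypothetical illustration; the bill prescribes the outcome, not the code.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class User:
    user_id: str
    declared_age: Optional[int]   # self-reported at signup, may be absent
    inferred_minor: bool          # platform's own age-inference signal

def is_covered_minor(user: User) -> bool:
    """Section 3 scope: the platform 'knows or should know' the user is under 17."""
    if user.declared_age is not None and user.declared_age < 17:
        return True               # actual knowledge
    return user.inferred_minor    # constructive ("should know") knowledge

def ephemeral_messaging_allowed(user: User) -> bool:
    """No auto-deleting or view-once messages for covered minors."""
    return not is_covered_minor(user)
```

Note the second prong: on this reading, a self-reported adult birthdate is not a safe harbor if the platform's own signals point the other way.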

Mandatory Parental Controls: What Parents Can Do

The core of the bill, outlined in Section 4, requires platforms to build robust, easily accessible Parental Direct Messaging Controls for minors. These aren't optional features; they must be provided if the platform offers DMs. Once a parent provides verifiable parental consent (the same standard used in COPPA), they gain a few key powers.

First, a parent can receive a timely notification when an unapproved contact tries to message their child, allowing them to approve or deny the request before any conversation starts. Think of it like a digital gatekeeper for their kid’s inbox. They can also manage a full list of approved contacts and, if they choose, disable all direct messaging features on the child’s profile entirely. Crucially, for users under 13, all direct messaging must be disabled by default, requiring a parent to actively enable it.
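Here is a rough sketch of that flow, under the same caveat: `MinorMessagingSettings`, `notify_parent`, and the rest are invented names, and the bill only mandates the default-off rule for under-13s (this sketch assumes 13-to-16-year-olds default to on, with the parent able to turn DMs off).

```python
from dataclasses import dataclass, field

@dataclass
class MinorMessagingSettings:
    age: int
    dm_enabled: bool = False                 # parent may disable DMs entirely
    approved_contacts: set[str] = field(default_factory=set)
    pending_requests: set[str] = field(default_factory=set)

    def __post_init__(self):
        # Default-off is required only under 13; 13+ defaulting to on is an
        # assumption of this sketch, not something the bill specifies.
        if self.age >= 13:
            self.dm_enabled = True

def notify_parent(sender_id: str) -> None:
    # Stand-in for the bill's "timely notification" to the parent.
    print(f"Parent alert: new contact request from {sender_id}")

def handle_incoming_dm(settings: MinorMessagingSettings, sender_id: str) -> str:
    if not settings.dm_enabled:
        return "blocked: direct messaging disabled"
    if sender_id in settings.approved_contacts:
        return "delivered"
    settings.pending_requests.add(sender_id)  # hold until the parent decides
    notify_parent(sender_id)
    return "held: awaiting parental approval"

def parent_decision(settings: MinorMessagingSettings,
                    sender_id: str, approve: bool) -> None:
    settings.pending_requests.discard(sender_id)
    if approve:
        settings.approved_contacts.add(sender_id)
```

A quick run shows the under-13 default: `handle_incoming_dm(MinorMessagingSettings(age=12), "stranger42")` returns "blocked: direct messaging disabled" until a parent flips `dm_enabled` on.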

This is a huge change for platforms, which must now integrate these controls without degrading any other features of the app—meaning the child still gets the full social media experience, just with tightly managed DMs. It also puts the onus on platforms to prevent the minor from easily circumventing these controls, which is a technical challenge that could lead to more age verification requirements down the line.

The Encryption Question and the Preemption Clause

Here’s where the bill gets interesting for privacy advocates and tech companies. Section 7 includes strong language protecting encryption. The Act cannot be interpreted to require platforms to decrypt communications, prevent the use of strong encryption (like end-to-end encryption), or design features that weaken security. The bill clearly states that parental controls must be implemented “to the maximum extent technically feasible” without compromising the integrity of strong encryption. This is a vital provision that aims to ensure child safety measures don't accidentally create a backdoor for surveillance.
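One way to read "maximum extent technically feasible" is that controls like the contact gate above operate on routing metadata (who is messaging whom), which the platform can see even under end-to-end encryption, while the message body stays opaque. A tiny illustration of that separation, again with invented names:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class EncryptedMessage:
    sender_id: str     # envelope metadata the platform can see and gate on
    ciphertext: bytes  # E2EE payload; the platform holds no key to read it

def may_route(msg: EncryptedMessage, approved_contacts: set[str]) -> bool:
    """Contact gating inspects only the envelope, never the ciphertext."""
    return msg.sender_id in approved_contacts
```

Gating delivery at the envelope, rather than scanning content, is the kind of design Section 7 seems to leave room for.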

However, Section 8 drops a significant regulatory bomb: federal preemption. The Act voids and supersedes all existing state and local laws, rules, or regulations covering the same subject matter. If your state or city already had stricter rules on minors' direct messaging, those rules would be wiped out. And while the FTC and state Attorneys General can enforce the new federal standard, states would lose the ability to create new, localized protections or tailor existing ones.

Rollout and Enforcement

Platforms have a phased timeline for compliance. They get one year from enactment to implement the complex parental control features (Section 4), and 18 months to comply with the app store warning requirements (Section 5). Enforcement falls to the FTC, which can treat violations as unfair or deceptive acts, leading to significant penalties. State Attorneys General also gain the power to file civil lawsuits to stop violations and seek damages for their residents, provided they notify the FTC first.