This bill establishes the Kids Internet Safety Partnership within the Department of Commerce to identify online risks to minors, develop best practices, and publish guidance for online service providers.
Russell Fry
Representative
SC-7
The Kids Internet Safety Partnership Act establishes a temporary partnership within the Department of Commerce to study online risks and benefits for minors. This group will coordinate with experts and stakeholders to develop best practices for online platforms. The Partnership is tasked with publishing public reports and a playbook to guide providers in implementing safety measures, including age verification and parental tools.
The Kids Internet Safety Partnership Act establishes a five-year, temporary task force within the Department of Commerce, called the Kids Internet Safety Partnership. This isn't a bill that instantly bans a specific app or feature; instead, its main job is to coordinate research and create a blueprint for how online platforms should handle users under 18.
The Partnership’s core mission is to spend the next five years figuring out the good, the bad, and the practical when it comes to minors online. Specifically, Section 2 requires it to coordinate with a wide range of stakeholders, from academic experts and researchers to parents, minors, and even civil liberties experts, to identify the risks and benefits of websites, apps, and services for kids. The group is also tasked with identifying "evidence-based best practices" for different age groups.
Within two years, the Partnership must publish a public "playbook" for online service providers and developers. Think of the playbook as a prospective industry standard for making platforms safer: it will cover everything from better age verification (or estimation) systems to guidance on parental tools and default privacy settings. For the rest of us, this means that if the bill works as intended, platforms will eventually have a clear set of expectations for how to protect kids, rather than just guessing.
One of the most interesting parts of this bill is how it defines what it calls a "Design feature." These are the platform elements specifically engineered to keep users hooked. The bill explicitly lists examples like "infinite scrolling," "auto-play," "notifications," and even "appearance-altering filters." The Partnership’s playbook is required to facilitate the implementation of best practices related to these features, including limitations and opt-outs.
For a parent, this could mean the platform your kid uses offers limits on how long they can scroll or watch videos before the app automatically stops. For developers, it means the days of prioritizing engagement above all else may be numbered, at least where minors are involved. The bill also requires the playbook to address parental tools, making sure parents can easily view metrics on time spent, change privacy settings, or restrict purchases, all based on the definition of "Parental tool" in Section 2.
While the bill doesn't impose fines or mandate specific changes right now, it lays the foundation for future regulation. Online platforms and developers will likely feel pressure to follow the forthcoming "playbook," which could bring compliance costs and software changes. For example, implementing better age assurance systems or overhauling recommendation algorithms to limit personalization for minors (as the bill suggests) takes significant time and money.
Crucially, the Partnership must coordinate with experts in constitutional law and free expression, because any attempt to limit features or require age verification inevitably raises questions about privacy and civil liberties. The bill requires the Partnership to address limitations and opt-outs for "personalized recommendation systems and chatbots," a direct nod to concerns that targeted content can harm minors. By requiring input from both tech companies and civil liberties advocates, the bill attempts to strike a balance between safety and an open internet. However, the broad definition of "Design feature" could become a point of contention, as platforms might argue that features like notifications are necessary, not just manipulative.