This bill requires online platforms to give minors clear notice about personalized recommendation systems and to default them to an input-transparent algorithm instead of personalized recommendations.
Katherine "Kat" Cammack
Representative
FL-3
The Algorithmic Transparency and Choice Act requires online platforms to provide minors with clear notice about their use of personalized recommendation systems. It mandates that platforms default minors to an input-transparent algorithm, which does not use their inferred personal data for content selection. Furthermore, the bill requires platforms to offer minors easy options to understand and modify how content is recommended to them. The Federal Trade Commission is tasked with enforcing these new transparency and choice requirements.
If you’ve ever wondered what exactly platforms like TikTok or Instagram are showing your kid—and why—this bill tries to pull back the curtain. The Algorithmic Transparency and Choice Act aims to change how covered online platforms use their personalized recommendation systems (the algorithms) when serving content to users under 18, officially defined as minors.
Starting one year after this bill becomes law, platforms that use personalized recommendation systems must make some serious changes for their younger users. The core change is a mandated default setting: minors must be automatically set up with an “input-transparent algorithm.” What does that mean in plain English? It’s a content feed that doesn’t use inferred, user-specific data—like your browsing history, device usage, or what the platform thinks you like based on past behavior—to choose what you see. It only uses data the minor expressly provides, like search terms, saved preferences, or current location. Think of it as a feed that’s less about prediction and more about your direct input.
Platforms must also provide clear, conspicuous notice when a minor first interacts with a personalized system, and they have to detail exactly how their recommendation systems work in the terms and conditions. This includes describing the system’s features, what user data it collects, and what the system is designed to optimize (like watch time or engagement). Crucially, the bill requires platforms to give minors an easy option to switch between the personalized system and the new input-transparent default, as well as the ability to limit specific types of recommendations.
This bill is a clear win for parental peace of mind and data privacy for minors, but there are two significant caveats baked into the text. First, the bill explicitly states that platforms do not have to disclose information that is considered a trade secret or confidential business information. Since the algorithm itself is often the most valuable intellectual property a platform owns, this exemption could create a massive loophole. Platforms might argue that the most meaningful details about how the algorithm works are protected, potentially diluting the transparency requirements for everyday users.
Second, the bill includes a preemption clause stating that no state or locality can establish or enforce a law covering the same requirements. This means that if a state or city wanted to pass a stronger, more protective law regarding children's data or algorithmic transparency, they would be blocked by this federal legislation. While federal consistency is good for platforms, it takes away the ability of local governments to respond to unique concerns or move faster on protecting their residents.
Enforcement of these new rules falls to the Federal Trade Commission (FTC), which treats any violation as an unfair or deceptive act or practice. For parents and minors, this means there’s a federal agency tasked with ensuring platforms actually comply with the default settings and transparency requirements. For the platforms, it means major compliance costs and a complete overhaul of how they onboard and serve their youngest users. The effectiveness of this law will ultimately depend on the FTC’s ability to enforce complex technical requirements while platforms try to navigate the line between genuine transparency and protecting their proprietary algorithms.