Policy Brief
S. 3292 · 119th Congress · December 1, 2025
Platform Accountability and Transparency Act
Status: In Committee

This Act establishes a framework for independent researchers to study large online platforms, mandating data transparency from platforms while providing legal protections for both researchers and platforms.

Sponsor: Sen. Christopher Coons (D-DE)

FTC to Mandate Algorithm Transparency and Data Sharing for Platforms with Over 50 Million Users

The Platform Accountability and Transparency Act is essentially a mandate for the largest social media and content platforms—think the ones with over 50 million monthly U.S. users—to open up their data and their algorithmic black boxes to outside scrutiny. This isn't just about privacy; it’s about making sure we know how these digital giants are shaping public discussion and who’s benefiting from their systems. It breaks down into two major parts: creating a formal pipeline for academic research and forcing platforms to be publicly transparent about their operations.

The VIP Pass for Researchers: Data Access, Managed by the Feds

This bill sets up a new system run jointly by the National Science Foundation (NSF) and the Federal Trade Commission (FTC) to allow “qualified researchers”—academics from U.S. universities or non-profits—to access platform data (Sec. 3). If you’re a qualified researcher with an approved project, the platform is required to hand over the “qualified data and information” needed for your study. This data is strictly defined; it excludes private messages, biometric data, and precise location information, focusing instead on data necessary to understand platform activity.

For platforms, this means a significant new obligation. They can appeal a research request, but only on narrow grounds, like a major security vulnerability or undue burden (Sec. 3). For users, the FTC must issue rules requiring platforms to inform you about these data-sharing practices (Sec. 4). The good news is that platforms get legal protection (a “safe harbor”) from lawsuits if they share data under this approved system, provided they follow the rules (Sec. 4). The catch for researchers is that the FTC/NSF decision to approve or deny a project is not subject to judicial review (Sec. 3). That’s a lot of power concentrated in the agencies, and it means if a researcher feels they were unfairly denied, they have no recourse in the courts.

The Journalist’s Shield: Protecting Public Interest Snooping

Section 8 is a game-changer for journalists and public interest researchers who investigate platforms. It creates a legal safe harbor that protects them from being sued by a platform for violating its Terms of Service when they collect publicly available information using automated tools or temporary research accounts. If you’re a journalist or researcher investigating something “of public concern”—like how misinformation spreads or how political ads are targeted—and you take “reasonable measures to protect individual privacy,” the platform can’t sue you for collecting that data.

This provision is crucial because platforms often use their Terms of Service as a weapon to shut down independent scrutiny. The bill declares, as a matter of federal law, that collecting public data for public interest research does not count as “accessing the platform without authorization” (Sec. 8). However, there’s a specific carve-out: you can’t use this safe harbor to collect data for training a large language model. This section essentially protects the digital muckrakers, ensuring they can investigate what’s happening on major platforms without fear of a lawsuit.

Shedding Light on the Algorithm

The most visible change for the public comes from the massive new transparency requirements the FTC must establish within one year (Sec. 9). This is where the platforms have to start showing their homework.

1. Content and Account Transparency: Platforms must create a public, searchable repository—accessible via their website and an API—for “highly disseminated” content (content viewed by at least 10,000 users) and content from “major public accounts” (those reaching 25,000+ users monthly). This repository must show not just the content, but also whether the platform’s algorithms recommended, amplified, or restricted it (Sec. 9). This is huge. If you’ve ever wondered why a certain post went viral, this data should offer some answers.

2. Ad Transparency: Platforms must also maintain a public ad repository showing the content of the ad, who paid for it, and whether it targeted specific groups. It also requires disclosure of whether the ad was amplified or restricted by platform algorithms (Sec. 9).

3. Algorithm and Moderation Reports: At least twice a year, platforms must publicly report on their recommender and ranking algorithms. This includes summarizing the inputs (like user data) that feed the algorithms, their optimization goals, and how they assess performance (Sec. 9). The FTC will need to walk a fine line here, ensuring platforms provide useful information without revealing actual trade secrets.

What This Means for You

This bill doesn't directly change how you use your favorite app, but it fundamentally changes the amount of information available about that app. If you’re a consumer, the goal is to create a more informed public discourse by allowing independent experts to study how these powerful systems affect society. If you run a small business that relies on platform advertising, the new ad transparency requirements could provide valuable insight into how targeting actually works.

However, it also means a lot of new regulatory overhead for the platforms, which typically gets passed along in some form. And while the bill is all about transparency, it's worth noting that researchers affiliated with law enforcement or intelligence agencies are explicitly excluded from the qualified research program (Sec. 2). This means that while academics can study these platforms, federal agencies investigating national security threats or criminal activity won't have this formal, protected data access channel.