This bill mandates a Federal Trade Commission report detailing how minors access fentanyl through social media and recommending actions to prevent it.
Gabe Evans
Representative
CO-8
The "No Fentanyl on Social Media Act" requires the Federal Trade Commission to investigate and report on how minors access fentanyl through social media platforms. This report, developed with key agencies, will detail the prevalence, risks, and methods used by sellers to distribute fentanyl to minors online. Ultimately, the legislation seeks recommendations for Congress on effective measures to curb this dangerous access.
The “No Fentanyl on Social Media Act” is pretty straightforward: it mandates a deep-dive investigation into how minors—anyone under 18—are getting access to fentanyl and related substances through social media platforms. Specifically, it requires the Federal Trade Commission (FTC) to team up with the Department of Health and Human Services (HHS) and the Drug Enforcement Administration (DEA) to produce a comprehensive report within one year. This isn’t just a simple survey; it’s a detailed look at a serious public health crisis, focusing on the mechanics of how these dangerous drugs move from online sellers to kids.
This report aims to pull back the curtain on the digital drug trade. The FTC, working with its partners, must analyze several key areas. First, they need to figure out how common it is for minors to access fentanyl online and how easy it is to do so. Think of it as mapping the user journey for a kid trying to buy drugs. Second, the report must detail the specific ways drug sellers use platform design features—like direct messaging, ephemeral content, or algorithmic recommendations—to market, sell, deliver, and distribute fentanyl. This is critical because it forces a look at how the platforms themselves might be unintentionally facilitating illegal activity.
For social media companies, this bill means serious scrutiny. The report is required to evaluate the current practices and policies platforms have in place to stop drug sales and, crucially, how effective those measures actually are. If you’re a parent, this is the section that matters most: are the companies doing enough, or are their current safety measures just digital window dressing? The FTC must consult with parents, law enforcement, and the platforms themselves to get a full picture. The ultimate goal is to provide Congress with concrete recommendations on how to stop this access, which could lead to future regulation for social media companies.
While the report is intended for public consumption, there's a necessary caveat: the FTC, after consulting with the Attorney General, can redact (or black out) information related to law enforcement tactics or techniques. This is meant to protect ongoing investigations and operational security, but it's worth noting that it could also obscure certain details about agency coordination, or the lack thereof. The bill clearly defines a "social media platform" as a public-facing website or app that primarily provides a forum for user-generated content, specifically excluding services like email or basic internet service providers. This definition keeps the spotlight squarely on sites where user interactions drive the content, ensuring the investigation targets the actual marketplaces where these transactions happen. For everyone concerned about the safety of kids online, this report is the first step toward getting real data to drive real solutions. It's about replacing guesswork with facts.