The SCREEN Act requires online platforms that host or provide pornographic content to use technology to verify users' ages, preventing minors from accessing harmful material.
Mike Lee
Senator
UT
The SCREEN Act requires online platforms that host or provide pornographic content to use technology to verify users' ages, preventing minors from accessing harmful material. Platforms must implement age verification measures within one year of the Act's enactment, prioritizing data security and user privacy. The Federal Trade Commission (FTC) will oversee compliance, conduct regular audits, and provide guidance to platforms, while the Comptroller General will report to Congress on the Act's effectiveness and impact. This act aims to protect children from the harmful effects of online pornography by ensuring only adults can access such content.
The "Shielding Children's Retinas from Egregious Exposure on the Net Act," or SCREEN Act, aims to block minors from accessing online pornography. The core of the bill? Requiring websites that host or provide pornographic content—termed "covered platforms"—to use technology to verify users' ages. (SEC. 4). This isn't just about checking a box; it means implementing actual tech solutions to confirm you're not underage. (SEC. 3).
The SCREEN Act puts the Federal Trade Commission (FTC) in charge of making sure these platforms follow the rules. (SEC. 7). Think regular audits and potential penalties for those who don't comply. (SEC. 6, SEC. 7). The bill mandates that within one year of enactment, all "covered platforms" must have these age verification systems in place. (SEC. 4). This could mean anything from scanning your driver's license to using facial recognition software; the bill leaves the specifics up to the platforms, as long as the measure is effective. (SEC. 4). The act defines a "technology verification measure" as the use of technology to determine whether a user is likely a minor and to prevent minors from accessing certain content. (SEC. 3). For example, imagine a 16-year-old trying to access a covered site. If the verification measure works, they're blocked. If it fails, they get access to content they shouldn't see.
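To make the mechanism concrete, here is a minimal, hypothetical sketch in Python of what an age-verification gate might look like. The bill does not prescribe any particular technology; the check_id_document and gate_access names, the date-of-birth input, and the 18-and-over threshold are illustrative assumptions standing in for whatever measure a platform actually adopts (ID scan, facial age estimation, or something else).

    from dataclasses import dataclass
    from datetime import date
    from typing import Optional

    # Hypothetical age-verification gate. The SCREEN Act does not prescribe a
    # specific technology; this toy "ID scan" simply derives an age from a
    # claimed date of birth and fails closed for anyone not verified as 18+.

    @dataclass
    class VerificationResult:
        verified_adult: bool   # True only if the check affirmatively confirms 18 or over
        method: str            # e.g. "id_scan" or "facial_estimation"

    def check_id_document(date_of_birth: date, today: Optional[date] = None) -> VerificationResult:
        # Compute age in whole years, accounting for whether the birthday
        # has already passed this year.
        today = today or date.today()
        age = today.year - date_of_birth.year - (
            (today.month, today.day) < (date_of_birth.month, date_of_birth.day)
        )
        return VerificationResult(verified_adult=age >= 18, method="id_scan")

    def gate_access(result: VerificationResult) -> str:
        # Fail closed: block unless the user is affirmatively verified as an adult.
        return "access granted" if result.verified_adult else "access blocked"

    print(gate_access(check_id_document(date(2010, 6, 1))))  # access blocked
    print(gate_access(check_id_document(date(1990, 6, 1))))  # access granted

The point of the fail-closed design is that an unverified user is treated the same as a minor, which is the behavior the bill's "technology verification measure" is meant to guarantee.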
While the goal is to protect kids, the SCREEN Act raises some serious questions about online privacy. The bill requires platforms to collect and handle "technology verification measure data": the information used to verify your age. (SEC. 3). The bill says this data should only be used for verification and kept secure, but the potential for misuse or data breaches is a real concern. (SEC. 4). Think about it: every time you access one of these sites, your data is being checked and potentially stored. The bill also requires the FTC to consult with experts in various fields, including online privacy and data security, suggesting an awareness of these concerns. (SEC. 5).
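As an illustration of what "use the data for verification only" could look like in practice, here is a hypothetical data-minimization sketch in Python. Nothing in the bill mandates this design; the VerificationToken type and mint_token function are assumptions meant to show a platform retaining only a pass/fail flag and a random session token rather than the underlying ID document or date of birth.

    import secrets
    from dataclasses import dataclass

    # Hypothetical data-minimization approach: after the age check completes,
    # the platform keeps only a pass/fail flag and a random session token, not
    # the ID document, photo, or date of birth used to perform the check.

    @dataclass(frozen=True)
    class VerificationToken:
        session_token: str      # random value, not derived from the user's identity
        verified_adult: bool    # the only fact the platform needs to retain

    def mint_token(verified_adult: bool) -> VerificationToken:
        # The inputs to the age check are discarded by the caller rather than
        # stored alongside this token.
        return VerificationToken(secrets.token_urlsafe(16), verified_adult)

    token = mint_token(verified_adult=True)
    print(token.session_token, token.verified_adult)

Whether platforms actually adopt this kind of minimal retention is exactly the sort of question the privacy concerns above turn on.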
The bill specifically points out that blocking and filtering software, the current go-to solution, hasn't been effective. (SEC. 2). It cites studies showing that kids can easily bypass these filters and that many parents don't even use them. (SEC. 2). The bill is intended to be a more robust solution, but it also acknowledges the challenges. A report is due to Congress within two years of implementation, analyzing the effectiveness, compliance, data security, and even the behavioral and economic impacts of these age verification measures. (SEC. 8). This means we'll get a check-up on whether this whole thing is actually working and what unintended consequences it might have.