The SCREEN Act requires online platforms that host or provide pornographic content to implement technology to verify users' ages and prevent minors from accessing harmful material.
Mary Miller
Representative
IL-15
The SCREEN Act requires online platforms that host or provide pornographic content to implement age verification measures to prevent minors from accessing harmful material. It mandates that these platforms use reliable age verification methods, undergo regular audits by the Federal Trade Commission (FTC) to ensure compliance, and protect user data collected during the verification process. The FTC is responsible for enforcing the Act, and the Government Accountability Office (GAO) must report to Congress on the effectiveness and impact of these measures. The Act aims to protect children from the harmful effects of online pornography.
The Shielding Children's Retinas from Egregious Exposure on the Net Act, or "SCREEN Act," aims to block minors from accessing online pornography. The core of the bill is a requirement for websites that host or distribute material deemed "harmful to minors" to implement age verification technology. This means sites dealing in adult content will need to confirm users are over 18, not just through a simple click-through, but using actual tech-based checks. (SEC. 4)
The bill mandates that within one year of enactment, "covered platforms" (basically any online service that regularly creates, hosts, or provides content considered harmful to minors for profit) must use technology to verify users' ages. (SEC. 3 & 4). This isn't just about checking a box; it requires a system that actively determines whether someone is likely a minor and blocks them if so. The law specifies that even users connecting through virtual private networks (VPNs) will be subject to verification unless they're determined to be outside the U.S. Think of it like a digital bouncer checking IDs at the door, but for websites.
Imagine trying to access a site with adult content, and instead of just clicking "I'm over 18," you have to provide some form of digital proof. This could mean scanning a driver's license, using a facial recognition system, or some other yet-to-be-developed method. While the bill doesn't specify which technology must be used, it does require that the method is effective and the process is publicly available. (SEC. 4). This could impact anyone from a college student researching human sexuality to an adult enjoying legal content in their free time. It also directly affects online platforms, which will need to implement and pay for these new systems, potentially changing how these sites operate and how much they cost to use.
The bill also mandates that platforms maintain data security to protect the information collected during age verification, retaining it only as long as necessary. (SEC. 4). The Federal Trade Commission (FTC) will be in charge of enforcement, conducting audits and issuing guidance, with penalties for non-compliance. (SEC. 6 & 7). A report by the Government Accountability Office (GAO) is also required within two years of implementation to analyze the effectiveness of these measures and their impact. (SEC. 8)
While the goal is to protect kids, the bill raises some real questions. The definition of "harmful to minors" is pretty broad, including depictions of nudity or sexual acts that lack "serious literary, artistic, political, or scientific value for minors." (SEC. 3). This could potentially lead to restrictions on content that some might consider educational or artistic. There's also the practical challenge of how effective these measures will actually be. The bill acknowledges that past attempts using filters have failed, and kids are often tech-savvier than adults. (SEC. 2). And of course, there's the big question of privacy: how will all this data be collected, stored, and protected from misuse? These are all points to keep a close eye on as this bill moves forward.