This bill strictly limits the TSA's use, collection, and storage of passenger biometric data via facial recognition technology, mandating opt-in consent and prohibiting surveillance outside of security checkpoints.
Sponsor: Sen. Jeff Merkley (OR)
The Traveler Privacy Protection Act of 2025 strictly limits the Transportation Security Administration's (TSA) use of facial recognition technology during airport security screening. This bill establishes a default ban on collecting or storing passenger biometric data, requiring explicit opt-in consent for general passengers and clear opt-out rights for Trusted Traveler members. Furthermore, it imposes strict data minimization and storage limits, and prohibits the use of this technology for surveillance outside of the screening process.
The Traveler Privacy Protection Act of 2025 is a major policy shift, essentially telling the Transportation Security Administration (TSA) to put the brakes on its facial recognition programs. The bill makes it the default rule that the TSA cannot capture, collect, store, or process any biometric information from passengers using facial recognition technology. If you’re a regular traveler, this means the government can’t build a permanent database of your face just because you went through airport security. The only exceptions are highly limited, covering real-time verification of your ID against your face and specific functions related to Trusted Traveler Programs (TTPs).
This legislation splits travelers into two groups: TTP members (like Global Entry) and everyone else. If you are not a member of a TTP, the TSA must get your affirmative express consent before every single use of facial recognition technology on you. Think of it like actively clicking ‘Yes, I agree’ every time you step up to the scanner. If you opt out, agents must verify your ID the old-fashioned way, and they are explicitly prohibited from penalizing you with extra screening or discrimination for saying no. For TTP members, the TSA can use facial recognition for identity verification, but it must provide clear notice and a guaranteed option to opt out without penalty when you sign up or renew, and also remind you at the checkpoint. This is huge: it puts control back in the hands of the traveler, ensuring you can still fly without your face being scanned and logged.
The bill introduces strict data minimization rules, which is where the real privacy protection kicks in. If the TSA uses facial recognition to check your face against the photo on your ID (a 1:1 match), it can only store your facial image long enough to complete the verification, which amounts to near-instantaneous deletion. If it uses technology to check your face against a database (a 1:N match, permitted only for TTP functions), it can only store that image for a maximum of 24 hours after your scheduled flight departure. This short timeline prevents the TSA from warehousing millions of facial scans for future use. Furthermore, the TSA must destroy any biometric data collected before the law was enacted if that data would violate these new storage rules.
Perhaps the clearest win for civil liberties is the explicit ban on using facial recognition for anything other than immediate identity verification at the screening location. The law states the TSA cannot use the technology to track or identify passengers outside of the screening area, nor can they use it for profiling, targeting, or wide-scale, indiscriminate monitoring. This means the cameras at the gate or baggage claim cannot be hooked up to identify you. It’s a hard line drawn against mission creep, ensuring the technology stays confined to its intended, narrow security purpose.
The legislation also sets up a much-needed accountability loop. It mandates that the Government Accountability Office (GAO) study the TSA’s use of facial recognition and report back to Congress annually. These reports must assess the technology’s effectiveness, its false positive/negative rates, and crucially, include research on bias broken down by age, race, ethnicity, and sex. This provision acknowledges that facial recognition technology often exhibits bias against certain demographics and forces the government to measure and address those issues publicly. For travelers, this means transparency on whether the technology is working fairly and effectively, moving beyond just security claims to look at real-world impact.