Policy Brief
H.R. 4695
119th Congress | July 23, 2025
Facial Recognition Act of 2025
IN COMMITTEE

The Facial Recognition Act of 2025 establishes federal regulations requiring court orders for most law enforcement use of facial recognition, mandates accuracy and bias testing, and requires public reporting to protect civil liberties.

Sponsor: Rep. Ted Lieu (D, CA-36)


Facial Recognition Bill Sets Strict Court Order Rule for Police Searches, Mandates $50K Payout for Privacy Violations

The newly introduced Facial Recognition Act of 2025 is a major federal attempt to put the brakes on how law enforcement uses face-scanning technology. Essentially, this bill says that police can’t just run your driver’s license photo through a facial recognition system whenever they feel like it; they generally need a judge’s sign-off first. It’s a huge shift toward treating facial recognition searches more like physical searches, requiring probable cause and a court order (SEC. 101).

The Warrant Requirement: Your Face is Now Protected

For most searches against a reference photo database (like the one holding your DMV picture), a prosecutor must first get approval from their agency head, and then convince a judge that there’s probable cause you committed a serious crime (SEC. 101(c)). This is the bill’s core protection. If you’re a regular person going about your day, it means law enforcement can’t use powerful databases to track your face unless they first show a court there’s probable cause. The bill also requires DMVs to post clear notices explaining how police access their photo data, so at least you’ll know the rules of the road (SEC. 101(d)).

There are a few emergency exceptions—like identifying a deceased or incapacitated victim, or responding to an AMBER Alert—but even then, the prosecutor must seek judicial approval within 12 hours. If they don’t, or if the judge denies the request, any evidence collected must be destroyed (SEC. 101(e)). If you’re arrested, however, police can run a facial scan during booking without a warrant.

Cleaning Up the Mugshot Pile and Banning Surveillance

One provision directly addresses the fairness issue in existing databases. Within 180 days, and every six months thereafter, agencies must purge photos from their arrest databases if the person was a minor and released without charges, or if the charges were dropped, dismissed, or resulted in an acquittal (SEC. 101(b)). This is a big deal because it stops innocent people, or those whose cases were thrown out, from having their old mugshots used in new investigations. However, this only stops the photo from being used for facial recognition searches; the photo itself might still sit in other police databases.

The bill also sets hard boundaries on surveillance. Police cannot use facial recognition to track people exercising their First Amendment rights, like participating in a protest or a political rally (SEC. 102(a)). Furthermore, they are explicitly banned from running facial recognition on footage captured by body cameras, dash cameras, or drones (SEC. 102(b)). This prevents police from retroactively identifying everyone who appeared on bodycam footage during a routine traffic stop or incident.

The Accuracy Mandate: If It’s Biased, You Can’t Use It

This bill demands that the tech actually works, and works fairly. Every system used by law enforcement must be tested annually by the National Institute of Standards and Technology (NIST) to check not only its overall accuracy but also if it performs differently based on race, ethnicity, gender, or age (SEC. 106). If a system doesn't meet the “sufficiently high level of accuracy” standard set by the Assistant Attorney General, police can’t use it. This is crucial because studies have shown these systems often misidentify women and people of color at higher rates. The bill also mandates independent operational testing to see how the system performs in the real world, not just in a lab.

For law enforcement agencies, this means a massive new administrative load. They must log every search (SEC. 103), report detailed usage statistics—including resulting arrests and convictions broken down by race and gender—to the public every year (SEC. 104), and submit to annual audits (SEC. 105). Failure to comply can result in a state losing 15% of certain federal crime control grants (SEC. 2).

The Bottom Line: Your Right to Sue

Perhaps the biggest change for the average person is the new enforcement mechanism (SEC. 107). If an agency violates this Act—say, it tracks you illegally or uses a system that hasn’t passed the required accuracy tests—you can sue. The minimum statutory damage award is $50,000 per violation, plus attorney’s fees, which creates a significant financial incentive for agencies to follow the rules: an illegal sweep could quickly generate millions in liability. You can also sue if use of the technology results in a disparate impact based on race, ethnicity, gender, or age—meaning agencies can be held accountable even when they didn’t intend to discriminate, if the system’s design caused unfair outcomes.