PolicyBrief
S. 2997
119th Congress
Oct 9, 2025
Right to Override Act
IN COMMITTEE

This bill mandates policies ensuring healthcare providers retain final judgment over AI clinical support systems while establishing strong federal and state enforcement mechanisms and worker protections against retaliation for overriding AI recommendations.

Edward "Ed" Markey
D

Edward "Ed" Markey

Senator

MA


New 'Right to Override Act' Protects Healthcare Workers Who Reject AI Recommendations, Allows Damages of Up to $100k per Violation for Retaliation

When the doctor tells you to take a pill, you trust their judgment, not the algorithm's. But as hospitals and clinics increasingly rely on Artificial Intelligence Clinical Decision Support Systems (AICDSS) to guide patient care, what happens when the human and the machine disagree? The Right to Override Act is the policy equivalent of putting the doctor, not the computer, back in charge.

This bill sets up a clear system: if a healthcare professional (anyone from a doctor to a nurse to a home health aide) thinks the AI is wrong, biased, or just not right for the patient, they have the protected right to override its suggestion. Crucially, the law defines "adverse employment action" broadly: it’s not just getting fired; it also covers retaliatory investigations, unfavorable scheduling, and even losing the ability to work from home. If you override the AI and follow the rules laid out in this bill, your job is protected from retaliation.

The Human Element: Why the Override Matters

For healthcare professionals, this bill is a game-changer for job security and clinical autonomy. Hospitals and clinics (called "Covered Entities") that use AICDSS must now implement policies that explicitly state the AI’s suggestions cannot replace the independent judgment of the professional (Sec. 101). Think of a nurse in a busy ER: if the AI recommends a treatment plan that ignores a patient’s rare allergy or a complex social factor the computer missed, the nurse can confidently overrule the system without worrying about getting written up. They just have to do it by the book, documenting why they did it.

The law also mandates the creation of an AICDSS committee within 120 days of the bill passing. This committee must include at least as many non-managers as managers, plus union reps, to review how the AI is performing. This means the people actually using the tool get a seat at the table to flag systemic issues, like an AI that consistently generates biased advice that requires frequent overrides (Sec. 101).

Data Privacy and the Digital Paper Trail

One major concern with AI is that every action is tracked. This bill addresses that by restricting how "override data"—information showing a professional ignored the AI—can be shared. Organizations cannot use this data to identify a specific professional or a small group of professionals for disciplinary action. The goal is to encourage clinical honesty without creating a digital hit list of people who dare to disagree with the algorithm. The only exceptions are when sharing information with the patient or during a formal legal proceeding, like a malpractice suit (Sec. 101).

Speaking of malpractice, the bill has a very important caveat: if a doctor overrides the AI and something goes wrong, this law does not protect them from medical malpractice or negligence lawsuits (Sec. 303). The protection applies only to employer retaliation; liability for patient outcomes remains squarely on the human professional.

Big Fines and the Right to Sue

This bill has teeth, thanks to the Department of Labor (DOL) and the Department of Health and Human Services (HHS). Enforcement is split: HHS handles the policy side (Title I), while the DOL handles the worker protection and anti-retaliation side (Title II).

If a covered entity retaliates against a worker for overriding the AI (Sec. 201) or for whistleblowing (Sec. 202), the penalties are steep. The DOL can impose civil fines of up to $769,870 for repeat offenses. But the most direct power lies with the individual: if you are harmed by a violation, you can sue the covered entity directly in federal court. Winning that lawsuit can grant you up to triple your lost wages, plus attorney fees and significant statutory damages. For a violation of the whistleblower protections, the statutory damages range from $10,000 to $100,000 per violation (Sec. 203). This high floor is meant to deter violations before they happen and to give workers a powerful tool if they are wronged.