This Act establishes the Life Sciences Research Security Board to review and approve federal funding for high-risk life sciences research, including Dual Use Research of Concern and Gain of Function research, to protect public health and national security.
H. Griffith
Representative
VA-9
The Risky Research Review Act establishes the Life Sciences Research Security Board to oversee and approve federal funding for "high-risk life sciences research," which includes Dual Use Research of Concern (DURC) involving dangerous pathogens. This new independent agency will review proposals to determine whether the potential risks to public health or national security outweigh the anticipated benefits before any funding is awarded. The Act mandates strict attestation requirements for applicants and outlines penalties for non-compliance or for misrepresentation of high-risk research activities.
The Risky Research Review Act creates a powerful new federal agency, the Life Sciences Research Security Board (LSRSB), with the authority to issue binding, final determinations on whether federal funding can be awarded for certain biological studies. Essentially, if your research involves certain pathogens or techniques, this new nine-member Board gets the final say on your grant money, overriding existing federal agencies such as NIH or the Department of Energy.
This law is focused squarely on what it calls "High-Risk Life Sciences Research." That’s defined as either Gain of Function research (experiments that boost how easily a dangerous pathogen spreads or how sick it makes people) or research involving "Dual Use Research of Concern" (DURC) and a "High-Consequence Pathogen." The list of these pathogens is long, including things like Ebola, Marburg, and Sarbecoviruses (like SARS-CoV-2), and the Board can add more later. The goal is clear: prevent federally funded research from inadvertently creating a public health or national security disaster.
The Board is set up as an independent agency, staffed by nine experts, including five scientists, two national security specialists, and a biosafety expert, plus an Executive Director, all appointed by the President. These members are subject to strict conflict of interest rules, and they can't have worked for certain key federal agencies (like the Department of Defense or NIAID) in the three years prior to their appointment. This structure aims to keep the Board impartial and focused solely on risk assessment, rather than agency agendas.
Here’s where it gets real for researchers: If you’re applying for federal money, you now have to sign an affidavit, under penalty of perjury, stating whether your proposed work qualifies as "high-risk." If you say yes, the agency must send your full proposal to the Board. If you say no, the agency still has to review your claim internally. Agencies are now banned from awarding funding for any research the Board is reviewing until the Board gives its explicit approval. They must notify the Board at least 30 days before they plan to award any money.
For researchers, this creates a new, mandatory hurdle that could significantly delay or kill projects. The Board gets up to 120 days to review a proposal. While the Board is tasked with weighing the potential benefits against the risks, its final decision is binding. Imagine a research team at a university that has spent a year developing a proposal to study a new vaccine mechanism against a dangerous flu strain. Even if NIH approves it, the LSRSB can now swoop in and veto the funding if they decide the risk of the research—say, working with a modified virus—outweighs the benefit. This power is centralized and final, effectively creating a powerful bottleneck for federally funded biological science.
What happens if a researcher knowingly lies on that mandatory attestation? The agency must refer them for suspension or debarment from receiving any federal funding, which is career-ending stuff. Even agency employees who fail to follow the new certification rules face discipline, including the permanent revocation of their security clearances. This shows the law is serious about compliance, but it also raises the stakes for researchers who may be unsure if their work technically falls under the bill’s broad definitions of “high-risk” or “DURC.”
There is an expedited review process for "emergency research" addressing immediate public health or national security threats, where the Board gets only 15 days to decide. If it doesn't make a determination in that short window, the funding agency can temporarily award the money. This is a necessary safety valve, but it highlights the potential for the new process to slow down critical responses.
Overall, the Risky Research Review Act is a massive regulatory shift. It attempts to solve a critical national security problem—the accidental or deliberate misuse of dangerous biological research—by imposing a powerful, centralized oversight body with the authority to veto scientific funding. While the intent to secure dangerous science is positive, the broad definitions and the binding nature of the Board’s power could introduce significant delays and administrative burdens, potentially chilling important scientific inquiry that ultimately benefits public health.