Policy Brief
H.R. 2385, 119th Congress (introduced March 26, 2025)
CREATE AI Act of 2025
Status: In Committee

The CREATE AI Act of 2025 aims to democratize AI research and development by establishing a National Artificial Intelligence Research Resource (NAIRR) that provides broader access to computational resources, data, and educational tools for researchers and students across the U.S.

Sponsor: Rep. Jay Obernolte (R, CA-23)


CREATE AI Act Proposes National AI Hub: Aims to Give Researchers and Students Access to High-Powered Resources

The CREATE AI Act of 2025 lays the groundwork for a new National Artificial Intelligence Research Resource, or NAIRR. Think of it as a plan to build a publicly accessible powerhouse for AI development. The core idea, outlined in Section 3, is to give researchers, educators, and students across the U.S. better access to the expensive computing power, massive datasets, and specialized tools needed for cutting-edge AI work, resources often locked up within big tech companies. The National Science Foundation (NSF) is tasked with getting the NAIRR operational within a year of the bill's passage.

Building the AI Sandbox

So, how does this actually get built and run? The bill sets up a multi-layered structure. A dedicated Program Management Office within the NSF will handle the day-to-day, staffed by at least three full-timers (Section 3). That office will oversee funding opportunities and select a non-governmental Operating Entity to manage key functions such as a user portal and infrastructure modernization. High-level direction comes from a Steering Subcommittee, chaired by the Director of the White House's Office of Science and Technology Policy, which coordinates strategy and budget (Section 3, amending the NDAA for FY21). Input will also come from Advisory Committees drawing expertise from government, industry, academia, and public interest groups.

Who Gets Access and What's Inside?

Eligibility is focused on U.S.-based folks: researchers, educators, and students connected to universities, non-profits, government agencies, federally funded research centers, and even small businesses that have snagged federal funding (Section 3). There's a notable exclusion for individuals working for or on behalf of specific foreign countries listed in existing law (referencing 10 U.S.C. 4872(d)(2)).

What resources are we talking about? The NAIRR aims to provide a mix, including:

  • Computational Power: Access to different types of computing resources.
  • Data: Curated datasets and standards to make data work together better.
  • Educational Tools: Training materials and outreach programs.
  • AI Testbeds: Secure environments specifically designed for researchers to test, measure, and evaluate AI systems.

Access won't necessarily be entirely free. While a free tier funded by appropriations is planned, the bill allows for a fee schedule for users needing more resources, potentially based on cost. Private donations of cash, services, or property are also welcomed (Section 3).

Rules of the Road: Ethics, Security, and Hurdles

The bill puts guardrails in place. Applications to use NAIRR resources will undergo reviews focusing on privacy, ethics, safety, security, and the trustworthiness of the AI being developed (Section 3). In fact, projects specifically targeting these areas get priority access to computing resources. Users will have to meet minimum security requirements based on NIST standards, and the whole operation needs to align with federal policies on research security (like National Security Presidential Memorandum 33).

While the goal is democratization, practical challenges remain. Standing up a complex resource hub like NAIRR efficiently without getting bogged down in bureaucracy is a tall order. Ensuring fair access across different types of institutions and defining concepts like 'trustworthiness' consistently will be key tests. The effectiveness hinges on smooth coordination between the NSF, the steering committee, the eventual operating entity, and the various agencies contributing resources.