PolicyBrief
S. 3269
119th Congress · November 20, 2025
Liquid Cooling for AI Act of 2025
IN COMMITTEE

This bill establishes a process for studying and assessing liquid cooling technologies for data centers and AI systems, requiring a comprehensive review by the GAO and subsequent assessment by the Department of Energy.

Sponsor: Sen. Dave McCormick (R-PA)


New Act Mandates 90-Day Federal Review of Liquid Cooling Tech to Keep AI Energy Use from Crashing the Grid

The Liquid Cooling for AI Act of 2025 is the government’s acknowledgment that the AI boom is creating a massive power drain. Simply put, this bill is a federal mandate to figure out how to cool the massive data centers running AI—before their electricity demand overwhelms the grid.

Congress found that data centers, the engine rooms of AI and cloud computing, are consuming a rapidly growing share of U.S. electricity—jumping from 1.9% in 2018 to 4.4% in 2023, with projections showing it could hit nearly 13% by 2028. Why? AI chips run hot, and old-school air conditioning isn't cutting it. This bill initiates a high-level, fast-tracked study of liquid cooling—technology that cools computers with fluids instead of air—to see whether it can save power and keep the AI industry from overheating.

The Energy Bill for AI

This isn't a bill about building new infrastructure; it's about studying how to make the existing tech more efficient. The key action here is the order given to the Government Accountability Office (GAO). Within just 90 days of the bill becoming law, the GAO must deliver a comprehensive report to Congress and the Department of Energy (DOE).

This report needs to cover a lot of ground: the costs and benefits of liquid cooling, how much energy it would save (including deferred upgrades to the electric grid), and how it stacks up against current air cooling. Crucially, the GAO must compare different liquid cooling methods—like direct-to-chip cooling (where coolant touches the processor) versus immersion cooling (where the whole server is dunked in fluid). For the average person, this study matters because if data centers can't manage their power consumption, that cost gets passed down to consumers through higher utility bills, and the strain on the grid could lead to reliability issues.

The Quest for Waste Heat

One of the most interesting parts of this review focuses on heat-reuse. Liquid cooling systems generate high-quality waste heat, and the bill requires the GAO to survey existing opportunities to capture this heat and use it for something else—like heating buildings or industrial processes. This is where the policy meets the pocketbook. If a data center can sell its waste heat, that's a new revenue stream that could offset operating costs, potentially slowing the rise of cloud service prices for small businesses and tech startups.

To ensure the report is grounded in reality, the GAO must consult with an advisory committee composed of industry experts, including hardware manufacturers, data center operators, and fluid producers. This is a smart move to get real-world input, but it’s also an area to watch: the composition of this committee will determine whose interests—energy savings or vendor profits—get the loudest voice in the recommendations.

Federal Action and Tight Deadlines

After the GAO delivers its report, the Department of Energy has 180 days to issue its own assessment. The DOE's job is to use the GAO findings to develop recommendations for R&D and advise Congress on how liquid cooling affects the United States' ability to maintain global leadership in AI technology. The tight deadlines—90 days for the GAO, 180 days for the DOE—show the urgency the government places on this problem. However, conducting a deep, comprehensive analysis of a rapidly evolving technology and its complex market in just three months is a tall order. The speed could mean some nuances get overlooked.

Ultimately, the Liquid Cooling for AI Act doesn't change anything immediately for consumers or businesses, but it sets the stage for future regulation and federal investment. It's the policy equivalent of hitting the brakes and checking the engine temperature before a long, fast road trip. The goal is clear: find a way to power the future of AI without overloading the electric grid we all rely on.