Liquid Cooling

Development of Liquid Cooled Standards

Liquid cooling is valuable in reducing energy consumption because the volumetric heat capacity of liquids is orders of magnitude larger than that of air. Once heat has been transferred to a liquid, it can be removed from the data center efficiently.
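The "orders of magnitude" claim can be sketched with textbook property values (the density and specific-heat figures below are standard approximations at room temperature, not numbers from this article):

```python
# Rough comparison of volumetric heat capacity: how much heat a unit
# volume of each fluid absorbs per degree of temperature rise.
# Property values are textbook approximations at ~20 °C.
air = {"density_kg_m3": 1.2, "cp_J_kgK": 1005}
water = {"density_kg_m3": 998, "cp_J_kgK": 4186}

def volumetric_heat_capacity(fluid):
    """Volumetric heat capacity in J/(m^3·K) = density * specific heat."""
    return fluid["density_kg_m3"] * fluid["cp_J_kgK"]

ratio = volumetric_heat_capacity(water) / volumetric_heat_capacity(air)
print(f"Water carries roughly {ratio:,.0f}x more heat per unit volume than air")
```

A cubic meter of water absorbs several thousand times more heat per degree than a cubic meter of air, which is why a modest water flow can replace a very large airflow.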

LBNL is one of several industry experts currently participating in an initiative to develop a liquid-cooled rack specification. The goal of the project is to develop a specification that can accommodate multiple vendors and provide a reusable infrastructure spanning multiple refresh cycles with a variety of liquid-cooled servers and suppliers. The specification could cover, but is not limited to: fluid selection and quality; supply pressure, temperature, and flow; pressure and temperature differentials; header size and material; and connection spacing, size, and details. Learn more about the project.

Liquid Cooled Technologies

Liquid cooling in data centers is implemented with a broad range of technologies, from transferring heat to a liquid far from the source (e.g., computer room air handlers (CRAHs)) to immersion cooling, where the heat transfer takes place on the surface of the electronic component. In general, when heat is transferred close to the source, the cooling liquid supply can be warmer while still providing the needed cooling and rejecting the heat to the outside environment. Liquid cooling solutions that transfer heat near the source generally incur additional cost compared to air-cooled IT equipment in a standard rack; however, these additional costs may be substantially offset by the improved efficiency and potential capital savings of a final solution that includes liquid cooling near the heat source.

The adjacent figure lists a number of liquid cooling technology types or categories and a sampling of companies that have provided or are providing solutions. LBNL researchers have reported energy savings estimates for a number of the technologies listed. A description, as well as related LBNL research and resources, is provided for each of the technologies.

CRAH: 

Demonstration of Intelligent Control and Fan Improvements in Computer Room Air Handlers: Computer room air handler (CRAH) controls and improved CRAH fans were retrofitted into a colocation data center owned and operated by Digital Realty Trust. The overall yearly data center energy savings were estimated at a significant 8% compared to pre-retrofit operation.

OVERHEAD, INROW, ENCLOSED CABINET, REAR DOOR HEAT EXCHANGER, CONDUCTION: Liquid cooling in data centers can be configured in a variety of ways. Enclosed systems can cool chips and components directly by immersing servers in cooling fluids, or the cooling can be integrated into the back of a rack or cabinet. The rear door heat exchanger (RDHx), which resembles an automobile radiator, is placed in the airflow outlet of a server rack. During operation, hot server-rack airflow is forced through the RDHx by the server fans, and heat is exchanged from the hot air to circulating water from a chiller or cooling tower. Thus, server-rack outlet air temperature is reduced before it is discharged into the data center.
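The heat balance behind a device like an RDHx can be sketched with the sensible-heat equation Q = ṁ·cp·ΔT: the heat picked up from the rack exhaust air must equal the heat carried away by the water loop. The rack power and water temperature rise below are hypothetical example values, not figures from any of the demonstrations described here:

```python
# Sketch of the water-side heat balance for a rear door heat exchanger.
# Q = m_dot * cp * dT, solved for the water mass flow. Example inputs
# (20 kW rack, 7 K water temperature rise) are hypothetical.
def water_flow_for_rack(rack_kw, cp_water=4186.0, rho_water=998.0, dT_water=7.0):
    """Return the water flow (L/s) needed to absorb rack_kw of heat
    at the given water temperature rise."""
    q_watts = rack_kw * 1000.0
    m_dot = q_watts / (cp_water * dT_water)  # mass flow in kg/s
    return m_dot / rho_water * 1000.0        # convert kg/s to L/s

print(f"{water_flow_for_rack(20.0):.2f} L/s for a 20 kW rack at a 7 K rise")
```

Under these assumptions a 20 kW rack needs well under a liter per second of water, illustrating why rack-level water loops can use modest pipe sizes.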

Demonstration of Alternative Cooling for Rack-Mounted Computer Equipment: In 2009, eleven different models of rack-level liquid cooling devices for IT equipment were compared in a demonstration named “Chill-Off 2”. Partial and overall data center efficiencies were evaluated using two different chilled-water plant models and a number of different environmental conditions. The results show significant differences in energy efficiency, with the conduction and rear door heat exchanger technologies performing best.

Demonstrating a Dual Heat Exchanger Rack Cooler Using “Tower” Water for IT Cooling: A prototype InRow cooling device with two heat exchangers from Schneider Electric (APC) was demonstrated to investigate potential energy efficiency advantages compared to a single heat exchanger design. The results show a significant energy efficiency improvement when both cooling tower and chilled water supplies are available near the rack. The dual heat exchanger approach acts as a localized water-side economizer that matches local conditions inside the data center.

Data Center Rack Cooling with Rear-door Heat Exchanger: This demonstration project at LBNL provides lessons learned that may be relevant to other RDHx projects.

CPU COLDPLATE: The cold plate is the component of the liquid cooling system that interfaces with the heat source. Cold plates vary widely in complexity and construction depending on the application needs.

Direct Liquid Cooling for Electronic Equipment: Cisco servers were modified with Asetek cold plate technology, and their energy efficiency was compared to that of unmodified servers. The percentage of IT power captured by the liquid was determined for a variety of IT loads and environmental conditions; the fraction of heat captured varied considerably depending on those conditions.

IMMERSION: Open-bath immersion cooling can efficiently cool high-density electronics in data centers without the need for compressor-based cooling. Because this approach operates well with high-temperature coolant, dry coolers can be used for heat rejection to the atmosphere, eliminating evaporative water use almost anywhere in the world. Liquid immersion cooling, especially with phase change, is a paradigm shift in the way electronics are cooled.

Immersion Cooling of Electronics in DoD Installations: Immersion cooling using 3M Novec 649 Engineered Fluid was demonstrated at the Naval Research Laboratory in Washington, D.C. The technology showed excellent energy efficiency performance but also a couple of significant drawbacks.

NREL's Thermosyphon Cooler Hybrid System Project

In August 2016, the National Renewable Energy Laboratory (NREL) installed a thermosyphon hybrid cooling system to reduce water usage in its already extremely energy-efficient High-Performance Computing (HPC) Data Center. In its first year of use, the system saved 4,400 m³ (1.16 million gal) of water, and 7,950 m³ (2.10 million gal) over a two-year period, cutting the data center's water use by about one-half.
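As a quick consistency check on the quoted figures, the cubic-meter and gallon values can be reconciled with the standard conversion factor (1 m³ ≈ 264.172 US gallons); the savings totals below are taken directly from the paragraph above:

```python
# Cross-check the reported water savings against the m^3 -> US gallon conversion.
GAL_PER_M3 = 264.172  # US gallons per cubic meter

first_year_m3 = 4400   # reported first-year savings
two_year_m3 = 7950     # reported two-year savings

print(f"First year: {first_year_m3 * GAL_PER_M3 / 1e6:.2f} million gal")
print(f"Two years:  {two_year_m3 * GAL_PER_M3 / 1e6:.2f} million gal")
```

Both results agree with the 1.16 and 2.10 million gallon figures reported for the project.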

Click here to review fact sheets, presentations, and slide decks covering the project's installation, outcomes, and lessons learned.