Smart Water Cooling for Servers and Data Centers
The trend toward digitalization and the adoption of new technologies utilizing artificial intelligence has led to a significant increase in global energy demand for data processing. A substantial portion of this consumption is attributed to servers and server cooling.
This article is published by EE Power as part of an exclusive digital content partnership with Bodo’s Power Systems.
RK Kutting and IQ Evolution have been collaborating for some time on an efficient cooling solution that reduces the energy consumed by data processing while increasing the performance of the systems used. Their joint project, 3D Smart Fluid Cooling, demonstrates a promising approach to significantly reducing this energy demand: it enables the safe operation of a high-performance CPU in a 1 U server, doubling power density while reducing cooling requirements.
Figure 1. 3D smart fluid cooler. Image used courtesy of Bodo’s Power Systems [PDF]
Server Cooling in Data Centers
Server cooling in data centers is gradually transitioning from air cooling to more efficient fluid cooling solutions. The reasons behind this shift are well known and logical:

- Rising energy costs and the need to reduce the carbon footprint
- Increasing power density and higher cooling requirements
- Targeted cooling options and significantly better heat transfer (water has roughly 4,000 times the volumetric heat capacity of air), which greatly enhances energy efficiency
- Improved possibilities for utilizing waste heat, which significantly reduces a data center's power usage effectiveness (PUE)

Moreover, more efficient cooling allows for more powerful server components, including higher processor clock speeds, substantially increasing power density within servers and entire data centers. Data centers account for approximately 1–1.5% of global energy consumption and CO2 emissions [1], so more efficient cooling methods are imperative.
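The heat-capacity comparison can be sanity-checked with standard textbook property values (the figures below are common reference values at room conditions, not taken from the article). Per unit volume, water stores on the order of 3,500 times as much heat per kelvin as air, consistent with the roughly 4,000× figure cited above:

```python
# Order-of-magnitude check of the water-vs-air heat capacity ratio.
# Property values are standard textbook figures at ~25 °C, 1 atm.

cp_water = 4186.0    # J/(kg·K), specific heat of liquid water
rho_water = 997.0    # kg/m^3, density of water

cp_air = 1005.0      # J/(kg·K), specific heat of air
rho_air = 1.184      # kg/m^3, density of air

# Heat stored per cubic metre per kelvin of temperature change:
c_vol_water = cp_water * rho_water   # ≈ 4.2e6 J/(m^3·K)
c_vol_air = cp_air * rho_air         # ≈ 1.2e3 J/(m^3·K)

ratio = c_vol_water / c_vol_air
print(f"water/air volumetric heat capacity ratio ≈ {ratio:.0f}")  # ≈ 3,500
```

The exact ratio depends on the temperatures and pressures assumed, which is why published figures range from about 3,300 to 4,200.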
Figure 2. Thermal image of a server room. Image used courtesy of Bodo’s Power Systems [PDF]
3D Smart Fluid Cooling
The 3D Smart Fluid Cooling technology combines IQ Evolution's metal 3D printing capabilities with the proven and further developed product portfolio of RK Kutting. 3D printing produces very thin heat sinks (total thickness of 0.8 mm) and allows for highly customized designs. The adapted connection technology and flexible hoses enable application-specific solutions tailored to the available space. The overall concept scales from small production runs to large-scale manufacturing. An example can be seen in Figure 1.
Figure 3. CPU on top of the 3D smart fluid cooler. Image used courtesy of Bodo’s Power Systems [PDF]
Secure Media Transport
Many exciting solutions are already available to cool individual server racks using fluids. Frequently discussed options include cooling with F-gases, pure water-cooling systems (e.g., echiller), direct-to-chip cooling, hot fluid computing using water temperatures around 60 °C, heat pipe solutions, and immersion cooling systems. RK Kutting has long been involved in fluid cooling of safety-critical electronic components, especially in high-voltage applications, medical technology, and microchip manufacturing equipment, and specializes in individually tailored solutions. In the current project with IQ Evolution, various cooling technologies are being implemented, and possibilities for secure, loss-free media transport with easy maintenance are being developed.
The entire media routing must meet varied requirements, covering water, coolant mixtures, and a range of operating temperatures. Due to spatial constraints, standard distribution systems are not feasible; customized solutions with server-cabinet-specific geometries are needed. The connection technology should be completely leak-tight and maintenance-free, and integrated monitoring systems provide continuous flow control and management.
The independent institute ZFW-Zentrum für Wärmemanagement in Stuttgart, under the leadership of Professor Griesinger, was commissioned to conduct experimental investigations of the cooling performance of this technology. Two servers, one 2 U and one 1 U, were equipped with high-performance CPUs (AMD Ryzen Threadripper PRO) and converted to 3D Smart Fluid Chip Cooling. These modified HP high-performance servers were installed in a server rack specially adapted for this cooling technology and tested under high-performance computing (HPC) load.
Figure 4. Server rack with distribution interface. Image used courtesy of Bodo’s Power Systems [PDF]
The evaluation aims to demonstrate the potential of the targeted application of innovative fluid cooling technologies to significantly enhance server performance and computational power, ensure high operational safety and maintenance flexibility, reduce energy costs and water consumption, and drastically reduce the CO2 footprint.
Figure 3 depicts the CPU already installed with the 3D smart fluid cooler in the rack, while Figure 4 shows the distribution interfaces for cooling water on the server rack.
The results are impressive. The HPC chip can be kept at a constant temperature across different supply temperatures. Because air cooling requires significant energy and takes up considerable space, HPC operation with air cooling is only possible in 2 U racks. With liquid cooling, the HPC chip can now be operated in a 1 U rack for the first time, a 50% reduction in footprint.
This success is primarily due to IQ Evolution's patented, extremely thin 3D heat sink, which is only a few millimeters high. This breakthrough allowed the hot-water cooling technology to be implemented in an HP 1 U server for the first time. Through selective laser sintering, individual 3D heat sinks with impressively low height and flexibly positionable connectors can be manufactured, tailored to the available space. Preformed hoses adapted to the rack's geometry enable secure media routing even in tight spaces.
Figure 5. Test equipment for server fluid cooling (evaluated at ZFW, Stuttgart, Germany). Image used courtesy of Bodo’s Power Systems [PDF]
This opens up promising new perspectives for data centers, which face greater pressure than ever from energy costs, power density requirements, and environmental concerns.
Results and Conclusion
With the setup described above, a high-performance HP 1 U server equipped with a high-performance CPU could be operated stably and safely with fluid cooling under continuous full load (280 W CPU power dissipation), even at coolant temperatures up to 60 °C.
Table 1. Initial results. Air cooling includes heat transport from the CPU via a heat pipe and subsequent cooling with fans.
| Cooling | Cooling Medium Temperature | Maximum CPU Temperature | Maximum Electrical Cooling Power |
|---|---|---|---|
| Air cooling | 24 °C | 89 °C | 24.4 W |
| Water cooling | 25 °C | 69 °C | 1.4 W |
| Water cooling | 60 °C | 90 °C | 1.7 W |
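As a rough plausibility check (not a calculation from the article), the water flow needed to carry the 280 W CPU load can be estimated from the steady-state heat balance P = ṁ·cp·ΔT. The 5 K coolant temperature rise below is an assumed value chosen for illustration:

```python
# Back-of-the-envelope coolant flow for the 280 W full-load case.
# The 5 K temperature rise across the cooler is an assumption.

P = 280.0        # W, CPU power dissipation under continuous full load
cp = 4186.0      # J/(kg·K), specific heat of water
delta_T = 5.0    # K, assumed coolant temperature rise

m_dot = P / (cp * delta_T)            # kg/s, required mass flow
flow_lpm = m_dot * 60.0 / 0.997       # L/min (water ≈ 0.997 kg/L)
print(f"required flow ≈ {flow_lpm:.2f} L/min")  # ≈ 0.8 L/min
```

A flow below one liter per minute per CPU is well within what a small, low-power circulation pump can deliver, which is consistent with the very low pump powers reported in Table 1.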
This achievement offers several advantages:
- Doubled power density by reducing the footprint by 50%: The HP 1 U server replaces the HP 2 U server.
- Increased power density by over 100% through clock rate increases: Achievable without increased risk of failures thanks to a 20 K temperature reserve.
- Substantially reduced power consumption: In this case, the fluid supply pump consumes a factor of 10 less power than the fans.
- Increased design freedom: Optimized board, server, and rack designs can achieve even lower heights, as well as application-specific cooler designs (focusing on CPU hotspots).
- Zero cooling water consumption: Smart fluid water cooling operates in a closed loop and does not consume fresh water.
- Usable for district heating: Cooling with hot fluid/warm water above 60°C is possible while maintaining current maximum chip temperatures.
- Significant reduction in CO2 footprint: Direct power savings compared to fans. Secondary effects, such as significantly reduced space requirements for the same server performance, are not considered.
[1] https://www.iea.org/reports/data-centres-and-data-transmission-networks, accessed 23.02.2023
This article originally appeared in Bodo’s Power Systems [PDF] magazine.