News

Infineon’s AI Power Module Offers Efficiency for Data Centers

April 01, 2024 by Jake Hertz

Infineon’s dual-phase power modules could enhance power supply efficiency for AI operations in data centers

Artificial intelligence (AI) is reshaping industries and human interactions with technology. However, as AI applications expand across industries, the demand for computational power surges, straining energy resources. AI's exponential growth has fueled concerns regarding its energy consumption, and addressing this challenge requires innovative solutions in energy-efficient hardware design. 

To solve this issue, Infineon has released its TDM2254xD dual-phase power module series, aimed specifically at AI applications. This article examines the escalating power demands of AI data centers and what Infineon’s power module series can offer. 

 


TDM22544D & TDM22545D dual-phase power modules. Image used courtesy of Infineon 

 

AI's Growing Appetite for Power

AI consumes electrical power primarily through the operation of its hardware components, especially CPUs and GPUs. When executing AI algorithms, these processors perform extensive computations, requiring electrical energy to power their circuits and components. Additionally, AI systems often incorporate memory modules and networking components, all consuming power during operation.

However, AI's computational demands, particularly during training, significantly impact power consumption. During training, AI models undergo iterative processes involving complex computations and heavy GPU usage, leading to high energy consumption. In contrast, power consumption is comparatively lower during inference or deployment phases, where AI applies learned knowledge to make predictions.

 


AI training process. Image used courtesy of Food and Drug Administration

 

For example, training a large language model such as GPT-3 consumes roughly 1,300 MWh of electricity, equivalent to the annual power consumption of around 130 U.S. households. Streaming an hour of Netflix, by contrast, uses about 0.8 kWh of electricity, so it would take roughly 1,625,000 hours of streaming to match GPT-3's power demand.
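The comparison above is straightforward unit arithmetic; a quick back-of-the-envelope sketch (using the article's figures, which are rough published estimates, and an assumed ~10,000 kWh/year per U.S. household) shows how the numbers fall out:

```python
# Rough check of the energy figures quoted above.
GPT3_TRAINING_MWH = 1_300        # estimated energy to train GPT-3
NETFLIX_HOUR_KWH = 0.8           # estimated energy per streamed hour

training_kwh = GPT3_TRAINING_MWH * 1_000     # 1 MWh = 1,000 kWh
streaming_hours = training_kwh / NETFLIX_HOUR_KWH
print(f"{streaming_hours:,.0f} streaming hours")   # 1,625,000 streaming hours

# Assumed average U.S. household consumption of ~10,000 kWh/year
households = training_kwh / 10_000
print(f"~{households:.0f} households")             # ~130 households
```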

As a result, the power supply unit (PSU) within an AI server must deliver more power than a standard server's, and the efficiency of these power supplies significantly impacts overall data center power consumption. Since data centers already consume over 2 percent of the world's electricity, developing more efficient power solutions is critical to advancing decarbonization initiatives.

 

Sparking Change With Infineon's AI Power Module

At this year's Applied Power Electronics Conference (APEC), Infineon Technologies unveiled a new product designed to improve PSU efficiency in AI data centers.

The TDM2254xD series of dual-phase AI power modules uses OptiMOS MOSFETs and features a novel package design and a proprietary magnetic structure for enhanced electrical and thermal performance. Specifically, the design, aided by a specially optimized inductor, transfers heat efficiently from the power stage to the heat sink. The modules support peak currents of up to 160 A in compact form factors, enabling vertical power delivery, reducing power delivery losses, and increasing power density.

 


TDM2254xD features.

 

These advancements yield tangible benefits. At full load, the modules reach 89% efficiency, two percentage points higher than conventional solutions, and run 5°C cooler. The company claims that placing the modules near the processor and minimizing power distribution losses makes currents exceeding 2,000 A feasible.
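A two-percentage-point gain sounds modest, but it cuts heat dissipated in the PSU by a much larger fraction. A hypothetical loss calculation makes this concrete (the 1 kW load is an assumption for illustration, and the 87% baseline is inferred from the article's 89% figure and two-point delta):

```python
# Hypothetical PSU loss comparison at full load.
# Assumes a 1 kW output load; baseline efficiency taken as 89% - 2 pts.
load_w = 1_000.0
eff_new, eff_old = 0.89, 0.87

loss_new = load_w / eff_new - load_w   # watts dissipated at 89% efficiency
loss_old = load_w / eff_old - load_w   # watts dissipated at 87% efficiency
reduction = 1 - loss_new / loss_old
print(f"PSU losses drop by ~{reduction:.0%}")   # ~17%
```

Multiplied across the thousands of supplies in a data center running near full load around the clock, this kind of loss reduction is where the fleet-level savings come from.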

The modules, in combination with Infineon's XDP power controller, enable efficient voltage regulation for high-performance computing platforms, offering superior electrical, thermal, and mechanical operation. By leveraging these dual-phase power modules in high-power GPU systems, Infineon believes data centers can effectively meet the increasing power demands driven by AI workloads.

 

Driving the Decarbonization Agenda

Beyond mere efficiency gains, the current development symbolizes a paradigm shift toward holistic data center optimization. By creating more efficient power modules and optimizing performance gains in every aspect of the data center, Infineon hopes to contribute to decarbonization efforts and create a future where AI can be integral to society.