AMD Releases Study Detailing Datacenter Energy Use In Five Regions Across Globe

December 16, 2007 by Jeff Shepard

Advanced Micro Devices (AMD) released a study revealing shifting patterns in worldwide datacenter energy use at regional levels. The study, which was conducted by Jonathan Koomey, Ph.D. (project scientist at the Lawrence Berkeley National Laboratory and a consulting professor at Stanford University), documents energy use across five regions: the United States, Western Europe, Japan, Asia/Pacific (excluding Japan) and the rest of the world. The new study forecasts datacenter energy consumption, estimating that by 2010 the U.S. share of worldwide consumption will decline.

The study forecasts that, based on current growth trends, the U.S. share of total world server electricity use from datacenters will likely decline from 40% in 2000 to about one-third by 2010, while the Asia/Pacific region (excluding Japan) will increase its share from 10% to about 16% over that period. Under this scenario, absolute electricity consumption for servers in the Asia/Pacific region would more than double from 2005 to 2010, requiring new electricity capacity equal to the output of two 1000 MW power plants. For the entire world, growth in server consumption from 2005 to 2010 would require additional capacity equal to more than ten 1000 MW power plants.

"Our hope is that this research helps bridge the gap between knowledge and action by furthering worldwide understanding of the economic and environmental costs associated with escalating datacenter energy consumption," said Bruce Shaw, Director, Server and Workstation Marketing, AMD. "According to a recent U.S. EPA Report, datacenter energy consumption in the United States five years from now could be cut by as much as 20% with relatively minor efforts by datacenter managers, including turning on available power management features, enabling higher rates of resource consolidation, shutting off unused servers and improving infrastructure operations."

Dr. Koomey’s report shows that electricity used by servers in the United States and Western Europe currently comprises about two-thirds of the world’s total, with Japan, Asia/Pacific and the rest of the world each accounting for between 10% and 15% of the total. Examining electricity use by region from 2000 to 2005, the study found that server electricity use in the Asia/Pacific region (excluding Japan) grew at a 23% annual rate, compared to a world average of 16% a year, making it the only region with server electricity use growing significantly faster than the world average. The Western European growth rate of 17% was slightly above the world average, while growth rates in the other regions were below it.
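As a quick sanity check, the compounding implied by these annual rates can be worked out directly. A minimal sketch, assuming (as the study's "current growth trends" scenario does) that the 2000–2005 rates simply continue through 2005–2010:

```python
# Compound-growth check using the annual rates quoted in the article.
# Assumption: the 2000-2005 rates are held constant for 2005-2010.
def growth_multiplier(annual_rate, years):
    """Total growth factor for a constant annual rate over `years` years."""
    return (1 + annual_rate) ** years

# Asia/Pacific (excluding Japan): 23% per year over five years
asia_pacific = growth_multiplier(0.23, 5)   # ~2.8x, i.e. "more than double"

# World average: 16% per year over the same period
world = growth_multiplier(0.16, 5)          # ~2.1x

print(f"Asia/Pacific 2005-2010 multiplier: {asia_pacific:.2f}")
print(f"World 2005-2010 multiplier: {world:.2f}")
```

At 23% a year, Asia/Pacific server electricity use would grow roughly 2.8-fold over five years, consistent with the study's finding that consumption in the region would more than double by 2010.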

Datacenters throughout the world are designed and operated in similar ways to those in the United States. Accordingly, if the 20% savings estimated in the EPA report are applied to Dr. Koomey’s projections for global datacenter electricity use in 2010, the total savings would equal the output of approximately five 1000 MW power plants. In other words, relatively modest changes in the way datacenters are designed and operated could offset approximately half of the expected growth in global datacenter electricity use by 2010.
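The arithmetic behind the "approximately half" claim can be sketched in plant-equivalents. The figures (ten plants of growth, 20% savings, five plants saved) come from the article itself; expressing electricity demand in 1000 MW plant-equivalents is a simplification for illustration:

```python
# Back-of-envelope check of the offset claim, in 1000 MW plant-equivalents.
# Figures are taken from the article; the unit conversion is a simplification.
growth_plants = 10       # added world server demand 2005-2010, per the study
savings_fraction = 0.20  # EPA-estimated achievable savings in 2010
savings_plants = 5       # savings reported in the article, in plant-equivalents

# If 20% of projected 2010 use equals ~5 plant-equivalents, the projection
# implies a 2010 total of about 25 plant-equivalents of server demand.
projected_2010 = savings_plants / savings_fraction

fraction_of_growth_offset = savings_plants / growth_plants
print(fraction_of_growth_offset)  # 0.5 -> "approximately half" the growth
```

Five plant-equivalents of savings against ten plant-equivalents of growth is exactly the one-half offset the study describes.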

"With the findings released today we can begin to take next steps, including examining how we can power datacenters around the world while addressing impacts on global climate," said Larry Vertal, Senior Strategist for AMD Green. "For example, coal currently provides 25% of global primary energy needs and generates 40% of the world’s electricity. Clearly, we must work harder than ever to not only deliver more efficient server and cooling technology, but also just as importantly, to work with our industry and government partners to develop environmentally sustainable solutions in areas where we see the most dramatic increases in energy use."

This new research adds detail to an AMD-sponsored study published in February that identified the worldwide costs associated with datacenter energy use, finding that in 2005 total datacenter electricity consumption in the United States (including servers, cooling and auxiliary equipment) was approximately 45 billion kWh, resulting in utility bills totaling $2.7 billion. That study estimated the world’s total datacenter electricity consumption to cost $7.2 billion annually. Both of Dr. Koomey’s studies were peer reviewed by IT industry, government and energy efficiency policy professionals.