Tech Insights

How AI Creates Energy Costs and Optimization Opportunities

September 29, 2023 by John Nieman

Recent advancements in generative AI technology are creating new tools for many industries, but there will be significant energy cost increases as this technology is adopted. AI also creates the potential for new energy savings to counterbalance these costs. 

In the current cultural moment, the potential of AI seems limitless. The possibilities transcend any particular field, industry, or academic discipline because large language models like ChatGPT can write code, answer questions, and even compose Shakespearean sonnets on any topic. But electricity and energy are not limitless, and as generative artificial intelligence (GenAI) expands, so will the need for power to support all of the processing and computation behind it.

 


Increased use of generative AI is putting a strain on power resources. Image used courtesy of Adobe Stock

 

At the same time, GenAI can be used to optimize power plants, reducing consumption of resources and resulting in energy savings. 

 

GenAI Growth and Rising Costs

The energy demands and financial costs of GenAI growth are already becoming apparent. If growth continues at its current rate, infrastructure and operating costs are estimated to exceed $76 billion by 2028. As a benchmark, that sum is more than double the annual operating costs of Amazon’s cloud services.

 


Projected data processing costs associated with GenAI. Image used courtesy of Forbes

 

And this cost burden is not limited to the initial training period of GenAI, as some might assume. There are unique energy demands for both GenAI development and subsequent GenAI usage. 

 

GenAI Development and Electricity Demand  

Training a large language model like ChatGPT is a complex process: massive amounts of text and data are fed into the model, which assimilates them in order to provide accurate outputs in response to user queries. That complexity is quantifiable from an energy perspective.

Training just one language model can create up to 552 metric tons of carbon emissions, roughly the equivalent of driving a passenger vehicle 1.24 million miles, according to a 2019 study by the University of Massachusetts Amherst.
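For a rough sense of how that comparison works out, the conversion is simple arithmetic. The short sketch below assumes an average passenger-vehicle emission factor of about 445 grams of CO2 per mile; that factor is an illustrative assumption, not a figure taken from the study.

```python
# Back-of-envelope check: convert training emissions into equivalent miles driven.
# The emission factor (~445 g CO2 per mile) is an assumed passenger-vehicle average,
# used here only for illustration.

training_emissions_tonnes = 552        # metric tons of CO2 cited above
grams_per_tonne = 1_000_000
co2_grams_per_mile = 445               # assumed emission factor

equivalent_miles = training_emissions_tonnes * grams_per_tonne / co2_grams_per_mile
print(f"Equivalent driving distance: {equivalent_miles / 1e6:.2f} million miles")
# Prints roughly 1.24 million miles, consistent with the comparison above.
```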

Given the competitive nature of the AI market, other companies are likely to develop models of their own. Google has already created the large language model Bard, a competitor to ChatGPT, and more will follow.

The electricity demand does not stop once the model is trained and developed. Every time a user asks the model to respond to prompts, generate text, or produce information, electricity is needed to support the process.  

This deployment phase of GenAI might initially seem less energy intensive, given the small amount of computing power needed for an individual inference, the term for a single response the model generates from a user prompt. But the cumulative impact of countless users is yet to be seen.
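To see why that cumulative impact matters even when a single inference is cheap, consider a rough back-of-envelope sketch. Both numbers below, the per-query energy and the daily query volume, are hypothetical values chosen purely to illustrate the scaling, not measurements of any real service.

```python
# Illustrative estimate of cumulative inference energy.
# Both inputs are hypothetical values chosen only to show how quickly
# small per-query costs add up; they are not measurements of any real service.

energy_per_query_wh = 3.0          # assumed watt-hours per individual inference
queries_per_day = 100_000_000      # assumed daily query volume across all users

daily_energy_mwh = energy_per_query_wh * queries_per_day / 1_000_000
annual_energy_gwh = daily_energy_mwh * 365 / 1_000

print(f"Daily inference energy:  {daily_energy_mwh:,.0f} MWh")
print(f"Annual inference energy: {annual_energy_gwh:,.0f} GWh")
# Even a few watt-hours per query scales to hundreds of MWh per day.
```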

As millions of students learn that GenAI can offer surprisingly helpful assistance with their studies and employers discover that it can take on various forms of labor, the number of inferences will skyrocket in the coming years. The user adoption rate of ChatGPT alone is astonishing: it took Netflix over three years to reach one million users after its launch, while ChatGPT gained over a million users in five days.

 

Using AI to Reduce Energy Consumption

Even though AI inevitably creates energy costs and GenAI requires significant amounts of electrical power to function, it is short-sighted to consider these costs in a vacuum. AI will also create energy-saving opportunities that can offset them, reducing its overall carbon footprint.

AI is already being used to optimize solar power generation, monitor the electrical grid, improve energy management, and streamline agricultural operations. More specifically, AI and deep learning can predict the causes of grid failures, enabling valuable preventive maintenance. One recent study found that a deep learning regression model could correctly identify the causes of large-scale power system failures 92.13% of the time.
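The study itself does not publish its model, but a minimal sketch of the general approach might look like the following: a small neural network trained on historical grid measurements to predict a failure-cause label. The features, synthetic data, and architecture here are assumptions for illustration only and are not drawn from the cited work.

```python
# Minimal sketch: predicting the cause of a grid failure from sensor features.
# The feature set, synthetic data, and model choice are illustrative assumptions;
# they are not taken from the study referenced above.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Synthetic stand-in for historical grid measurements,
# e.g. load, voltage deviation, line temperature, weather severity.
X = rng.normal(size=(2000, 4))

# Synthetic stand-in for failure-cause labels
# (0 = equipment fault, 1 = weather-related, 2 = overload).
y = ((X[:, 1] + 0.5 * X[:, 3] > 0).astype(int)
     + (X[:, 0] > 1.0).astype(int))

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

# Scale inputs and fit a small feed-forward network.
scaler = StandardScaler().fit(X_train)
model = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=1000, random_state=0)
model.fit(scaler.transform(X_train), y_train)

accuracy = model.score(scaler.transform(X_test), y_test)
print(f"Held-out accuracy on synthetic data: {accuracy:.2%}")
```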

It is impossible to quantify the exact potential benefits across all industries, but a study by the International Energy Agency estimates that AI will be responsible for a 15% global energy savings by 2040. 

The AI frontier is an exciting one that will undoubtedly generate new costs as computing resources grow to accommodate its use. But AI also offers a wide variety of energy-reduction opportunities across industries to help counterbalance those costs.