Market Insights

The Energy Challenges of AI’s Rapid Ascent

October 27, 2023 by Claire Turvill

With the surge in the popularity of artificial intelligence, a recent study examines the rising electricity demands that come with the technology's rapid expansion.

The remarkable surge of artificial intelligence (AI) during the 2020s will likely stand as a significant milestone for decades to come. The advent of AI has fundamentally reshaped how humans engage with learning, problem-solving, art, communication, and many everyday activities.

 

Generative AI could soon use as much energy as small countries like Belgium. Image used courtesy of NASA

 

Conversations surrounding the potential drawbacks of AI have grown as rapidly as the support it has garnered. These discussions encompass job displacement, ethics, security risks, and transparency.

Those concerns largely focus on the human experience, but questions about AI's environmental impact are also beginning to emerge. Notably, AI-related electricity consumption could soon match the total electricity consumption of some countries.

 

The Growth of AI

The idea of AI has historical roots extending back nearly 3,000 years. In Homer's Odyssey, there are accounts of ships able to perceive and navigate around obstacles and dangers on their own. Greek myths from around 400 BCE introduce Talos, a bronze humanoid figure that served as a guardian along the shores of Crete. A more contemporary literary reference is the "speakwrite" voice-to-text device in George Orwell's 1984.

Traditional AI as we know it, meaning early approaches that primarily used symbolic and rule-based systems to simulate human intelligence, was born in the 1950s. One notable contribution of that decade came from Alan Turing, who had worked on decoding intercepted messages during World War II and in 1950 proposed the "imitation game," a test of a machine's ability to convincingly imitate human intelligence.

While AI has a long history, the present rapid growth of this industry is attributed to generative AI. 

Generative AI is a category of artificial intelligence capable of generating many types of content, including text, audio, code, video, and images. In contrast to conventional AI algorithms, which focus on recognizing patterns in training data and making predictions, generative AI uses the patterns learned from a training data set to produce new outputs.

Programs like ChatGPT, DALL-E, and Midjourney have been rapidly adopted in the last year, marking the most significant technological shift since social media. As of January 2023, ChatGPT had over 100 million monthly users, a milestone it reached faster than Instagram or TikTok.

 

AI Energy Consumption 

The swift integration of AI into our daily lives is creating a potential challenge alongside the global push for sustainable energy practices. A recent study estimates that the AI industry's energy demand will only grow with its rapid adoption, and the projected figures raise significant concerns.

While data center electricity usage increased by only about 6 percent from 2010 to 2018, there is growing concern that the computational resources required to develop and maintain AI models could significantly elevate data centers' contribution to global electricity consumption.

In 2022, global data center electricity consumption was estimated at 240-340 TWh, accounting for about 1.3 percent of global electricity demand.

The study suggests that global AI-related electricity consumption alone could reach 134 TWh by 2027. That is comparable to the annual electricity consumption of countries like Argentina, the Netherlands, and Sweden.

 

Estimates of energy use by AI-powered web searches, based on a 2023 study. Image used courtesy of Joule

 

Looking at more specific scenarios of AI growth, the study considers what would happen if Google, already a prominent AI adopter, were to incorporate ChatGPT-like technology into its roughly 9 billion daily searches. This alone would result in an annual energy consumption of 29.2 TWh, equivalent to the entire electricity usage of Ireland.
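As a rough sanity check, the short Python sketch below works out what those two figures imply per individual query. It uses only the numbers quoted above, so the roughly 8.9 Wh result is an arithmetic implication of those figures rather than a value reported here.

```python
# Back-of-envelope check of the Google scenario: divide the projected annual
# energy by the annual number of searches to get the implied energy per query.
# Both inputs are the figures quoted in the article.

SEARCHES_PER_DAY = 9e9       # daily Google searches cited above
ANNUAL_ENERGY_TWH = 29.2     # projected annual consumption cited above

searches_per_year = SEARCHES_PER_DAY * 365
annual_energy_wh = ANNUAL_ENERGY_TWH * 1e12   # 1 TWh = 10^12 Wh

wh_per_search = annual_energy_wh / searches_per_year
print(f"Implied energy per AI-assisted search: ~{wh_per_search:.1f} Wh")
# Prints roughly 8.9 Wh per search, far more than the fraction of a
# watt-hour commonly attributed to a conventional keyword search.
```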

Nvidia, the dominant supplier of AI hardware, is expected to deliver 100,000 AI servers to customers this year. By 2027, that number is projected to surge to 1.5 million AI servers annually. The anticipated electricity consumption of this extensive server network falls between 85 and 134 TWh per year. At that scale, these servers could significantly impact global data center electricity usage.
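The same kind of arithmetic shows where a range like 85 to 134 TWh can come from. The sketch below assumes, purely for illustration, that all 1.5 million projected servers run continuously at constant power for a full year; the resulting per-server power draw is an implication of the figures above, not a number taken from the study.

```python
# Back-of-envelope: implied continuous power draw per server if 1.5 million
# AI servers consumed 85-134 TWh in a year (the range quoted above).
# Round-the-clock operation at constant power is assumed for illustration only.

SERVERS = 1.5e6
HOURS_PER_YEAR = 365 * 24    # 8,760 hours

for annual_twh in (85, 134):
    annual_wh = annual_twh * 1e12                        # 1 TWh = 10^12 Wh
    kw_per_server = annual_wh / (SERVERS * HOURS_PER_YEAR) / 1e3
    print(f"{annual_twh} TWh/year -> ~{kw_per_server:.1f} kW per server")
# Prints roughly 6.5 kW and 10.2 kW, in line with the rated power draw of
# today's multi-GPU AI servers.
```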

Ultimately, the energy needs within the AI industry are bound to grow as these technologies become increasingly widespread. The study's authors do not offer suggestions for mitigating the increased electricity consumption but highlight the importance of understanding when and where to employ AI technologies.

Asking AI companies to offset the electricity consumption of their data centers with renewable energy would be a logical first step in directly addressing this issue.