Artificial intelligence programs’ impressive (albeit often problematic) abilities come at a cost: all that computing power requires, well, power. And as the world races to adopt sustainable energy practices, the rapid integration of AI into everyday life could complicate matters. New expert analysis now offers estimates of just how energy hungry the AI industry could become in the near future, and the numbers are potentially concerning.

In a commentary published October 10 in Joule, Vrije Universiteit Amsterdam business and economics PhD candidate Alex de Vries argues that global AI-related electricity consumption could top 134 TWh annually by 2027. That’s roughly comparable to the annual consumption of nations like Argentina, the Netherlands, and Sweden.

Although de Vries notes that data center electricity usage grew by only about 6 percent between 2010 and 2018 (excluding resource-guzzling cryptocurrency mining), he writes that “[t]here is increasing apprehension that the computation resources necessary to develop and maintain AI models and applications could cause a surge in data centers’ contribution to global electricity consumption.” Given countless industries’ embrace of AI over the last year, it’s not hard to imagine such a surge becoming reality. If Google, already a major AI adopter, integrated technology akin to ChatGPT into its roughly 9 billion searches per day, for example, the company could burn through 29.2 TWh of power annually, as much electricity as all of Ireland.
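A quick back-of-the-envelope check puts that scenario in perspective. This is not de Vries’ own calculation, just a sketch that works backward from the two figures above, assuming roughly 9 billion searches per day:

# Illustrative only, not de Vries' method: what per-search energy is implied
# by 29.2 TWh per year spread across ~9 billion Google searches per day?
SEARCHES_PER_DAY = 9e9           # figure cited above
ANNUAL_CONSUMPTION_TWH = 29.2    # figure cited above

searches_per_year = SEARCHES_PER_DAY * 365
wh_per_search = ANNUAL_CONSUMPTION_TWH * 1e12 / searches_per_year
print(f"Implied energy per search: {wh_per_search:.1f} Wh")  # ~8.9 Wh

Nearly 9 watt-hours per query is why bolting generative AI onto search at Google’s scale adds up so quickly.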

De Vries, who also founded the digital trend watchdog Digiconomist, believes such an extreme scenario is somewhat unlikely, mainly due to the cost of AI servers and ongoing supply chain bottlenecks. But the AI industry’s energy needs will undoubtedly continue to grow as the technologies become more prevalent, and that alone necessitates a careful review of where and when to use such products.

NVIDIA, for example, is expected to deliver 100,000 AI servers to customers this year. Operating at full capacity, those servers would draw a combined 650 to 1,020 MW of power, amounting to 5.7 to 8.9 TWh of electricity consumption annually. Compared with data centers’ overall annual consumption, that figure is “almost negligible.”
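The conversion behind those figures is simple to check. A minimal sketch, assuming the servers run around the clock at full capacity:

HOURS_PER_YEAR = 24 * 365  # 8,760 hours

def mw_to_twh_per_year(power_mw):
    """Convert a constant power draw in MW into annual energy use in TWh."""
    return power_mw * HOURS_PER_YEAR / 1e6  # 1 TWh = 1,000,000 MWh

low, high = mw_to_twh_per_year(650), mw_to_twh_per_year(1020)
print(f"{low:.1f} to {high:.1f} TWh per year")  # ~5.7 to 8.9 TWh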

By 2027, however, NVIDIA could be shipping 1.5 million AI servers per year, a pace it currently appears to be on track to reach. Applying similar per-server electricity consumption rates puts their combined demand at 85 to 134 TWh annually. “At this stage, these servers could represent a significant contribution to worldwide data center electricity consumption,” writes de Vries.
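Scaling the same arithmetic up to that shipment volume reproduces the 2027 range. Again, this is an illustrative sketch using the article’s figures, not de Vries’ exact model:

# Scale the 100,000-server estimate (650-1,020 MW at full capacity) up to
# 1.5 million servers shipped per year, the 2027 scenario described above.
HOURS_PER_YEAR = 24 * 365
low_per_100k_twh = 650 * HOURS_PER_YEAR / 1e6     # ~5.7 TWh per 100,000 servers
high_per_100k_twh = 1020 * HOURS_PER_YEAR / 1e6   # ~8.9 TWh per 100,000 servers

scale = 1_500_000 / 100_000   # 15x more servers per year by 2027
print(f"~{low_per_100k_twh * scale:.0f} to {high_per_100k_twh * scale:.0f} TWh per year")
# ~85 to 134 TWh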

As de Vries’ own site argues, AI is not a “miracle cure for everything,” and it still must contend with privacy concerns, discriminatory biases, and hallucinations. “Environmental sustainability now represents another addition to this list of concerns.”