AI will require even more energy than we thought

Forecasts suggest new power plants, often relying on fossil fuels, will be needed to feed AI’s energy demands.
Mack DeGeurin
Advanced AI models are driving up data center energy demands.

It’s no secret at this point that popular generative AI tools like OpenAI’s ChatGPT have a hefty data appetite. The billions, and sometimes trillions, of parameters needed to train these models are housed in massive data centers that use electricity for cooling and processing power. But new forecasts suggest increasing demand for ever-more-powerful AI models could stretch current energy supplies further than we once thought. In the US alone, according to a new report released by the Electric Power Research Institute (EPRI), data centers tasked with powering advanced AI models could account for up to 9.1% of the country’s overall energy demand by the end of the decade. Much of that new demand may be met by non-renewable natural gas, which could complicate global efforts to reduce carbon emissions.

The EPRI’s analysis warns that widespread adoption of generative AI tools in coming years could result in a “step change in power requirements.” By 2030, the report notes, data center energy requirements could account for anywhere between 4.6% and 9.1% of total US electricity generated. That’s compared to 4% today. The newfound demand isn’t limited to the US either. The International Energy Agency (IEA) estimates global data center energy demand could double by 2026.

Much of that predicted rise in demand, the report notes, stems from uniquely power-intensive generative AI models. EPRI estimates a simple query to OpenAI’s ChatGPT requires around 10 times as much electricity as a typical Google search. That wide disparity is likely due to the vast amount of training data and computing power required to make these models perform as intended. And that’s just for text responses. Emerging generative AI audio and video models like OpenAI’s Sora generate amounts of data that “have no precedent,” according to the report. One thing that seems clear: AI is responsible for driving data centers’ growing energy demands. A recent forecast released by financial giant Goldman Sachs predicts AI alone will account for 19% of data centers’ power demands by 2028.

Data centers are turning to fossil fuels to meet short-term energy demands

Power-hungry data centers threaten to place real strains on energy grids in coming years. As of 2024, according to the Goldman Sachs forecast, data centers account for between 1% and 2% of global power demand. That figure is expected to increase to 3–4% by the end of the decade. In the US, which maintains roughly half of the world’s data centers, these facilities are expected to account for 8% of the nation’s overall electricity demand by 2030. Energy providers are already rushing to bring new power plants online to ensure those brewing energy demands are met. The Goldman Sachs forecast estimates more than half (60%) of the energy used to meet those demands will come from nonrenewable resources. That forecast reinforces previous reports suggesting renewable resources alone might be insufficient to meet data centers’ energy needs.

[ Related: Sam Altman: Age of AI will require an ‘energy breakthrough’ ]

The new energy requirements also further complicate past statements from tech leaders like OpenAI’s Sam Altman, who have suggested powerful AI models could play a role in reducing greenhouse gas emissions in the long run. Altman, who previously said the age of powerful AI would require an “energy breakthrough,” was reportedly among a handful of prominent Silicon Valley figures who recently invested $20 million in Exowatt, a startup attempting to harness solar energy to power AI data centers.

But data centers and energy providers don’t necessarily need to wait for a technological silver bullet to address some of AI’s energy dilemmas. In its report, the EPRI called on data centers to investigate ways to increase internal efficiency by reducing the amount of electricity used for cooling and lighting. Cooling alone reportedly accounts for around 40% of data center energy use. Additionally, the EPRI notes backup generators powered by renewable energy sources could also play a role in supporting more reliable, sustainable energy grids.

“Shifting the data center-grid relationship from the current ‘passive load’ model to a collaborative ‘shared energy economy’” the report notes, “could not only help electric companies contend with the explosive growth of AI but also contribute to affordability and reliability for all electricity users.”