AI systems may devour 49% of data center energy by 2025, study finds
What is the cost of powering innovation through AI?
Servers in a data center in northern France (AFP)
A recent study has estimated that artificial intelligence systems could consume up to 49% of global data center electricity by the end of 2025, The Guardian reported. The analysis, conducted by Alex de Vries-Gao, founder of the Digiconomist tech sustainability platform, is set to be published in the journal Joule.
The findings coincide with a separate warning from the International Energy Agency (IEA), which projects that AI’s total energy demand could rival Japan’s current electricity consumption by decade’s end.
De Vries-Gao’s analysis builds on current data center energy statistics, including the 415 terawatt-hours (TWh) used globally in 2023, excluding cryptocurrency mining. His estimates suggest that AI systems already account for 20% of that total and are on track to draw 23 gigawatts (GW) of power by late 2025, more than double the total power usage of the Netherlands.
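The study’s headline figures can be sanity-checked with back-of-the-envelope arithmetic. The sketch below uses only the numbers reported in the article (the 415 TWh baseline, the 20% share, and the 23 GW projection); converting a sustained power draw in GW to annual energy in TWh is a simple multiplication by hours per year:

```python
# Back-of-the-envelope check of the figures as reported in the article.
HOURS_PER_YEAR = 24 * 365  # 8,760 hours

# 2023 global data center electricity use, excluding crypto mining (per the study)
datacenter_twh_2023 = 415

# AI's estimated current share of that total
ai_share = 0.20
ai_twh_2023 = datacenter_twh_2023 * ai_share  # energy attributed to AI in 2023

# Projected AI power draw by late 2025: 23 GW sustained.
# If held continuously for a year, that draw corresponds to:
ai_power_gw_2025 = 23
ai_twh_2025 = ai_power_gw_2025 * HOURS_PER_YEAR / 1000  # GW·h converted to TWh

print(f"AI share of 2023 data center use: {ai_twh_2023:.0f} TWh")
print(f"23 GW sustained for a year: {ai_twh_2025:.0f} TWh")
```

This yields roughly 83 TWh for AI’s 2023 share and about 201 TWh for a year of 23 GW sustained draw, which is consistent with the study’s claim that AI could approach half of global data center electricity consumption.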
Nvidia, AMD chips drive soaring AI energy consumption
The study focuses heavily on the energy impact of Nvidia and AMD chips, the core processors used to train and run AI models. De Vries-Gao also accounts for hardware from other manufacturers like Broadcom, along with the energy-intensive infrastructure required to cool and support the servers processing high-volume AI workloads, according to The Guardian.
As AI systems scale and models become more complex, data centers are absorbing growing volumes of electricity, The Guardian reported, citing the study. Their rapid expansion has prompted concern over whether the industry's green energy goals can withstand AI’s rising demands.
Efficiency gains may spur more usage
While technological advances could improve energy efficiency, De Vries-Gao cautions that any such gains might simply accelerate AI adoption across sectors. He points to early signs of demand tapering in certain consumer-facing tools like ChatGPT but warns that countries attempting to build their own AI systems, in a trend known as “sovereign AI”, could also increase hardware demand.
According to The Guardian, export restrictions and geopolitical tensions are also shaping AI infrastructure. China’s restricted access to advanced chips, for instance, reportedly led to the development of more efficient models like DeepSeek R1, which used fewer processors.
“These innovations can reduce the computational and energy costs of AI,” De Vries-Gao notes, “but they can also lead to higher usage overall,” as reported by The Guardian.
Fossil fuels power future data centers, analysts warn
The growing energy demand has raised concerns about the carbon footprint of new AI infrastructure. De Vries-Gao cites the case of Crusoe Energy, a US-based data center startup that has secured 4.5 GW of gas-powered energy for its operations, according to The Guardian. OpenAI, creator of ChatGPT, is reportedly a potential customer through its Stargate joint venture.
“There are early indications that these Stargate datacentres could exacerbate dependence on fossil fuels,” De Vries-Gao writes.
Just this week, OpenAI announced that its first Stargate facility outside the US will be built in the United Arab Emirates, an oil-rich nation with a complex energy profile. Both Microsoft and Google admitted last year that their AI initiatives risk derailing corporate environmental targets.
Experts call for greater energy transparency in AI industry
Information about the true scale of AI’s energy consumption remains difficult to obtain, with De Vries-Gao describing the sector as increasingly “opaque”. While the EU AI Act mandates disclosure of the energy used to train a model, it does not require reporting for day-to-day operations, a gap that sustainability advocates argue must be addressed, according to The Guardian.
Professor Adam Sobey, mission director for sustainability at the UK’s Alan Turing Institute, emphasized the need for transparency and balance. He acknowledged AI’s potential to optimize high-emission sectors like transport and utilities but warned against allowing front-end energy use to spiral unchecked.
“I suspect that we don’t need many very good use cases to offset the energy being used,” Sobey said, “but without transparency, we’re flying blind.”