AI model training could consume more than 4 gigawatts of power by 2030—enough to power entire cities—as the energy demands of frontier AI development continue doubling annually, according to a new report from Epoch AI, a research institute studying the trajectory of AI, and the Electric Power Research Institute (EPRI), an independent nonprofit. This exponential growth in power consumption poses significant challenges for utility companies and could derail tech giants’ climate commitments, even as companies explore distributed training and flexible power solutions to manage the unprecedented energy demands.
What you should know: Recent frontier training runs, such as those behind Elon Musk’s Grok models, already draw 100-150 megawatts, and power demands are accelerating rapidly.
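The headline figures imply a simple compounding check: starting from roughly 150 megawatts and doubling every year clears 4 gigawatts within about five years. A minimal sketch of that arithmetic (the 2025 baseline and strict annual doubling are illustrative assumptions, not Epoch AI’s actual methodology):

```python
def projected_power_mw(base_mw: float, start_year: int, target_year: int) -> float:
    """Project power demand under annual doubling: base * 2^(years elapsed)."""
    return base_mw * 2 ** (target_year - start_year)

# Assumed baseline: ~150 MW in 2025 (the high end of current training runs).
demand_mw = projected_power_mw(150, 2025, 2030)
print(f"Projected 2030 demand: {demand_mw / 1000:.1f} GW")  # 4.8 GW
```

At this growth rate the projection lands at 4.8 GW, consistent with the report’s “more than 4 gigawatts” figure.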
Why efficiency gains aren’t helping: Tech companies are reinvesting efficiency improvements into larger-scale operations rather than reducing overall energy consumption.
Potential solutions emerging: Companies are testing distributed training methods and flexible power systems to manage peak demand.
Climate implications: The AI boom is undermining tech companies’ net-zero commitments as they turn to both renewable and fossil fuel sources.
The big picture: This research focused specifically on training new AI models rather than broader AI infrastructure energy use, providing utilities with crucial insights for planning future power capacity and grid management strategies.