The tech giant revealed agreements with Indiana Michigan Power and the Tennessee Valley Authority to reduce energy consumption from machine learning workloads during peak demand periods at its regional data centers.
“This builds on our successful demonstration with Omaha Public Power District (OPPD), where we reduced the power demand associated with ML workloads during three grid events last year,” Michael Terrell, Google’s head of advanced energy, said in a company blog post.
The initiative involves shifting non-urgent computing tasks—such as processing YouTube videos—away from periods when electrical grids face peak strain. Google leverages these demand management capabilities to help grid operators maintain system reliability during high-demand periods.
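The mechanics are simple in principle: a scheduler checks whether the local utility has called a demand-response event and, if so, pushes deferrable batch work past the peak window while leaving latency-sensitive serving untouched. The sketch below is purely illustrative and is not Google's implementation; the helpers `grid_event_active`, `Task`, and the two-hour deferral window are assumptions for the example.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Task:
    name: str
    urgent: bool          # latency-sensitive work (e.g., serving) cannot be deferred
    run_at: datetime      # earliest time the task may start

def grid_event_active() -> bool:
    """Hypothetical check against a utility demand-response signal."""
    return True  # stand-in: assume the utility has called a peak event

def schedule(tasks: list[Task], defer_by: timedelta = timedelta(hours=2)) -> list[Task]:
    """Push non-urgent batch work (e.g., video processing) past the assumed peak window."""
    if not grid_event_active():
        return tasks
    return [
        t if t.urgent else Task(t.name, t.urgent, t.run_at + defer_by)
        for t in tasks
    ]

if __name__ == "__main__":
    now = datetime.now()
    jobs = [Task("search-serving", True, now), Task("video-transcode", False, now)]
    for t in schedule(jobs):
        print(f"{t.name}: start at {t.run_at:%H:%M}")
```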
“As AI adoption accelerates, we see a significant opportunity to expand our demand response toolkit, develop capabilities specifically for ML workloads, and leverage them to manage large new energy loads,” Terrell said. “By including load flexibility in our overall energy plan, we can manage AI-driven growth even where power generation and transmission are constrained.”
Industry experts emphasize the critical nature of such demand-side solutions as AI development accelerates nationwide.
“Demand-side solutions are absolutely critical for aligning growth with grid reliability,” said Pete DiSanto, senior vice president of data centers at Enchanted Rock, a Houston-based electrical resiliency company.
“Without demand-side solutions, the grid simply won’t be able to keep up with the scale and speed of AI data center growth, especially in regions already facing capacity and interconnection challenges. These tools are the key to enabling rapid expansion without breaking the grid.”
Power Capacity Expected to Triple by 2030
The expansion comes as industry forecasts project dramatic increases in data center electricity consumption. Morningstar Research Services estimates U.S. data center power capacity will approximately triple to 80 gigawatts by 2030, driven primarily by generative AI applications.
However, Morningstar’s projections remain more conservative than other industry estimates predicting 100-gigawatt capacity during the same timeframe.
“We believe such forecasts overlook the practical limitations associated with building large-scale infrastructure and also underestimate the long-term rising energy efficiency of AI chips,” the firm noted in its July report “Powering Tomorrow’s AI Data Center.”
Energy infrastructure experts warn of potential grid failures without proactive management strategies.
“We don’t have sufficient generating capacity for both the existing energy loads and AI data centers,” said Rob Enderle, president and principal analyst with the Enderle Group, a Bend, Oregon-based advisory firm.
“It is critical that demand-side solutions be created to mitigate what would otherwise be long periods of brownouts or full outages to keep the grid from failing catastrophically, requiring much of it to be rebuilt.”
Mark N. Vena, president and principal analyst with SmartTech Research in Las Vegas, agreed. “As AI workloads explode, the electrical grid simply can’t keep up unless demand can flex in real time,” he said. “Demand-side strategies like shifting compute loads or pausing non-urgent processes help avoid blackouts while still meeting data center needs.”
Flexibility Essential for Data Centers
Demand-side solutions aren’t just important; they’re becoming a prerequisite for growth, maintained Wyatt Mayham, head of AI consulting at Northwest AI Consulting (NAIC), a global provider of AI consulting services.
“Demand-response agreements allow data centers to act like a virtual power plant, providing grid stability that utilities desperately need,” he said. “For the data center, it’s a new revenue stream and, more importantly, a ticket to the front of the line for power allocation.”
Looking Ahead
As the artificial intelligence revolution continues to reshape technology infrastructure demands, Google’s expanded energy management initiatives signal a broader industry shift toward grid-conscious operations.
The success of these demand response programs could determine whether the rapid expansion of AI capabilities can coexist with existing power infrastructure or whether more dramatic grid investments will be required to prevent widespread energy disruptions across the United States.
Google’s data center electricity demand grew by 27% in 2024, up from the 17% growth reported a year earlier, according to its latest Environmental Report.
However, the tech giant reported that it reduced its data center energy emissions by 12% in 2024 compared to the previous year, marking the first year-on-year reduction since 2019.
According to Google, much of the reduction resulted from more than 25 contracted clean energy projects coming online over the calendar year.

