Power required by AI datacenters in the US may be more than 30 times greater in a decade, with 5 GW facilities already in the pipeline.
A Deloitte Insights report, "Can US infrastructure keep up with the AI economy?" paints a picture of ever larger datacenters burning ever more energy, while the grid infrastructure and new power sources are stuck in the slow lane due to factors such as bureaucracy and supply chain disruption.
Potential solutions to this problem hinge on hopes for tech innovation making AI infrastructure more energy efficient, regulatory changes, and a massive funding injection.
By now, anyone in the tech industry is likely aware that all the hype surrounding AI is driving a boom in infrastructure-building to develop and deploy ever more sophisticated AI models.
Deloitte estimates that US datacenters last year drew about 33 GW of power in total, with AI facilities accounting for 4 GW of that (about one-eighth). Looking ahead to 2035, it forecasts that overall datacenter power demand will be more than five times larger at 176 GW, while the AI datacenter power draw will increase more than thirtyfold to 123 GW, accounting for 70 percent of that total.
And there won't just be more bit barns; they'll also be bigger. The largest US datacenters operated by the big three hyperscalers currently draw less than 500 MW, according to Deloitte, but some under construction now are likely to top 2 GW.
There are also 50,000-acre datacenter campuses now in the early stages of the planning process that could consume as much as 5 GW, or about as much as 5 million homes (yes, we know some people dispute that kind of calculation).
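As a rough sanity check on those multipliers, here's a back-of-the-envelope sketch in Python. The gigawatt figures come straight from the Deloitte numbers quoted above; the roughly 1 kW average per-home draw implied by the 5 million homes comparison is our own working assumption, and exactly the sort of figure people argue over.

```python
# Back-of-the-envelope check of the Deloitte figures quoted above.
# All values are average power draw in gigawatts (GW).

total_2024 = 33.0   # all US datacenters last year
ai_2024 = 4.0       # AI datacenters last year
total_2035 = 176.0  # Deloitte's 2035 forecast, all datacenters
ai_2035 = 123.0     # Deloitte's 2035 forecast, AI datacenters

print(f"AI share today:        {ai_2024 / total_2024:.0%}")      # ~12%, about one-eighth
print(f"Total growth to 2035:  {total_2035 / total_2024:.1f}x")  # ~5.3x
print(f"AI growth to 2035:     {ai_2035 / ai_2024:.1f}x")        # ~30.8x, "more than thirtyfold"
print(f"AI share in 2035:      {ai_2035 / total_2035:.0%}")      # ~70%

# The "5 GW campus is about 5 million homes" comparison implies an
# average household draw of roughly 1 kW -- an assumption on our part,
# and the kind of calculation some readers dispute.
campus_gw = 5.0
homes = 5_000_000
print(f"Implied per-home draw: {campus_gw * 1e9 / homes / 1e3:.1f} kW")
```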
In short, demand for energy is only going to go up. And the extra demand from datacenter growth over the past year has primarily been met with more gas-fired electricity generation, Deloitte says, despite the much-vaunted clean energy and net-zero goals of the hyperscale operators.
The report warns there is currently a seven-year wait on some requests for connection to the grid, and that power generation development typically takes longer than datacenter buildouts. The latter can be completed in a few years, while gas power plant projects without existing equipment contracts are not expected to come online until the next decade.
Supply chain issues are constraining energy companies and hyperscalers alike, as many critical components are imported and now subject to tariffs. Both industries face potential increases in the cost of steel, aluminum, copper, and cement that could impact the build-out of power infrastructure.
Deloitte surveyed datacenter operators and power companies to ask what strategies could overcome the potential energy gap caused by the bit barn build boom. The three top responses were technological innovation, regulatory changes, and more funding.
The technological innovation approach is basically hoping that the industry can develop more power-efficient infrastructure, such as switching to optical data transmission in server rooms or using solid-state transformers within the grid.
Regulatory reform might involve reprioritizing connection queues and scrapping speculative projects to streamline the process, moving grid operators to a "first-ready, first-served" approach.
The UK has faced similar issues, prompting energy regulator Ofgem to bring in a revised queue management system designed to expunge "zombie projects" and accelerate connections for viable ones.
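For the curious, here's a minimal, hypothetical sketch of what that queueing change amounts to. The project names, dates, and the `ready` field are invented purely for illustration: a first-come, first-served queue processes connection requests in submission order, while a first-ready, first-served queue drops speculative applications and lets shovel-ready projects connect sooner.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical grid-connection requests; all fields and values are invented
# purely to illustrate the two queueing policies discussed above.
@dataclass
class ConnectionRequest:
    name: str
    submitted: date      # when the request entered the queue
    ready: date | None   # projected ready-to-connect date; None = speculative

queue = [
    ConnectionRequest("SpeculativeSolarCo", date(2021, 3, 1), None),
    ConnectionRequest("GasPeakerLLC",       date(2022, 6, 1), date(2027, 1, 1)),
    ConnectionRequest("HyperscaleDC",       date(2023, 9, 1), date(2026, 6, 1)),
]

# First-come, first-served: order purely by submission date, so stalled
# "zombie" projects hold up everyone queued behind them.
fcfs = sorted(queue, key=lambda r: r.submitted)

# First-ready, first-served: drop projects with no credible ready date,
# then order the rest by how soon they can actually connect.
frfs = sorted((r for r in queue if r.ready is not None), key=lambda r: r.ready)

print([r.name for r in fcfs])  # ['SpeculativeSolarCo', 'GasPeakerLLC', 'HyperscaleDC']
print([r.name for r in frfs])  # ['HyperscaleDC', 'GasPeakerLLC']
```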
Ultimately, it boils down to a need for more money. "Developing additive infrastructure will require massive funding across all of the industries involved," Deloitte states.
It notes that the impact of AI datacenter development is already becoming more apparent in investment discussions, opening the door to new funding opportunities.
If no solution is found, the risks are that power and grid capacity constraints could hamstring AI advancement in the US (which would be just awful), and that power companies could miss an opportunity to expand and modernize the grid.
"These developments could jeopardize US economic and geopolitical leadership. Staking an infrastructural lead in powering AI may now be a matter of competitiveness and even national security," Deloitte concludes. ®