In the battle for AI dominance, American tech giants have the financial resources and the chips, but they’re now facing a new challenge: energy supply.
Microsoft CEO Satya Nadella recently admitted on a podcast with OpenAI’s Sam Altman that the main obstacle is no longer a shortage of computing power but rather the energy needed to rapidly build out and run these systems.
“If we can’t source power efficiently, we might end up with a bunch of unconnected chips sitting idle,” Nadella explained.
Much like the 1990s internet infrastructure boom, today’s tech leaders are investing massive sums—around $400 billion in 2025 alone, with even larger figures projected for 2026—to develop the hardware backbone for AI advancement.
These investments are helping to solve initial bottlenecks, such as acquiring the millions of chips necessary for high-powered computing, while companies accelerate their own processor manufacturing efforts to compete with Nvidia, the current leader in the field.
These processors are used in the massive data centers that also demand enormous water resources for cooling. Building these centers takes an average of two years in the U.S., and connecting new high-voltage power lines can take five to ten years.
Energy Challenges
Major tech companies in Silicon Valley, often called “hyperscalers,” anticipated these energy constraints.
Just a year ago, Dominion Energy in Virginia had a data center pipeline of 40 gigawatts, roughly the output of 40 nuclear reactors. Now, that capacity has increased to 47 gigawatts, reflecting the rapid expansion of cloud computing hubs in the region.
Data centers in the U.S. are already linked to rising electricity bills and could consume 7% to 12% of national electricity by 2030, up from 4% today, though some experts warn predictions might be overstated.
“Both utility companies and tech firms have a vested interest in pushing these growth estimates for electricity use,” warned UC Berkeley expert Jonathan Koomey.
As in the dot-com bubble of the late 1990s, many announced data centers may never actually be built.
Potential Power Shortage
If the growth projections are accurate, a 45-gigawatt energy deficit could occur by 2028—equivalent to the electricity needs of roughly 33 million American households.
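Household-equivalent figures like this depend heavily on the average consumption assumed per home. A minimal Python sketch of the conversion (the per-household kWh figures below are illustrative assumptions, not from the article; the article’s 33 million implies an assumption near the higher one):

```python
def households_powered(deficit_gw: float, annual_kwh_per_household: float) -> float:
    """Number of households whose yearly demand a continuous power deficit could cover."""
    hours_per_year = 8760
    # GW -> kW, then multiply by hours to get kWh per year
    deficit_kwh_per_year = deficit_gw * 1e6 * hours_per_year
    return deficit_kwh_per_year / annual_kwh_per_household

# Sensitivity to the assumed average household consumption:
for kwh in (10_500, 11_900):  # both values are assumptions for illustration
    print(f"{kwh} kWh/yr -> {households_powered(45, kwh) / 1e6:.1f} million households")
```

With roughly 10,500 kWh per household per year the 45-gigawatt gap works out to about 37 million homes; a higher assumed consumption brings it down toward the 33 million cited.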
Several utilities have already delayed shutting down coal plants, despite coal’s significant greenhouse gas emissions. Natural gas, which powers about 40% of data centers globally according to the International Energy Agency, is regaining favor due to its quick deployment capabilities.
In Georgia, one utility has requested approval to install 10 gigawatts of gas-powered generators to meet rising demand. Companies and startups, including Elon Musk’s xAI, are rushing to purchase used turbines from abroad or repurpose aircraft turbines as a quick fix.
Interior Secretary Doug Burgum highlighted the urgency, saying, “The real threat isn’t just climate change; it’s the risk of losing the AI race because we lack enough power.”
Renewables, Nuclear, and Space Initiatives
Despite their public climate commitments, many tech firms are quietly scaling back on their green promises. Google, which pledged to reach net-zero emissions by 2030, removed that goal from its website this June.
They’re now focusing on long-term solutions, such as nuclear power. Amazon is advocating for a resurgence in small modular reactors (SMRs), which are easier and quicker to build than traditional nuclear plants.
Google plans to restart a reactor in Iowa by 2029, and the Trump administration announced an $80 billion investment late last year to construct ten conventional reactors by 2030.
Additionally, large-scale investments are being made in solar energy and battery storage in states like California and Texas. The Texas grid aims to add about 100 gigawatts of capacity by 2030 from these technologies.
Some innovators, including Elon Musk’s Starlink and Google, are exploring solar-powered, space-based computing systems, with plans to test chips in orbit by 2027.