AI for Energy: Why Geothermal, Data Centers, and Power Are Becoming One Story
In September 2024, Constellation Energy's stock surged 22% in a single day. The catalyst wasn't a new energy technology or a regulatory windfall. It was a 20-year power purchase agreement with Microsoft to restart the Three Mile Island nuclear reactor — yes, the one that partially melted down in 1979 — to supply electricity for AI data centers. The deal crystallized something that had been building for months: the AI industry has an energy problem so large that it's reshaping the entire power sector.
This isn't a niche infrastructure story. It's the story that will determine who wins and who loses in AI over the next decade. The companies building the most capable AI models are simultaneously becoming energy companies. Google is investing in geothermal startups. Amazon is buying nuclear-powered data centers. Meta is hiring nuclear energy experts. The race for compute has become a race for power — and the implications ripple from Silicon Valley to rural power grids to international energy markets.
Here's why energy and AI are becoming one story, and what it means for the future of both industries.
The Scale of AI's Energy Appetite
Let's start with numbers, because the numbers are staggering.
Current Data Center Energy Consumption
In 2024, data centers consumed approximately 1.5% of global electricity — roughly 400 terawatt-hours (TWh). To put that in perspective, that's on the order of what the entire country of France uses in a year. The United States, home to the largest concentration of data centers, saw data centers consume about 4.4% of national electricity generation.
These numbers were already significant before the AI boom. Traditional data center workloads — cloud computing, streaming, e-commerce, enterprise applications — had driven steady growth in energy consumption for two decades. But AI has changed the trajectory from linear to exponential.
The AI Multiplier
AI workloads are dramatically more energy-intensive than traditional computing workloads. A standard Google search consumes approximately 0.3 watt-hours of electricity. A ChatGPT query consumes approximately 3-10 watt-hours — roughly 10-30 times more. An AI image generation request consumes even more. Training a frontier model like GPT-4 consumed an estimated 50-100 gigawatt-hours — at the low end, enough electricity to power about 4,500 average U.S. homes for a year.
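The home-equivalence comparison is simple unit arithmetic. Here's a minimal sketch, assuming (as an illustration, not a figure from the article) that an average U.S. home uses about 10.5 MWh of electricity per year:

```python
# Back-of-envelope: convert frontier-model training energy into "home-years".
# Assumption: an average U.S. home uses roughly 10.5 MWh of electricity per
# year, in line with EIA residential averages.
AVG_HOME_MWH_PER_YEAR = 10.5

def training_energy_in_home_years(training_gwh: float) -> float:
    """Equivalent number of average U.S. homes powered for one year."""
    return training_gwh * 1_000 / AVG_HOME_MWH_PER_YEAR  # GWh -> MWh

low = training_energy_in_home_years(50)    # ~4,800 home-years
high = training_energy_in_home_years(100)  # ~9,500 home-years
```

The exact equivalence shifts with the home-usage assumption, which is why published estimates for the same training run vary by a few thousand homes.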
And that's just training. Inference — the process of actually running trained models to serve user requests — consumes far more electricity in aggregate because it happens billions of times per day. As AI usage grows, inference energy consumption is scaling faster than any other category of computing workload.
Projections: Doubling by 2030
The International Energy Agency projects that global data center electricity consumption will more than double by 2030, reaching around 945 TWh — slightly more than the total electricity consumption of Japan. In the United States, Goldman Sachs estimates that data center power demand will grow at a 15% compound annual rate through 2030, requiring the equivalent of adding the entire electrical capacity of a mid-sized U.S. state.
These projections assume that AI adoption continues at current rates. If AI usage accelerates — which most industry observers expect — the actual numbers could be significantly higher.
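Compound growth is what makes these projections steep. A quick sketch of the arithmetic, using the article's ~400 TWh baseline and an illustrative growth rate (not a reproduction of any report's model):

```python
# Sketch of the compound-growth arithmetic behind demand projections.
# The 400 TWh baseline comes from the 2024 figure above; the growth rate
# is an illustrative input, not a quoted forecast parameter.
def project_demand_twh(base_twh: float, annual_growth: float, years: int) -> float:
    """Project electricity demand forward at a constant compound rate."""
    return base_twh * (1 + annual_growth) ** years

# At 15% compound annual growth, demand more than doubles in six years:
projected = project_demand_twh(400, 0.15, 6)  # ~925 TWh
```

A few percentage points of growth rate swing the endpoint by hundreds of TWh, which is why projections diverge so widely.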
Why AI Companies Are Becoming Energy Companies
The traditional model for powering data centers was simple: buy electricity from the grid. Build a data center, connect to the local utility, negotiate a commercial rate, and pay the bill. The utility handles generation, transmission, and reliability. The data center operator focuses on computing.
This model is breaking down for AI infrastructure, and the breakdown is forcing AI companies to become energy companies — whether they want to or not.
Grid Capacity Constraints
The electrical grid was not built for concentrated, baseload power demand at the scale that AI data centers require. A single large AI data center can demand 200-500 megawatts of continuous power — equivalent to the electrical demand of a city of 200,000-500,000 people. In many regions, the grid simply cannot deliver this much power to a single point of consumption.
The problem is compounded by lead times. Building new power generation capacity takes 3-7 years for natural gas plants, 7-15 years for nuclear plants, and 2-4 years for large-scale solar or wind installations. Upgrading transmission infrastructure to deliver power from remote generation to data center locations takes 5-10 years. Grid interconnection queues in the United States are now averaging 5 years — meaning that a data center planned today may not have adequate grid power until 2031.
AI companies can't wait that long. The competitive pressure to deploy more compute is immediate. This is why they're increasingly bypassing the grid entirely — securing dedicated power sources that connect directly to their data centers.
Google's Fervo Geothermal Deal
Google's partnership with Fervo Energy represents one of the most ambitious approaches to the AI energy problem. Fervo is a geothermal energy startup that uses techniques borrowed from the oil and gas industry — specifically horizontal drilling and hydraulic stimulation — to access geothermal heat sources that were previously unreachable.
The deal works like this: Fervo builds geothermal power plants near Google's data center locations. These plants generate 24/7 baseload power — unlike solar (daylight only) and wind (intermittent), geothermal provides constant, reliable electricity regardless of weather or time of day. The electricity is delivered directly to Google's data centers, bypassing the grid entirely.
Google's initial Fervo deployment in Nevada demonstrated the technology's viability, and the companies have expanded their partnership significantly. Google has committed to purchasing geothermal power from Fervo at scale, with planned deployments across multiple sites. The partnership represents Google's recognition that achieving its climate commitments while expanding AI infrastructure requires new categories of clean energy — not just more solar panels and wind turbines.
Microsoft's Three Mile Island Nuclear Deal
Microsoft's decision to purchase power from the restarted Three Mile Island Unit 1 reactor (the unit that didn't melt down — Unit 2 was the one involved in the 1979 accident) was a watershed moment. The 20-year power purchase agreement provides Microsoft with approximately 835 megawatts of carbon-free baseload power — enough to power a significant portion of their AI infrastructure.
The deal is notable for several reasons. It signals that Microsoft is willing to invest in long-duration power commitments — 20 years — to secure AI energy supply. It rehabilitates nuclear power as a viable option for tech companies. And it demonstrates the sheer scale of AI's power appetite: restarting a nuclear reactor is economically viable because the demand for clean baseload power is that large.
Amazon's Nuclear Investments
Amazon has taken a portfolio approach to nuclear energy for its data centers. The company has invested in small modular reactor (SMR) companies, purchased existing nuclear-powered data center campuses, and signed power purchase agreements with nuclear generators. Amazon's approach reflects a bet that nuclear energy will be the backbone of AI power supply — providing the reliable, carbon-free, 24/7 generation that intermittent renewables cannot.
AI-Driven Geothermal Discovery: Using ML to Find Underground Heat
One of the most fascinating intersections of AI and energy is the use of machine learning to discover geothermal resources. Traditional geothermal exploration relied on surface indicators — hot springs, volcanic activity, temperature gradients measured in exploratory wells. This limited geothermal development to regions with obvious geological signatures, primarily volcanic areas like Iceland, New Zealand, and parts of the western United States.
New AI-driven exploration techniques are changing this. Companies like Fervo, Zanskar, and others are using machine learning models trained on geological data — seismic surveys, gravity measurements, magnetic field data, satellite imagery, historical well data — to identify potential geothermal resources in locations that traditional exploration methods would have overlooked.
The approach borrows heavily from the oil and gas industry's use of AI for exploration. The same techniques that identify subsurface oil and gas formations — analyzing seismic data to create 3D models of underground structures — can be adapted to identify hot rock formations suitable for geothermal energy production.
Early results are promising. AI-assisted geothermal exploration has identified potential sites in regions previously not considered geothermal-viable, including parts of the eastern United States, the Midwest, and northern Europe. If these sites prove productive, they could dramatically expand the geographic footprint of geothermal energy — making it available near data center locations that are currently far from traditional geothermal resources.
Why Solar and Wind Alone Won't Power AI
This is a crucial point that gets lost in the clean energy enthusiasm: solar and wind energy, while essential components of the energy transition, cannot alone meet AI's power requirements.
The Baseload Problem
AI data centers need power 24 hours a day, 7 days a week, 365 days a year. They cannot tolerate power interruptions — even brief outages can crash training runs that have been running for weeks, wasting millions of dollars in compute. Solar panels generate electricity for roughly 6-10 hours per day. Wind turbines generate electricity when the wind blows, which varies unpredictably.
Battery storage can bridge some gaps, but current battery technology cannot economically store enough energy to power a 500-megawatt data center through extended periods of low solar/wind generation. The cost and physical footprint of the required battery capacity are prohibitive.
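The prohibitive economics are easy to see with a back-of-envelope calculation. The $300/kWh installed cost below is an illustrative assumption, not a quoted vendor price:

```python
# Back-of-envelope: storage needed to carry a 500 MW data center through a
# two-day lull in solar/wind output. The $300/kWh installed cost for
# grid-scale storage is an illustrative assumption.
def storage_needed_mwh(load_mw: float, hours: float) -> float:
    """Energy required to serve a constant load for the given hours."""
    return load_mw * hours

def storage_cost_usd(energy_mwh: float, usd_per_kwh: float = 300.0) -> float:
    return energy_mwh * 1_000 * usd_per_kwh  # MWh -> kWh

energy = storage_needed_mwh(500, 48)  # 24,000 MWh for a 48-hour gap
cost = storage_cost_usd(energy)       # ~$7.2 billion, batteries alone
```

That is billions of dollars of batteries to cover a single two-day weather event, before land, inverters, or degradation — which is why storage is a bridge, not a backbone.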
The Capacity Factor Reality
Solar installations in the United States have an average capacity factor of about 25% — meaning they generate 25% of their theoretical maximum output over a year. Wind installations average about 35%. Nuclear plants operate at a capacity factor of approximately 93%. This means that to replace one nuclear plant's annual output, you need roughly 2.5-4 times the nameplate capacity in wind or solar installations (about 2.7x for wind, 3.7x for solar) — plus sufficient battery storage to smooth the intermittency.
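The multiples above fall out of a one-line energy balance: nameplate capacity times capacity factor must match on both sides. A sketch:

```python
# Capacity-factor arithmetic: nameplate capacity of an intermittent source
# needed to match a baseload plant's annual energy output.
# Energy balance: baseload_mw * baseload_cf == nameplate_mw * replacement_cf
def nameplate_needed_mw(baseload_mw: float, baseload_cf: float,
                        replacement_cf: float) -> float:
    return baseload_mw * baseload_cf / replacement_cf

solar_mw = nameplate_needed_mw(1_000, 0.93, 0.25)  # ~3,720 MW of solar
wind_mw = nameplate_needed_mw(1_000, 0.93, 0.35)   # ~2,660 MW of wind
```

And this only matches annual energy, not hourly delivery — the storage needed to shape intermittent output into baseload comes on top.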
The Land and Permitting Challenge
A 500-megawatt solar installation requires roughly 3,000-4,000 acres of land — about 5-6 square miles. Siting, permitting, and building this much solar capacity takes years and faces increasing local opposition. Nuclear plants, by contrast, produce the same power from a site of about 1 square mile. Geothermal plants require even less land.
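The acreage figure converts directly from a common rule of thumb of roughly 7 acres per megawatt for utility-scale solar (an assumption here, and it varies by site and panel technology):

```python
# Rough land-footprint conversion for utility-scale solar.
# ~7 acres per MW is a commonly cited rule of thumb, used here as an
# illustrative assumption; 640 acres make one square mile.
ACRES_PER_SQUARE_MILE = 640

def solar_land_sq_miles(capacity_mw: float, acres_per_mw: float = 7.0) -> float:
    return capacity_mw * acres_per_mw / ACRES_PER_SQUARE_MILE

area = solar_land_sq_miles(500)  # ~5.5 square miles for a 500 MW array
```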
For AI companies that need large amounts of reliable power quickly, the land, permitting, and intermittency constraints of solar and wind make them insufficient as standalone solutions. They're valuable as part of a portfolio, but the backbone of AI power must be baseload generation — nuclear, geothermal, or natural gas.
The Natural Gas Renaissance
Here's an inconvenient truth for the clean energy narrative: natural gas is currently the fastest and most practical way to add large-scale power generation for AI data centers. Natural gas power plants can be permitted and built in 2-3 years, can be sited near data center locations, provide reliable baseload power, and have well-understood economics.
AI companies that have made ambitious climate commitments face a tension: use natural gas now to meet immediate power needs, or wait years for nuclear/geothermal projects to come online while competitors deploy compute and capture market share. Most are choosing natural gas as a bridge fuel — committing to transition to cleaner sources over time while using gas to meet immediate demand.
This pragmatic approach has drawn criticism from environmental advocates, but it reflects a real constraint. The alternative to natural gas isn't clean energy today — it's no energy today, which means no AI infrastructure, which means ceding the market to competitors in regions with fewer environmental standards.
Energy Arbitrage: Locating Data Centers Where Power Is Cheapest
The economics of AI energy are creating a new geography of computing. Data center location decisions are increasingly driven by power availability and cost rather than proximity to customers or engineering talent.
The New Data Center Geography
- Northern Virginia remains the largest data center market (home to "Data Center Alley" near Ashburn), but grid capacity is nearly exhausted. Dominion Energy has warned that connecting new data centers to the grid could take until 2030 or later.
- Texas is surging as a data center market, driven by deregulated energy markets, abundant natural gas and wind generation, and favorable regulations. ERCOT (the Texas grid operator) has over 100 gigawatts of data center interconnection requests in its queue.
- The Nordics (Sweden, Finland, Norway) attract data centers with abundant hydroelectric power, cold climates that reduce cooling costs, and stable regulatory environments.
- The Middle East (UAE, Saudi Arabia) is investing heavily in AI data center infrastructure, offering cheap energy from natural gas and solar, plus sovereign wealth fund investment in AI infrastructure.
- Rural America is seeing unexpected data center development, as companies locate facilities near power generation sources (nuclear plants, natural gas facilities) rather than in traditional tech hubs.
The Latency vs. Cost Tradeoff
For AI training, location doesn't matter much — training can happen anywhere with sufficient power and network connectivity. But for AI inference (serving real-time requests to users), latency matters. This creates a two-tier geography: training data centers located wherever power is cheapest, and inference data centers located near population centers where low latency is essential.
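The two-tier logic can be captured in a toy scoring function. Everything here — the numbers, the penalty weight, the sites — is invented purely for illustration:

```python
# Toy illustration of the two-tier siting logic: training sites are chosen
# on power cost alone, while inference sites also pay a latency penalty.
# All inputs and the penalty weight are invented for illustration.
def site_score(power_usd_per_mwh: float, latency_ms: float,
               workload: str = "training",
               penalty_per_ms: float = 2.0) -> float:
    """Lower score is better."""
    score = power_usd_per_mwh
    if workload == "inference":
        score += latency_ms * penalty_per_ms
    return score

remote = dict(power_usd_per_mwh=25, latency_ms=60)  # cheap power, far away
urban = dict(power_usd_per_mwh=70, latency_ms=8)    # pricey power, close
# Training favors the remote site (25 vs 70); inference flips the ranking
# (25 + 60*2 = 145 vs 70 + 8*2 = 86).
```

Real siting decisions weigh far more variables (water, tax incentives, fiber routes), but the inversion between training and inference holds: the same cheap-power site can be the best and the worst choice depending on workload.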
Grid Capacity as the Binding Constraint
The ultimate constraint on AI growth may not be chip supply, model architecture, or investment capital. It may be the electrical grid.
In the United States, the grid is a patchwork of aging infrastructure built over decades for a very different pattern of electricity consumption. It was designed for distributed residential and commercial loads, with power flowing from large central generating stations to dispersed consumers. AI data centers invert this pattern — concentrating enormous electrical loads in single locations, often in areas with limited grid infrastructure.
The grid upgrades required to support AI's growth include new high-voltage transmission lines, upgraded substations, new distribution infrastructure, and expanded generation capacity. These upgrades require regulatory approval, environmental review, right-of-way acquisition, and construction — a process that typically takes 5-10 years.
This timeline mismatch — AI companies need power in 1-2 years, grid upgrades take 5-10 years — is the fundamental tension driving the energy-AI convergence. It explains why AI companies are pursuing direct power generation, nuclear restarts, and other strategies that bypass the grid entirely.
For those following the intersection of energy and AI — whether you're tuning into TBPN's daily infrastructure coverage or researching energy investments — this convergence is reshaping both industries. Stay plugged in with a TBPN t-shirt that signals you understand where AI is really heading.
What This Means for the Next AI Winners
The companies that win the next phase of AI won't just have the best models. They'll have the best energy strategy.
This is already visible in the competitive dynamics among the AI giants. Google's geothermal investments, Microsoft's nuclear deals, and Amazon's nuclear portfolio aren't just ESG initiatives — they're strategic infrastructure that will determine how much AI compute each company can deploy over the next decade. A company with secure, long-term power supply at competitive rates has a fundamental cost advantage over competitors who rely on increasingly expensive and constrained grid power.
For startups, the energy constraint is both a challenge and an opportunity. The challenge: training frontier models requires power that most startups can't secure independently. The opportunity: companies that solve energy problems for AI — whether through next-generation nuclear, geothermal innovation, energy-efficient chip design, or novel cooling technology — are building businesses with enormous addressable markets and durable competitive advantages.
The energy-AI convergence also has implications for national competitiveness. Countries with abundant, reliable, and affordable energy have a structural advantage in AI development. The United States, with its diverse energy resources and established data center infrastructure, is well-positioned — but only if it can solve the grid capacity and permitting bottlenecks that are currently constraining growth. For the latest on this evolving landscape, the TBPN daily show covers energy-AI infrastructure in depth — and a TBPN tumbler makes the perfect companion for those long research sessions.
Frequently Asked Questions
How much electricity does AI actually consume compared to traditional computing?
AI workloads are approximately 10-30 times more energy-intensive than traditional computing tasks. A standard web search consumes about 0.3 watt-hours, while an AI chatbot query consumes 3-10 watt-hours. Training a frontier AI model can consume 50-100 gigawatt-hours — enough to power thousands of homes for a year. In aggregate, AI-related electricity consumption is projected to drive global data center power demand from approximately 400 TWh in 2024 to around 945 TWh by 2030, according to the International Energy Agency.
Why can't solar and wind power meet AI data center energy needs?
The core challenge is intermittency. AI data centers require continuous 24/7 power with extremely high reliability. Solar panels only generate electricity during daylight hours (average capacity factor of 25%), and wind turbines only generate when the wind blows (average capacity factor of 35%). While battery storage can bridge short gaps, current technology cannot economically store enough energy to power large data centers through extended low-generation periods. This is why AI companies are increasingly investing in baseload generation sources like nuclear and geothermal, which provide reliable 24/7 output.
What is Fervo Energy, and why did Google partner with them?
Fervo Energy is a geothermal energy startup that uses horizontal drilling and hydraulic stimulation techniques — borrowed from the oil and gas industry — to access underground heat sources that traditional geothermal technology couldn't reach. Google partnered with Fervo because geothermal energy provides carbon-free, 24/7 baseload power, which aligns with both Google's AI power requirements and its climate commitments. The partnership has expanded from an initial Nevada pilot to planned deployments across multiple sites.
Is nuclear energy safe and practical for powering AI data centers?
Modern nuclear energy has an excellent safety record — no fatalities from radiation at U.S. commercial nuclear plants in their entire operating history. Nuclear plants provide reliable baseload power at approximately 93% capacity factor, generate no carbon emissions during operation, and require minimal land compared to solar or wind. The challenges are primarily economic and regulatory: high upfront costs, lengthy permitting timelines (7-15 years for new plants), and public perception concerns. Small modular reactors (SMRs), which are smaller, factory-built, and potentially faster to deploy, are generating significant interest as a solution for AI data center power.
