Mark Zuckerberg just announced Meta is investing “hundreds of billions of dollars” to build superintelligence… and no, that’s not just on AI talent, who are now being traded like professional athletes with multi-hundred-million-dollar comp packages.
Actually, what we’re talking about is Mark’s response on Threads to a recent SemiAnalysis report that broke the news on his gargantuan new data centers.
That is, Meta's two new gigawatt-scale AI clusters:
- Prometheus (slated for 2026): Located in New Albany, Ohio, 1 gigawatt (GW) capacity with on-site natural gas generation and ultra-high-bandwidth networking.
- Hyperion (slated for 2030): Nicknamed The Beast, this one will likely be in Richland Parish, Louisiana: a $10B investment that will scale from 2 GW in 2030 to an eventual 5 GW.
These data centers are so massive that just one “covers a significant part of the footprint of Manhattan.” Our jaws = on the floor.
As SemiAnalysis reported, Meta ditched traditional datacenter designs and is now building multi-billion-dollar GPU clusters in tents.
Why? Speed. Traditional data centers take years; Meta needs compute yesterday (or realistically back in January, when DeepSeek R1 came out).
Quick side note: as you might have guessed, spending this much on data centers means Meta might be going for closed-source AI. Apparently, the company is having internal debates over whether to use these new behemoths to ditch its old behemoth, the underwhelming Llama 4 model that never really materialized.
If that's what ends up happening, it's a pretty big deal for the open-source vs. closed-source debate. It would mean the leading open-source models would come from Mistral in Europe (which might soon get a purchase offer from Apple) and a handful of competing Chinese labs (including DeepSeek), and that's about it. No real U.S. competition; sure, there will be smaller open models like Gemma 3n, SmolLM, the Allen Institute's models, and OpenAI's open-source reasoning model (if OpenAI ever releases it, which we think they will, but you never know; they delayed it once, they could delay it again)... but no frontier-model competition.
Now how are we going to power this?
Meta is pulling out all the stops on power. In the past year it’s signed deals for 1.8 GW of solar and wind (including 791 MW just last month), partnered with a startup on 150 MW of geothermal, and even agreed to finance a 20-year extension of the Clinton nuclear plant in Illinois to secure 1.1 GW more carbon-free power. Meta’s all-in – nukes, renewables, you name it – because it needs massive juice for its tents full of GPUs.
- Meta’s Renewables Ramp-Up: In June 2025, Meta inked four new contracts with Invenergy for 791 MW of solar and wind power to run its data centers, nearly doubling their clean energy partnership to 1.8 GW. (This followed Meta’s 2024 deals for 760 MW, as it races to meet AI’s surging power demand with green energy.)
- Meta’s Geothermal Bet: The company is partnering with XGS Energy to develop a 150 MW geothermal project in New Mexico, tapping underground heat for round-the-clock energy. This follows a similar 150 MW deal with Sage Geosystems last year, all aimed at diversifying Meta’s power sources to support its Los Lunas data center expansion by 2030.
- And hitting close to home for our editor Corey Noles: Meta recently signed a $509M deal for 349 MW of solar power for a new datacenter in Kansas City, Missouri.
- Don't forget Meta’s Gas Dilemma in Louisiana: To supply “Hyperion,” Meta’s planned 2–5 GW AI hub in Louisiana, utility Entergy proposed building three new gas-fired plants. This has drawn political heat, since fueling Meta’s largest data center with natural gas “flies in the face” of its climate pledges. Meta’s AI ambitions are now clashing with real-world energy tradeoffs in the Gulf South.
- Meta’s Nuclear Lifeline: Meta agreed to help keep the aging Clinton nuclear plant in Illinois operating 20 more years to secure carbon-free power for AI. The deal will boost the reactor’s output by 30 MW (to 1,121 MW), saving 1,100 local jobs and providing electricity for ~800,000 homes – part of Meta’s strategy to bring more reactors online to power its superclusters.
And on that note...
In Fact, Big Tech Has a Big Bet on Atomic Power
Many of the top tech giants are scrambling for reliable, carbon-free power, betting heavily on nuclear:
- Amazon's Susquehanna Play: AWS bought a $650M data center campus near Pennsylvania's Susquehanna nuclear plant, aiming to draw 960 MW—around 40% of the plant’s output. The "behind the meter" connection would let them bypass the grid entirely, though regulators pumped the brakes over concerns about shifting $140M in costs to regular ratepayers. In late 2024, Amazon also led a $500M investment into SMR developer X-energy and struck deals to deploy over 600 MW of small modular reactors by the early 2030s. One pact will install four X-energy reactors (~320 MW) in Washington state, and an MOU with Dominion aims to add 300 MW near North Anna, VA. In total, Amazon and X-energy plan 5 GW of new nuclear by 2039 to power Amazon’s operations.
- Microsoft's Three Mile Island Resurrection: Microsoft signed a 20-year deal to restart Three Mile Island Unit 1, an 835 MW reactor shuttered in 2019. It's slated to come back online in 2028 (or possibly 2027) to feed Microsoft's data centers. Cost? A cool $1.6B, plus they're seeking additional federal loan guarantees.
- Google's SMR Gambit: Google inked a deal for 500 MW from Kairos Power's small modular reactors (SMRs), which use molten salt cooling and pebble-type fuel to promise safer, more scalable nuclear energy. The plan creates a path to deploy a fleet of Kairos’s reactors by 2035, with the first 50 MW unit online by 2030. This bet complements Google's massive wind/solar purchases and its 24/7 carbon-free goal.
There are, of course, a whole host of tradeoffs to doing things this way.
For example, nuclear's drawbacks:
- Takes years to build or restart.
- Faces significant regulatory hurdles.
- Can't handle AI's rapid load fluctuations.
On those first two: For its part, the US government has been largely supportive of these efforts.
The White House signed multiple executive orders back in May aiming to deploy 300 GW of new nuclear capacity by 2050 (roughly quadrupling U.S. capacity), with 10 new large reactors targeted to be under construction by 2030. This comes alongside reforms to speed up Nuclear Regulatory Commission (NRC) licensing.
One order even directs DOE to make HALEU reactor fuel available for private projects powering AI infrastructure at federal sites. (Yes, the government is literally stockpiling uranium specifically for AI).
And at some point today (Tuesday, July 15th), US President Trump will announce $70B in new energy investments in Pennsylvania for, you guessed it, AI.
Now, it hasn't all been full speed ahead: earlier this month, Congress passed the One Big Beautiful Bill Act, which slashes clean-energy incentives from Biden’s 2022 Inflation Reduction Act (IRA), incentives that would have benefited AI providers and broader efforts to expand US energy production.
It ends tax credits after 2027 for new wind, solar, and geothermal projects and cuts deductions for energy-efficient buildings. Tech giants like Google, Meta, and Microsoft had ramped up renewables investments under those IRA incentives, and now that support is evaporating. Analysts warn this U-turn is a “net loss” for US clean energy, undermining jobs and investment. In short, the Biden era’s green light just turned yellow, even as AI’s power hunger grows.
Now, on that last negative point against nuclear: SemiAnalysis released an epic report here about the dangers of “load fluctuations” during AI training at the gigawatt scale, and how it risks grid blackouts.
The Real Threat: AI Might Literally Break the Grid
SemiAnalysis reported that the unprecedented power demands of AI training could cause instant fluctuations of tens of megawatts—something the grid wasn't built to handle. And traditional power plants, especially nuclear reactors, can't respond fast enough. PJM, America’s largest power grid, more or less confirmed the same.
The numbers are pretty scary:
- Texas has 108 GW of "large loads" (mostly giant data centers and crypto mines) requesting connection to the ERCOT grid. For context, the entire US peak load is ~745 GW. Translation: in one state alone, AI wants to add a Germany’s worth of new demand.
- The DOE predicts weeks of power shortages in Maryland, Pennsylvania, and Virginia by 2030 (that's a big deal, because Northern Virginia is dubbed "Data Center Alley").
- As a result, electricity bills could surge by more than 20% this summer in parts of the PJM territory (this could impact 60M+ people during one of the hottest summers on record...).
There's a reason this investment event today is happening in Pennsylvania, too. Regulators are frustrated: Pennsylvania’s governor even threatened to quit PJM if it can’t add generation faster. PJM’s latest capacity auction saw prices spike 800%, and its CEO resigned amid the turmoil.
The bottom line: demand is outpacing supply – PJM added only ~5 GW of capacity in 2024 while retiring more than that, and it projects 32 GW of new load by 2030 (almost ALL from data centers). “We need every megawatt we can get,” PJM’s spokesperson said bluntly.
And as SemiAnalysis reported, there's huge appetite for datacenters in Texas (100 GW+). Many of these projects want to use cheap West Texas renewables, but sudden cloud cover or a mass GPU pause could drop 2+ GW in an instant, destabilizing the grid.
The nightmare scenario?
Analysts warn if 2–2.5 GW of AI load tripped off simultaneously, it could cause a cascading blackout worse than the 2021 Texas deep freeze. The state’s limited grid ties mean it can’t easily import help (Everything’s bigger in Texas – including the potential outages).
Meanwhile in Memphis: The Battery Alternative
Elon Musk's xAI is taking a radically different path from relying solely on nuclear, using Tesla Megapack batteries at its Colossus supercomputer facility:
- A gigantic 150 MW Tesla Megapack battery system helps handle the supercomputer’s wild power swings, with another 168 Megapacks recently delivered for Phase 2.
- This setup, combined with on-site turbines, let xAI get 100,000 GPUs online in just 19 days—something NVIDIA CEO Jensen Huang notes usually takes four years.
- Batteries give Colossus instant response to load spikes, and Musk proved he could build a tented AI datacenter faster than the grid could say “permit.”
As SemiAnalysis wrote, Elon showed you need to use batteries to manage these load fluctuations that could theoretically crash the grid. The only problem is batteries don’t generate power on their own... only store it.
For example, to power his 100K-GPU supercomputer Colossus, Elon reportedly leaned on dozens of mobile gas turbines:
- Before grid hookup, xAI quietly deployed 35 mobile gas turbine generators (burning methane) to power Colossus.
- Locals were told only 15 would run, but thermal imaging caught 33 turbines spewing heat – potentially 1,200–2,000 tons of NOx emissions yearly.
- xAI skirted Clean Air Act permits by calling them “temporary” (<365 days).
- And did we mention those smog-belching generators sparked an uproar among local residents?
- Now a new substation delivers 150 MW from the grid, and xAI is pulling about half of the generators off-site this summer. Hence why you need batteries, not gennies. (Even Elon’s speed has environmental limits.)
Even if you were running the cleanest turbine possible, the wait list for new gas turbines now stretches into 2028 and beyond, making this an impractical choice for Zuck’s need for speed.
This is why it's largely expected that today's $70B AI and energy investment boom in Pennsylvania will involve lots of fossil fuels (we're talking oil, and even coal plants).
In fact, a new IEA report projects that AI data centers will drive a surge in gas and coal-fired power through 2030, because renewables alone won’t keep up.
This is bad news for global temperatures, and a hotter planet in turn strains the grid further, making the risk of an AI-driven power outage even more dangerous.
So for those keeping track at home, this nets out as something like the following:
Battery Pros:
- Charge/discharge hundreds of megawatts in seconds.
- Ideal for AI's erratic load (fast frequency response keeps grid stable).
- Fast deployment (demonstrated by xAI's record-breaking 19-day rollout).
Battery Cons:
- High costs: A 100 MW system costs $38–80M for just 2 hours of storage.
- For a gigawatt datacenter, battery costs approach a billion dollars.
- Batteries store power, but don’t generate it (you still need an underlying power source).
- And something we didn't even touch: Lithium-ion batteries can pose fire risks if not properly managed (data center battery fires, while rare, have happened, and a recent report found that more than 25% of energy storage systems have fire detection and suppression defects).
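As a back-of-envelope check on those cost figures, here's our own illustrative arithmetic, scaling the cited $38–80M per 100 MW linearly (real projects won't scale perfectly linearly, but it shows where the "approaching a billion dollars" number comes from):

```python
# Rough linear scaling of the cited battery cost range ($38-80M per
# 100 MW / 2 hours of storage) to a gigawatt-class datacenter.
# Illustrative arithmetic only, not a real project estimate.

def battery_cost_range(target_mw, ref_mw=100, low=38e6, high=80e6):
    """Scale the cited reference cost range to a target power rating."""
    scale = target_mw / ref_mw
    return low * scale, high * scale

low, high = battery_cost_range(1000)  # a 1 GW datacenter
print(f"${low / 1e6:.0f}M to ${high / 1e6:.0f}M")  # $380M to $800M
```

And that's for only two hours of storage, which is why batteries alone can't carry a gigawatt-scale cluster.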
Nuclear Pros:
- Reliable, continuous baseload power (gigawatts, 24/7).
- Carbon-free once running.
- Energy-dense (a small plant powers millions of homes).
Nuclear Cons:
- Very slow to build (a decade is optimistic for a new reactor).
- Huge upfront costs and regulatory hurdles.
- Inflexible output – reactors can’t rapidly throttle power up/down, so they struggle to follow AI’s swings (as a side note, even gas turbines struggle to chase second-by-second GPU power fluctuations).
- And of course, the public perception of nuclear, which is mixed, and its famous waste disposal issues (though there are creative ideas on how to handle that...).
The (Multi-)Trillion-Dollar Question: Nuclear or Batteries?
Who’s right?
- Nuclear provides reliable, continuous, carbon-free power but struggles with rapid power swings and faces long build times and regulatory complexities.
- Batteries handle rapid fluctuations efficiently and deploy quickly but are costly and only provide short-term storage.
The answer? Probably some cocktail of nuclear power, renewables (the cheapest and most efficient energy option), and whatever gas, oil, and coal plants are available.
To us, the best answer is nuclear for stable baseline energy, with some form of storage (hydro or geothermal if available, batteries for scalability) to handle rapid fluctuations.
Yet even combined, they might fall short. Meta’s engineers already lean on costly measures like the “pytorch_no_powerplant_blowup=1” flag to smooth workloads, at a cost of tens of millions annually in essentially wasted electricity.
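The reported idea behind that flag is to keep the cluster's power draw artificially flat by burning dummy compute during idle gaps between training steps. Here's a minimal sketch of the concept, a simulation with made-up numbers of our own choosing, not Meta's actual implementation:

```python
# Sketch: flattening a GPU cluster's power trace by padding every dip
# below a floor with dummy work, so the grid never sees the full swing.
# All numbers below are illustrative assumptions.

def smooth_power_trace(trace_mw, floor_mw):
    """Return (smoothed trace, energy deliberately wasted in MW-steps)."""
    smoothed, wasted = [], 0.0
    for p in trace_mw:
        if p < floor_mw:
            wasted += floor_mw - p  # electricity burned doing nothing useful
            smoothed.append(floor_mw)
        else:
            smoothed.append(p)
    return smoothed, wasted

# A training loop alternating compute bursts with near-idle sync/checkpoint gaps:
trace = [950, 940, 120, 930, 100, 945]  # MW per time step
smoothed, wasted = smooth_power_trace(trace, 800)
print(smoothed)  # [950, 940, 800, 930, 800, 945]
print(wasted)    # 1380.0
```

The `wasted` tally is exactly the "tens of millions annually in essentially wasted electricity" tradeoff: grid stability bought with deliberately burned power.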
So really, the real answer may be more efficient AI model architecture… (i.e., do more compute with less power – but that’s a topic for another day.)
Our Take
All of this AI infrastructure buildout is without question an existential threat to grid stability. America's century-old power grid was never built for AI’s wild power swings, and large swaths of it haven't been updated since the '60s and '70s.
Therefore, the race for AI supremacy is becoming a race for electrons. And, ironically, building superintelligence to solve humanity’s greatest challenges might accidentally cause rolling blackouts nationwide (and raise global temperatures along the way).
As one datacenter engineer summed it up: "We're literally building the future in tents while praying the lights stay on." Very Oregon Trail of you, Meta.
We also can't help but acknowledge the ultimate irony of naming these supercomputers after the Titans of Greek mythology (Prometheus, Hyperion), who were later overthrown by the younger, upstart Olympian gods. IDK if that's a parable for the tech companies themselves, or the rise of smaller, more efficient open models, but it's got a bit of divine comedy to it.
In all seriousness, it really does seem that the most likely solution to AI data center needs will involve some form of solar power and batteries. Why? Solar is the fastest-growing power source at the moment.
The numbers are pretty wild:
- The world is now installing one gigawatt of solar infrastructure every 15 hours—that's equivalent to building a new coal plant every day and a half.
- And the pace is accelerating: it took 68 years (1954-2022) to build the first terawatt of solar power, but just two years to hit the second terawatt in 2024.
- The third terawatt? Expected within months.
- Meanwhile, renewables captured 96% of global demand for new energy in 2024.
- In the US alone, 93% of new energy capacity came from solar and wind.
- There's overwhelming evidence that solar energy makes electricity cheaper and the grid more stable—exactly what these power-hungry AI data centers need.
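Those pace numbers hang together arithmetically. A quick sanity check (our own calculation from the cited one-gigawatt-every-15-hours rate):

```python
# Sanity check: at one new gigawatt of solar every 15 hours, how fast
# does the world add a terawatt? (Our arithmetic from the cited rate.)
HOURS_PER_YEAR = 8766  # ~365.25 days * 24
gw_per_year = HOURS_PER_YEAR / 15    # new solar capacity per year
years_per_tw = 1000 / gw_per_year    # years to add one terawatt
print(round(gw_per_year))      # 584 GW per year
print(round(years_per_tw, 1))  # 1.7 years per terawatt
```

Roughly 584 GW a year, or a new terawatt every ~1.7 years, which squares with the second terawatt taking two years and the third being expected within months as the rate keeps climbing.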
The efficiency gains are bonkers too. Thanks to manufacturing breakthroughs and streamlined installation, the silver used in one 2010 solar panel would be enough for around five panels today. That's a 5x improvement in material efficiency in just over a decade.
It's now estimated that 80% of the US' energy needs could be met by solar and batteries together. Keep in mind, meeting 100% of demand with solar and batteries alone is projected to be impractical, so some of that demand will still need to come from gas turbines, nuclear, and whatever else we have running at the time. But with solar and battery equipment getting ever cheaper, getting to something like 90–95% is definitely plausible.
Here's where it gets geopolitically spicy:
China installed more solar infrastructure in 2023 than the next nine countries combined. They have a 10-year plan to triple their solar capacity by 2030, but at their current pace, they could hit that target as early as 2026. As of early 2025, China's solar capacity was clocked at about 44% of the world's total, at 1.4 terawatts. China installed 93 GW of solar in May alone – about 30 nuclear power plants' worth of power. Imagine trying to install 30 nuclear power plants in one month! Actually, China IS trying to install 30 nuclear power plants, just over the next two decades (we'll get to that in a sec).
These pictures show the scale of some of China's large solar deployments, and they're pretty amazing. China is basically speed-running energy independence through solar dominance. And they're not sleeping on nuclear, either: they just approved 10 new reactors back in April, currently have 30 in development (about half of all nuclear plants under construction worldwide), and should double their total nuclear capacity by 2040. They're even experimenting with older reactor concepts, and new ideas too (like their artificial sun project).
So, what do you think they're going to do with all that power?
Chinese tech giants (Baidu, Alibaba, Tencent, etc.) will benefit from a surplus of new power plants for their autonomous cars, robots, and AI systems, of course, while in the West, AI growth threatens to outpace energy infrastructure. As one analyst noted, "China is building the electrons their AI needs. We're debating where ours will come from."
P.S.: this is a great article on how to build an AI data center, and this one covers the power dynamics more in depth, in case you're curious!