Silicon Valley has a plan to save humanity: Just flip on the nuclear reactors

  • CNN
  • October 1, 2024

New York CNN — AI hasn't quite delivered the job-killing, cancer-curing utopia that the technology's evangelists are peddling. So far, artificial intelligence has proven more capable of generating stock market enthusiasm than, like, tangibly great things for humanity. Unless you count Shrimp Jesus.

But that's all going to change, the AI bulls tell us. Because the only thing standing in the way of an AI-powered idyll is heaps upon heaps of computing power to train and operate these nascent AI models. And don't worry, fellow members of the public who never asked for any of this, that power won't come from fossil fuels. I mean, imagine the PR headaches.

No, the tech that's going to save humanity will be powered by the tech that very nearly destroyed it.

Here's the deal: Doing AI at the scale that the Microsofts and Googles of the world envision requires a lot of computing power. When you ask ChatGPT a question, that query and its answer are sucking up electricity in a supercomputer filled with Nvidia chips in some remote, heavily air-conditioned data center.

Electricity consumption from data centers, AI and crypto mining (its own environmental headache) could double by 2026, according to the International Energy Agency.

In the US alone, power demand is expected to grow 13% to 15% a year until 2030, potentially turning electricity into a much scarcer resource, according to JPMorgan analysts.

The tech industry's solution, for now, is nuclear energy, which is more stable than wind or solar and is virtually carbon-emission-free.

Microsoft this month secured a deal to reopen a reactor on Three Mile Island, the site of the 1979 partial meltdown near Harrisburg, Pennsylvania, to give the company enough power to sustain its AI growth. (Not that reactor, of course, but another one that didn't fail and continued to operate on the island for years after the incident.)

Amazon is working on putting a data center campus right on the site of a Talen Energy nuclear power plant in Northeast Pennsylvania.

Sam Altman, the CEO of OpenAI, is also heavily invested in nuclear energy and serves as the chairman of Oklo, a nuclear startup that last week received approval to begin site investigations for a "microreactor" site in Idaho.

On Monday, the Financial Times reported that the venture capital firm co-founded by Peter Thiel, Founders Fund, is backing a nuclear startup that's trying to create a new production method for a more powerful nuclear fuel used in advanced reactors.

The irony of all this, of course, is that even AI's cheerleaders have invoked the history of nuclear proliferation to try to convey the need for guardrails around artificial intelligence (just as long as the regulations don't slow them down or curtail their profit-making in any way).

And while AI doomer predictions often get brushed off as alarmist forecasts, you can't as readily dismiss the folks who are concerned about nuclear energy. History is, tragically, on their side.

To be sure, nuclear power today is better understood than it was in 1979, when Three Mile Island's Unit 2 reactor experienced a partial core meltdown, Anna Erickson, a professor of nuclear science at Georgia Tech, told me.

"Nothing in life is ever foolproof," she said, "but we are much better now at understanding the operation of nuclear reactors," thanks in part to the wave of safety regulations that the Three Mile Island incident set off.

Bottom line: There's no AI future without a serious uptick in our power supply, which makes the expansion of nuclear power practically unavoidable. But it will take years for many of the recently announced projects to come online, and that means Big Tech data centers will have to stay on the fossil fuel drip as demand continues to spike.

Are we all cool with wrecking the planet if all we get are apps that can summarize our emails? Or search engines that are slightly more human-sounding but less reliable? Is the future really just variations of crustacean-based deities in a churn of AI slop?

There's a lot at stake, including our jobs and the environment and our entire sense of purpose in the world, according to AI's own developers. And yet it remains unclear what we the people stand to get out of the deal.