Artificial Intelligence

AI’s electricity needs are more than America’s grid can handle


Hello, Quartz members!

It’s an AI world we’re entering, but it’s also one that may not have the juice to power it.

Take the case of Nvidia’s newest AI chip, an accelerator GPU known as Blackwell, which is a four-inch-square assemblage of silicon and wiring etched with 200 billion transistors. When hooked into an array with thousands of identical processors, it can handle the world’s largest artificial intelligence tasks. Nvidia says Blackwell uses one twenty-fifth the power of its predecessors to do the same amount of data processing, making it possible to build new AI applications swiftly and efficiently. But every Blackwell chip also consumes 1,200 watts of electricity. That’s just about enough to power the average U.S. home.
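
Here’s a quick back-of-the-envelope check on that comparison, assuming the average U.S. household uses roughly 10,500 kWh a year (a typical EIA-style ballpark; the figure is our assumption, not Nvidia’s):

```python
# Sanity check: is 1,200 W really about one average U.S. home?
# Assumes ~10,500 kWh/year per household (EIA-style ballpark, our assumption).
ANNUAL_HOME_KWH = 10_500
HOURS_PER_YEAR = 8_760

avg_home_draw_watts = ANNUAL_HOME_KWH * 1_000 / HOURS_PER_YEAR
print(f"Average home draw: {avg_home_draw_watts:.0f} W")  # ~1,199 W

BLACKWELL_WATTS = 1_200
print(f"One Blackwell GPU: {BLACKWELL_WATTS} W")  # essentially the same
```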

With artificial intelligence applications doing everything from creating new drugs to driving your car and organizing your personal calendar, the demand for AI is ballooning. New AI chips promise faster and better results, and data centers to house all the new AI hardware are being planned and built around the world. But AI is facing a very real power crunch. An AI chip can use 10 times as much electricity to respond to a query as an algorithmic Google search (2.9 watt-hours versus 0.3), and that poses an existential threat to the rapid adoption of AI. In fact, data centers, where most AI calculations take place, are expected to account for 9.1% of U.S. electricity consumption by 2030, up from about 4% today.
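
Those per-query figures work out to roughly a tenfold gap. Here’s the arithmetic as a quick sketch; the daily query volume used for scale is a purely illustrative round number, not a figure from this article:

```python
# Energy per query, from the figures above (in watt-hours).
AI_QUERY_WH = 2.9
SEARCH_QUERY_WH = 0.3

print(f"Ratio: {AI_QUERY_WH / SEARCH_QUERY_WH:.1f}x")  # ~9.7x, call it 10x

# Illustrative scale-up: extra energy per one billion queries a day
# (a hypothetical round number, chosen only to show the order of magnitude).
QUERIES_PER_DAY = 1_000_000_000
extra_wh_per_day = (AI_QUERY_WH - SEARCH_QUERY_WH) * QUERIES_PER_DAY
print(f"Extra energy: {extra_wh_per_day / 1e9:.1f} GWh per day")  # ~2.6 GWh
```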

That may look like a great opportunity for power companies and utilities, but electricity is a complex business. And between regulatory, reliability, and financial issues, it does not move quickly. The truth is that the U.S. electrical grid (like those of other countries) is simply not ready for the AI revolution.

“There’s a freight train coming with AI,” said Jeff Jakubiak, an energy regulatory attorney with the law firm Vinson & Elkins. “There has never been such potential for imbalance between supply and demand in the electric grid as we are seeing coming down the track with AI.”

Industry experts say there are about 1,000 large data centers around the world, half of them in the U.S., drawing 500 megawatts to 1 gigawatt of electricity each. Amazon, Google and other cloud computing platforms are planning to bring another 500 hyperscale data centers online over the next five years or so, largely to handle the explosion in demand for AI applications.

Today’s data centers use 460 terawatt-hours (TWh) of electricity a year, as much as the entire country of Germany. By 2030, U.S. data centers are projected to use as much electricity as 40 million homes.
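
To put those two numbers on the same scale, here’s a rough conversion, again assuming about 10,500 kWh per U.S. home per year (our ballpark, not the article’s source):

```python
# Convert "40 million homes" into terawatt-hours for comparison.
HOMES = 40_000_000
ANNUAL_HOME_KWH = 10_500  # assumed average U.S. household usage

us_dc_twh_2030 = HOMES * ANNUAL_HOME_KWH / 1e9  # kWh -> TWh
print(f"40 million homes ~ {us_dc_twh_2030:.0f} TWh/year")  # ~420 TWh

GLOBAL_DC_TWH_TODAY = 460  # today's worldwide data-center usage
print(f"vs. {GLOBAL_DC_TWH_TODAY} TWh used by all data centers today")
```

By that yardstick, U.S. data centers alone would be approaching today’s worldwide data-center total by 2030.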

Rene Haas, the CEO of chipmaker Arm, wrote in a recent blog post that as AI models become larger and smarter, they’ll use more power — but without the power, we won’t get the AI revolution. “In other words,” as Haas put it, “no electricity, no AI.”


High on your own supply

Generating your own power is one possible way out; it’s called “behind the meter” generation in electric-industry jargon. But data centers that produce their own power can end up being a little too much of an island. If they’re not connected to the grid, there’s a significant risk that when their power generator goes down, they’ll be stuck.

While the largest cloud computing firms aren’t yet building their own power plants (there’s no word yet that everyone’s favorite overnight shopping site is creating Amazon Nuclear Power Inc.), what they are doing is helping new power sources come online through Power Purchase Agreements, or PPAs: multi-year, sometimes multi-decade, commitments to buy energy from new sources. Amazon recently announced an agreement with the U.S. utility AES Corp. for a long-term PPA covering a massive 150 MW solar field in the Mojave Desert.
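
For a sense of what a 150 MW solar field can actually deliver over a year, here’s a rough sketch. The capacity factor is our assumption (desert solar typically runs in the 25% to 30% range), not a figure from Amazon or AES:

```python
# Rough annual output of a 150 MW solar farm.
NAMEPLATE_MW = 150
HOURS_PER_YEAR = 8_760
CAPACITY_FACTOR = 0.27  # assumed; typical for desert-southwest solar

annual_gwh = NAMEPLATE_MW * HOURS_PER_YEAR * CAPACITY_FACTOR / 1_000
print(f"~{annual_gwh:.0f} GWh per year")  # ~355 GWh/year
```

Measured against the 500-megawatt-and-up data centers described above, which would draw several thousand GWh a year running around the clock, a single 150 MW solar field covers only a slice of the load.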

By creating supplies like this that can be plugged into the national power grid, Amazon is trying to insulate its server farms from being shut down in a power shortage. The solar farm serves another purpose: permits for new power plants can be hard to come by, especially for plants that run on fossil fuels, but solar farms are far easier to permit and far quicker to bring online. In fact, Amazon already sources more than 90% of its electricity from renewable sources and aims to go fully renewable by 2030.


The connection conundrum

Creating enough energy to supply data centers is a multi-faceted problem, says Marcus McCarthy, senior vice president of Siemens Grid Software, a unit of the Germany-based engineering giant that develops software for running electricity grids. Designing and building an efficient data center takes about two years, but just getting the permits can take much longer, as can building new generating capacity. Add to that the demand for energy that doesn’t boost greenhouse gas emissions, and the challenge is formidable.

“Connecting these to the grid will be a challenge to the power industry; generating enough energy is a challenge, and installing it in a sustainable manner is a challenge,” McCarthy said.

Perhaps the biggest challenge is the connection itself. Electrical grids run on a knife’s edge: supply and demand must balance in real time. If generators produce more electricity than the grid can “take off,” or use, equipment can overheat and, effectively, short-circuit. If demand outstrips what the grid can provide, it can likewise fail, or operators have to enforce rolling blackouts.

That means utilities have to undertake very careful studies of the effects of adding a load as big as an AI data center. Those studies have created a multi-year backlog of connection requests across the U.S., especially in places like Virginia, where almost a fourth of the electricity supply already goes to data centers.

Grid operators recall the catastrophic years of 2000 and 2001 in California, when a combination of deregulation, bad weather, and a mismanaged grid left the state with rolling blackouts.

“That’s potentially what you’re looking at here,” Jakubiak said, unless grid operators manage data center connections successfully. “If the grid operators are forward-looking about this, they will refuse to allow demand to hook up to the grid unless there is sufficient supply.”

By 2030 or so, Jakubiak predicts, the power shortage will cause “a slowdown in what people are predicting for AI growth.”


Thanks for reading! And don’t hesitate to reach out with comments, questions, or topics you want to know more about.

Have an intelligent weekend!

— Peter Green, Weekend Brief writer


