
Power demands of AI computing could put power grid under major strain

Tech experts say AI could add major stress to already strained power grid

Computing has taken a quantum leap forward with the arrival of artificial intelligence, but with the advances come challenges. There are those who believe the rapid rise of AI is going to pose a major energy problem for the future.

When most people surf the internet, they don't give much thought to what's actually happening. But Dr. Jonathan Koomey does. The founder of Koomey Analytics has been researching electricity use in computing since the 1990s.

"There are servers -- in other words, computers -- sitting in a big data center somewhere. And you're sending a message to them and you're saying, do this task for me. And when you send them that message and it performs the task, it uses electricity," he said.

Now, with the emergence of AI, those huge data centers are getting bigger, and so are their power bills. 

A Goldman Sachs analysis predicts a 160% increase in data center electricity demand, saying, "That kind of spike in power demand hasn't been seen in the U.S. since the early years of this century. It will be stoked partly by electrification and industrial reshoring, but also by AI. Data centers will use 8% of US power by 2030, compared with 3% in 2022."
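A quick back-of-the-envelope check shows those Goldman Sachs figures hang together. This sketch assumes (my assumption, not the report's) that total U.S. electricity use stays roughly flat between 2022 and 2030:

```python
# Data center share of US power, per the Goldman Sachs analysis cited above
share_2022 = 0.03  # 3% of US power in 2022
share_2030 = 0.08  # projected 8% by 2030

# If total US electricity use is roughly flat, demand growth is just the
# ratio of the shares. This lands near the ~160% increase Goldman cites.
growth = share_2030 / share_2022 - 1
print(f"Implied growth in data center demand: {growth:.0%}")
```

The implied figure comes out around 167%, close to the 160% Goldman projects, which suggests the report expects overall U.S. electricity use to grow only modestly over that period.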

"When you have giant computers whose job it is to create the AI models, and then you have millions and millions of users who are sending those queries to get questions answered, that also uses a lot of electricity," said Dr. Koomey.

The problem is that answering a ChatGPT query takes about ten times as much energy as running the same search on Google. That gap has fueled plenty of dire predictions about what people searching for cat videos with AI will do to the nation's power supply.
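To put that ratio in rough perspective, here is an illustrative sketch. Only the ~10x multiplier comes from the article; the per-query baseline and the query volume are assumed numbers chosen for illustration:

```python
# Illustrative only. The 10x ratio is from the article; the baseline
# per-query figure and query volume below are assumptions, not reporting.
search_wh_per_query = 0.3        # assumed energy for a standard web search (Wh)
ai_multiplier = 10               # ratio cited in the article
ai_wh_per_query = search_wh_per_query * ai_multiplier

queries_per_day = 1_000_000_000  # hypothetical daily query volume

# Extra energy if every one of those queries ran through an AI model
extra_kwh_per_day = (ai_wh_per_query - search_wh_per_query) * queries_per_day / 1000
print(f"Extra energy: {extra_kwh_per_day:,.0f} kWh per day")
```

Under these assumed numbers, the switch adds a few million kilowatt-hours per day; the point is the multiplier, not the absolute figures.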

But Dr. Koomey isn't worried. The field may be wide open right now, but he said those paying for the expensive servers will likely end up focusing AI on applications that make them money.

"Ultimately, they have to pay for themselves," Dr. Koomey said. "I can tell you that ChatGPT is great for entertaining 15-year-olds -- our kids love it -- but I don't know if that's a sustainable business model, right? They're probably not high-margin customers."

Dr. Koomey said new technologies are always a little rough around the edges, but it usually doesn't take long for them to become more efficient. 

"That, to me, is the most important thing for people to understand: electricity use could go up, but it could also level off, as it has in the past, when we get smarter about how we deliver those computing services," he said.

And getting smarter is supposed to be what it's all about.

It turns out AI is already figuring out ways to solve some of its own problems. Google used machine learning to reduce cooling costs at one of its data centers by 30-40%. 
