Altman and Nadella need more power for AI, but they’re not sure how much

How much power is enough for AI? Nobody knows, not even OpenAI CEO Sam Altman or Microsoft CEO Satya Nadella. That uncertainty has put software-first businesses like OpenAI and Microsoft in a difficult position. Much of the tech world has treated computing power as the major barrier to AI deployment, and while tech companies have been racing to secure electricity, those efforts have lagged behind their GPU purchases. Microsoft, it appears, has ordered more chips than its contracted power supply can support.

“The cycles of demand and supply in this particular case you can’t really predict,” Nadella said on a podcast. “The biggest issue we are now having is not a compute glut, but it’s a power issue and the ability to get the data center builds done fast enough close to power.” He added, “If you can’t do that, you may actually have a bunch of chips sitting in inventory that I can’t plug in. In fact, that is my problem today. It’s not a supply issue of chips, it’s the fact that I don’t have warm shells to plug into,” referring to buildings ready for tenants.

This situation shows what happens when companies accustomed to silicon and code, which scale and deploy quickly, have to operate in the energy world, which moves more slowly. For more than a decade, electricity demand in the U.S. was flat. But over the last five years, demand from data centers has surged, outpacing utilities’ plans for new generating capacity. That has led data center developers to add power through behind-the-meter arrangements, in which electricity is fed directly to the data center, bypassing the grid.

Sam Altman, who was also on the podcast, thinks trouble could be brewing. He said that if a very cheap form of energy comes online soon at mass scale, a lot of people are going to be extremely burned on the contracts they have already signed. He also noted how hard it is to plan an infrastructure buildout when the cost of intelligence is falling so quickly.

Altman has personally invested in nuclear energy, including fission and fusion startups, along with a solar startup that concentrates the sun’s heat and stores it for later use. However, none of these are ready for widespread deployment today. Fossil-based technologies like natural gas power plants also take years to build, and orders placed today for new gas turbines likely will not be fulfilled until later this decade.

This is partially why tech companies have been adding solar power at a rapid clip. They are drawn to the technology’s low cost, emissions-free power, and speed of deployment. There might be subconscious factors at play, too. Photovoltaic solar is in many ways a parallel technology to semiconductors: both are built on silicon substrates, and both are modular components that can be packaged together into arrays.

Because of solar’s modularity and speed of deployment, its construction pace is much closer to that of a data center. But both still take time to build, and demand can change much more quickly than either a data center or solar project can be completed. Altman admitted that if AI gets more efficient or if demand does not grow as expected, some companies might be saddled with idled power plants.

But from his other comments, he does not seem to think that outcome is likely. Instead, he appears to be a firm believer in the Jevons paradox, which holds that more efficient use of a resource tends to increase its overall consumption. Altman explained that if the price of compute fell by a factor of one hundred, usage would rise by far more than one hundredfold, because people would start doing things that make no economic sense at today’s prices.
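That claim is, at bottom, a statement about price elasticity of demand: if a 100x price cut drives usage up by more than 100x, total spending on compute actually grows. Here is a minimal sketch of that arithmetic, assuming a constant-elasticity demand curve and an illustrative elasticity value (neither assumption comes from the podcast):

```python
# Sketch of the Jevons-style arithmetic behind Altman's claim, assuming a
# constant-elasticity demand curve Q = A * P**(-elasticity). The elasticity
# value is illustrative; any value above 1 makes usage grow faster than the
# price falls, so total spend on compute rises.

def usage_multiplier(price_drop_factor: float, elasticity: float) -> float:
    """How much usage grows when the price falls by `price_drop_factor`."""
    return price_drop_factor ** elasticity

price_drop = 100.0   # "price of compute fell by a factor of one hundred"
elasticity = 1.3     # assumed for illustration

usage_x = usage_multiplier(price_drop, elasticity)
spend_x = usage_x / price_drop  # new total spend relative to old spend

print(f"usage grows ~{usage_x:,.0f}x")       # ~398x, far more than 100x
print(f"total spend grows ~{spend_x:.1f}x")  # ~4.0x
```

Under these assumed numbers, a hundredfold price drop yields roughly a 400x jump in usage, so total compute spending still quadruples, which is the scenario in which today’s power contracts would not be stranded.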