People often think about tech bubbles in apocalyptic terms, but it doesn’t have to be that serious. In economic terms, a bubble is simply a bet that turned out to be too big, leaving you with more supply than demand. The upshot is that it is not all or nothing; even good bets can turn sour if you are not careful about how you make them.
What makes the question of an AI bubble so tricky to answer is the mismatched timelines between the breakneck pace of AI software development and the slow crawl of constructing and powering a data center. Because these data centers take years to build, a lot will inevitably change between now and when they come online. The supply chain that powers AI services is so complex and fluid that it is hard to have any clarity on how much supply we will need a few years from now. It is not simply a matter of how much people will be using AI in 2028, but how they will be using it, and whether we will have any breakthroughs in energy, semiconductor design, or power transmission in the meantime.
When a bet is this big, there are lots of ways it can go wrong, and AI bets are getting very big indeed. Last week, Reuters reported that an Oracle-linked data center campus in New Mexico has drawn as much as 18 billion dollars in credit from a consortium of 20 banks. Oracle has already contracted 300 billion dollars in cloud services to OpenAI, and the companies have joined with SoftBank to build 500 billion dollars in total AI infrastructure as part of the Stargate project. Meta, not to be outdone, has pledged to spend 600 billion dollars on infrastructure over the next three years. The sheer volume of these major commitments has made it hard to keep up.
At the same time, there is real uncertainty about how fast demand for AI services will grow. A McKinsey survey released last week looked at how top firms are employing AI tools. The results were mixed. Almost all the businesses contacted are using AI in some way, yet few are using it at any real scale. AI has allowed companies to cut costs in specific use cases, but it is not making a dent in the overall business. In short, most companies are still in wait-and-see mode. If you are counting on those companies to buy space in your data center, you may be waiting a long time.
But even if AI demand is endless, these projects could run into more straightforward infrastructure problems. Last week, Satya Nadella surprised podcast listeners by saying he was more concerned about running out of data center space than running out of chips. As he put it, it is not a supply issue of chips; it is the fact that he does not have "warm shells" to plug them into. Meanwhile, whole data centers are sitting idle because they cannot handle the power demands of the latest generation of chips.
While Nvidia and OpenAI have been moving forward as fast as they possibly can, the electrical grid and built environment are still moving at the same pace they always have. That leaves lots of opportunity for expensive bottlenecks, even if everything else goes right.