The global rush to build infrastructure for artificial intelligence (AI) is reaching unprecedented levels, but a critical question looms: is this a sustainable boom or a bubble waiting to burst? The staggering scale of investment, now potentially exceeding earlier forecasts, is drawing uneasy comparisons to the telecoms frenzy of the late 1990s.
The Staggering Scale and the Demand Dilemma
In a remarkably bullish projection this spring, consultancy giant McKinsey forecast worldwide capital spending of $5.2 trillion over the next five years on chips, data centres, and energy for AI. By late 2025, the firm was already considering revising that estimate even higher. This fever pitch is driven by massive announcements from key players like OpenAI, Nvidia, and Oracle, all aiming to build the computing power they believe generative AI requires.
However, a significant gap is emerging. The revenue-generating demand for generative AI has not kept pace with the explosive growth in supply. While consumer use of chatbots is increasing, McKinsey partner Pankaj Sachdeva cites a sobering statistic: fewer than 15% of AI pilot projects at surveyed firms succeed. He predicts a prolonged period of "lumpiness" between supply and demand, one that could last for years. The ultimate strength of that demand is the single most important factor in determining whether this infrastructure surge ends in a bust.
Novel Risks in the AI Building Frenzy
Beyond the demand question, three novel aspects of the current data-centre boom are amplifying uncertainty. First is geography. Unlike traditional clusters near demand hubs such as northern Virginia, new AI data centres are springing up in remote areas with abundant renewable energy and space, including parts of Texas, North Dakota, and New Mexico. Projects like the $500 billion "Stargate" initiative announced by President Donald Trump exemplify this trend.
While this solves power shortages for training large language models (LLMs), it introduces property risks. Gautam Bhandari of I Squared Capital warns that returns may not adequately reflect the danger that these isolated assets become obsolete quickly: relentless technological advances, such as Nvidia's ever-more-efficient chips, could force costly and frequent upgrades.
The second novelty is finance. The gigawatt-scale appetite of AI data centres, with costs hitting $50 billion per gigawatt, has outstripped the capacity of traditional financiers like Real Estate Investment Trusts (REITs). Their place is being taken by private-credit firms, sovereign-wealth funds, and banks. This shift moves risk into the debt markets, potentially exposing the banking system more directly if defaults rise.
Third is credit quality. The pool of borrowers has expanded beyond cash-rich cloud giants like Amazon and Microsoft—"the best tenants in the world," according to David Guarino of Green Street—to include AI labs like OpenAI and "neocloud" firms that rent out GPUs. This expansion increases the number of players but decreases the average creditworthiness, raising default risks and making conservative utilities hesitant to sign long-term power contracts.
Echoes of History and a Tantalising Future
For sceptics like Andrew Odlyzko of the University of Minnesota, the parallels with past infrastructure manias, particularly the dotcom-era telecoms bubble, are growing. He notes that proposed deals, such as Nvidia's potential $100 billion investment tied to GPU sales to OpenAI, remind him of the vendor-financing arrangements that fuelled the last bust. The involvement of many more firms now makes him "much more alarmed" about the potential economic impact of a downturn.
Not everyone agrees with the dire comparison. Some analysts point out a key difference: today's data centres are typically built only after counterparties sign contracts, unlike the speculative laying of fibre-optic cable in the 1990s. The potential rewards remain so tantalising that capital continues to flood in. Cheerleaders for the build-out, including OpenAI's Sam Altman, argue that the risk of underbuilding and missing the long-term economic potential of AI is as serious as the risk of overbuilding.
The industry is attempting to mitigate risks through new insurance products, securitisations, and complex webs of vendor financing from tech giants like Nvidia. However, these very interlinkages could increase the vulnerability of the entire AI ecosystem if a downturn occurs. Ultimately, the most bewitching uncertainty remains: when will the demand for generative AI applications finally catch up to the monumental ambitions—and investments—of its suppliers?