The Compute Capital Supercycle: AI’s Silent Infrastructure Revolution

Key Takeaways

  • In 2024, private AI investment surged past $150 billion with a decisive pivot toward infrastructure, signaling a shift from speculative hype to the deep, capital-intensive groundwork needed to scale AI systems.
  • The U.S. has cemented its AI leadership not by building the smartest models alone but by outspending global rivals in critical infrastructure—outpacing China 11-to-1 in private investment and building a formidable economic moat.
  • As AI evolves into a physical, power-hungry technology, success is increasingly defined by who can build and control the complex stack of compute, energy and data—not just who can innovate in code.

In the history of technological progress, there's often a critical misreading. We think the leap is in the product—the engine, the chip, the app. But the deeper truth, time and again, is that the real story is hidden underneath. Progress is made visible by products, but it's powered by infrastructure.

Railroads needed steel. The internet needed fiber. Today's AI revolution? It runs on electricity, silicon and dollars—more of each than most realize.

We've spent the last few years marveling at what these models can do. GPT-4 passed the bar exam. Claude writes code. Gemini solves PhD-level chemistry problems. But in the noise of breakthrough after breakthrough, we've overlooked something bigger: the economic architecture required to make it all possible.