Every bubble has its scaffolding. For dot-coms, it was cheap venture money. For housing, it was credit. For AI, it isn't algorithms; it's hardware.
The models we marvel at — ChatGPT, Claude, Gemini — aren’t magic. They’re simply brute force. Billions of parameters multiplied across acres of GPUs, guzzling electricity at industrial scale. Strip away the marketing and what you have is a hardware arms race.
Right now, Nvidia sits at the center. Its chips are the oil of this boom. Companies aren't "investing in AI" so much as they're renting compute. The bottleneck isn't talent or ideas; it's GPUs. Whoever can hoard the most silicon sets the pace.
The problem is obvious: this isn’t sustainable. Our economy is being propped up by demand for specialized hardware that can’t scale indefinitely. Power grids will choke. Supply chains will lag. And investors will eventually realize that the valuations pinned to “AI” are really just bets on how many boxes of chips a company can get its hands on.
History doesn’t repeat, but it rhymes. The dot-com era wasn’t killed by a lack of imagination, but by a lack of fundamentals. Eyeballs weren’t enough. The AI era risks the same fate: “tokens processed” might turn out to be just as empty a metric.
That doesn't mean AI goes away. Just as the internet didn't vanish after 2001, intelligence at scale won't either. But it does mean a reckoning is coming. When the hardware curve flattens, when electricity bills catch up, when investors ask where the profits are, we'll see which AI companies stand on their own and which were just scaffolding.
For now, the economy floats on silicon. But you can’t build forever on scaffolding.