Few things are more certain than Jim Keller, ex-AMD and ex-Tesla engineer and current CEO of AI computer company Tenstorrent, making bold claims in public appearances. At least one thing is more certain, however: Nvidia is firmly cemented as the king of AI hardware. Combine the two certitudes and you have Keller, in a recent DemystifySci podcast, stating that “Nvidia is slowly becoming the IBM of the AI era.”
According to Keller, for AI, Nvidia currently has “the best processors by functionality and obvious proof points,” which has meant that “all the big tech companies are in an arms race and they’re all calling Nvidia to get allocation” for their new AI processors. This, at least, is certainly true.
2024’s exploding AI market almost feels like the stirrings of a brave new world for tech. We don’t need futurist Ray Kurzweil’s persistent and wacky techno-optimism to see that. We also don’t need Keller’s industry experience to see Nvidia’s gigantism. We know it just from the numbers. For instance, Nvidia raked in over $26 billion in Q1 2024, with over $22 billion of that coming from AI datacentre demand. And Nvidia CEO Jensen Huang figures that AI constitutes the “next industrial revolution,” one that has Nvidia at its centre.
Nvidia’s foresight in pivoting to AI early on, combined with infrastructure already in place, put it in the prime spot for AI market dominance. No other chip maker can meet demand like Nvidia can. And with big shots like Microsoft, OpenAI, and Meta all wanting a slice of the burgeoning but not-quite-there-yet AI pie, there’s good reason to suppose that Keller’s right and Nvidia could become to AI what IBM was to computers… a few decades ago.
After achieving dominance in the business mainframe market, IBM went on to launch its personal computer in the 1980s, and for a while it was the only real game in town for PCs. If you said “PC,” you were talking about IBM. That’s what Keller seems to have in mind when he talks about Nvidia and AI.
Putting aside the obvious concerns over monopoly, to follow Keller’s analogy through we shouldn’t forget what happened to IBM in the 1990s and early 2000s. Its PC dominance was short-lived, in part thanks to its own decisions. Microsoft made the operating system for IBM PCs, IBM allowed Microsoft to sell MS-DOS to other manufacturers, and that was that: people started to associate “PC” with Microsoft. IBM began divesting its PC production in the 1990s, and in 2005 it sold its PC division to Lenovo.
Now, I’m not saying the same fate is in store for Nvidia, but it’s important to remember that monopolistic markets rarely stay that way, and Keller’s own analogy should remind us of it.
This is even more important to remember in such an incredibly new market as the AI one. We don’t know how any of it will turn out, but there are already intimations that the AI datacentre market is built on slippery foundations.
Sequoia analyst David Cahn (via Tom’s Hardware) reckons that, to pay for the AI infrastructure they’ve erected, AI companies need to earn about $600 billion per year, a figure even optimistic projections say is out of reach. This could hint at (or perhaps scream about) the growth of a financial bubble that nobody wants to see pop.
But—and not to sound like a broken record, here—the AI market is new. As in, completely fresh-out-of-the-oven-and-scalding-hot new. And if it does mark the next industrial revolution, we can’t rule out changes and innovations that give AI companies all the revenue they need.
In which case, if Nvidia had the foresight to get in on the booming AI chip market before anyone else, maybe it also has the foresight to grow a scary bubble that unforeseen revenue-generating innovation will eventually fill.
Or perhaps Keller’s IBM comments will prove even more prescient than he realised. I suppose we’ll see.