Nvidia’s largest customers (e.g. Google, Amazon, Microsoft, Meta, and OpenAI) are developing AI chips that compete with Nvidia, posing a potential long-term threat to the AI leader. Jensen’s response? “We can help you do that.”
Warning: This blog contains speculation that has not been verified… but it’s fun to think about!
When I worked at AMD, I was always impressed by how the company managed to keep two teams completely isolated from each other to protect customer privacy. One team was designing the next chip for the Microsoft Xbox, while the other was designing a chip for the Sony PlayStation. Each customer had its own intellectual property and gaming console requirements that needed to be protected from the other team. This was a model of success for AMD, which still owns this market.
But all this secrecy is difficult and costly, and the business is hard to scale. What if the chip supplier let the customer do more of the design work, providing its intellectual property for inclusion in the customer’s chips? And of course, the customer could leverage the supplier’s relationships with TSMC or Samsung for manufacturing to reduce costs and improve time to market.
So it should come as no surprise that Nvidia has announced it has formed a group tasked with forging this new business model, helping customers build their own solutions using Nvidia IP or perhaps even chiplets. Nvidia stock is up 3% on the news.
Maybe Nvidia didn’t need to buy Arm after all. With this move, it begins to build an AI licensing giant.
I’m sure we’ll hear more about this at GTC next month, but here’s our take.
What is “custom silicon” and how would Nvidia exploit this opportunity?
Many companies that design their own chips, whether to reduce costs or to tailor a solution to their computing needs, already partner with companies like Broadcom and Marvell for back-end physical design, SerDes blocks, or IP such as Marvell’s high-performance Arm processor cores. And EDA vendors like Cadence and Synopsys do good business providing blocks of IP that SoC designers can drop into their chips, saving money and speeding time to market. But none of this is new. SiMa.ai, for example, uses an image processor from Synopsys in its edge AI chip.
Startup Tenstorrent, led by Jim Keller, saw this opportunity and pivoted the Toronto- and Austin-based company from potential Nvidia competitor to an intellectual property and design shop, supplying chiplets and IP to companies like Kia and LG.
In the world of AI, we are seeing a new trend: designers of TVs, cars, and network equipment want to create a tailor-made solution to reduce costs or differentiate with AI, but they have neither the need nor the expertise to build the entire chip. Google, Amazon AWS, Meta (which is expected to deploy its own chip later this year), and Microsoft Azure already have their own custom chips for in-house AI, alongside the Nvidia GPUs they offer cloud clients. Could they partner with Nvidia for future designs?
Here’s an idea…
Could these Nvidia custom chip customers leverage Nvidia’s in-house supercomputers and AWS to accelerate and optimize these design efforts? That would be a great source of additional revenue as well as an incredible differentiator. If so, this could be why Nvidia is hosting its latest in-house supercomputer, Project Ceiba, in AWS data centers, where the infrastructure for secure cloud services is already available. Nvidia could offer chip design optimization services on Ceiba.
While this speculation may be a bridge too far, it would indicate that Nvidia sees the writing on the wall and is already preparing to transform the industry once again.
Conclusions
Okay, maybe I went too far in my speculation. But if any company can pull it off, it would be Nvidia. All technologies become more commonplace over time, especially previous generations of silicon. When Nvidia was courting Arm, I often said that the acquisition would give Nvidia the ability to monetize what they don’t want to produce, through licensing deals.
Looks like that’s exactly what Nvidia is doing right now.
Disclosures: This article expresses the opinions of the author and
should not be considered advice to purchase or invest in any companies mentioned. Cambrian AI Research is fortunate to have many, if not most, semiconductor companies as clients, including Blaize, BrainChip, Cadence Design Systems, Cerebras, D-Matrix, Eliyan, Esperanto, FuriosaAI, Graphcore, GML, IBM, Intel, Mythic, NVIDIA, Qualcomm Technologies, SiFive, SiMa.ai, Synopsys, Ventana Micro Systems, Tenstorrent and many investment clients. We have no investment position in any of the companies mentioned in this article and do not plan to initiate any in the near future. For more information, please visit our website at https://cambrian-AI.com.