Nvidia stock has soared on the tech giant's seemingly impregnable moat in artificial intelligence. But the spotlight on potential disrupters keeps growing, particularly on startups eyeing emerging AI challenges where they think Nvidia may be vulnerable.
The upstarts focus on critical questions. Are massive investments in AI data centers powered by Nvidia chips worth it? Does it make sense for the tech industry to rely on just one chipmaker for the hottest trend in tech?
AI chip startup Cerebras, which just filed to go public, vowed to "accelerate AI by making it faster, easier to use" and "accessible around the world." Rodrigo Liang, CEO of another Nvidia challenger, SambaNova, also cited the need for greater access to AI, noting growing worries about "skyrocketing" power costs and diminishing returns.
"You've got to drive the cost down to drive performance, enable developers to innovate and to do it at scale," he told Investor's Business Daily. "If you don't do that, it's just a game that only the large players can play."
"We are the alternative to Nvidia," Liang declared.
Nvidia Stock: Alternatives On The Rise
Aspiring to be an Nvidia alternative has been the goal for many startups since the chip behemoth began dominating AI with the rise of deep learning a decade ago.
That happened unexpectedly.
Deep learning, which builds artificial neural networks that mimic the way human memory works, requires heavy-duty processing power. Nvidia's graphics processors, used mainly for video games and Hollywood blockbusters, turned out to be the chips that could do that job very well.
"Deep learning fell in their lap," Naveen Rao, who cofounded AI chip startup Nervana in 2014 and is now a DataBricks executive, said in a 2019 interview.
Focus On Training
Nvidia's chips proved particularly ideal for a critical step in deep learning, the training of AI models by using heavy-duty computing to process massive amounts of data.
Training an AI model is "very data intensive, very processing intensive," IDC analyst Shane Rau told IBD. "You throw a million pictures of cats at (the model), and say, 'Recognize a cat.'"
In the next deep learning phase, called inference, the model applies the knowledge it developed during training. "You turn that model around, and then you're asking the model to identify cats, you're serving up one picture of a cat at a very fast clip," Rau said.
The industry refers to inference as "AI in production," or "AI in action."
"Training is like going to college. Inference is like going to work," Tony Kim, managing director at BlackRock, a SambaNova investor, told IBD.
Inference is expected to become an increasingly important part of AI workloads. And startups see a huge opportunity.
Nvidia Stock: Targeting Inference
The pivot to inference would require more customized chips geared to specific tasks. These processors are expected to consume much less power and to be less expensive.
"As models start to settle down … more folks are going to minimize investment in training and do inferencing," Rau said.
That push for lower cost AI computing could trigger more competition.
"Not everybody's going to spend $30,000 for a GPU," Rau said. "If you can run an AI model (on chips) that might be more specialized and cheaper, that's a good goal."
'You Need A Different Chip'
That's the opening SambaNova is aiming for, Liang said. The company's website touts SambaNova as offering "The World's Fastest AI Inference."
The Palo Alto, Calif.-based company has raised more than $1.1 billion in venture funding from investors that include BlackRock, GV, Intel Capital and SoftBank Vision Fund. Among its top customers are Saudi Aramco, Lawrence Livermore National Laboratory and Argonne National Laboratory. The startup, which has 600 employees, was valued at more than $5 billion in 2021.
Liang echoed the criticism that while Nvidia's powerful and power-hungry chips have been perfect for training, a different kind of processor is needed for inference.
"You cannot continue to brute force your way through," he said, arguing for the need "to find smarter ways, more efficient ways of actually going to scale."
"You need a different chip," Liang said. "That's a huge market for us."
Nvidia Stock: Overdependence On One Vendor
Kim of BlackRock sees the same opportunities, saying "inference AI running in production is going to proliferate like crazy. This is where these startups have found life. This inference demand is starting to explode."
BlackRock has a policy of not commenting on specific companies; Kim offered his personal views on players in the AI market.
Companies looking to embrace AI face a key dilemma, he said. The "investment required is so massive" and this "created even more dependency on one vendor."
Nvidia's technology is, no doubt, impressive, Kim said. But its dominance prompted "a whole bunch of other companies that say, 'Hey, we need to build alternatives. Can we rely 100% on one company?'"
"No company would want to be 100% relying on one company," he said.
Nvidia Stock: Debate On Inference Market
The potential size and significance of the AI inference market remains unclear.
Bernstein analysts, in an October client note, speculated that it would be "very large," but that it all depends on the return on investment in future AI models. "We have no answers here — while we are generally believers, it is hard to have conviction either way," the Bernstein analysts wrote.
Meanwhile, Nvidia isn't exactly standing still. Chief Financial Officer Colette Kress told analysts in August that "over the trailing four quarters, we estimate that inference drove more than 40%" of Nvidia's data center revenue.
An Nvidia spokesperson declined further comment.
Taking On The Leader
Nvidia will likely remain the dominant AI powerhouse. But the AI market is still evolving.
"As most models grow, they grow our markets, they grow, grow, grow and grow, and eventually they segment," Rau said.
Kim of BlackRock said it would be "very, very hard for anyone to crack into the training" space now dominated by Nvidia. Plus, other chip giants, led by Advanced Micro Devices, and big cloud companies like Microsoft and Google have also joined the fray. "The odds are still stacked against you, even with this market that's so dramatically big."
Still, the AI market will be huge, he said, poised to grow "100x or 1000x."
"A few startups, I think, can make it," Kim added.
'Hardware Is Hard'
That wasn't true for Nervana, which emerged as a potential Nvidia challenger and was acquired by Intel in 2016. The chip giant shut down Nervana's efforts a few years later, after acquiring another AI chip startup, Habana.
But the risks of exploring an Nvidia alternative are worth it, Kim of BlackRock argued. "There's room for more than just one company called Nvidia."
Andrea Schulz, managing partner at the financial advisory firm Grant Thornton, said AI chip startups must grapple with another serious hurdle: It's typically harder for new companies to enter hardware tech markets.
"Hardware is hard," she told IBD. "You see that phrase tossed around a lot, and that's very true."
She said AI chip startups' long-term prospects remain uncertain.
It's unclear "whether or not they will disrupt the existing players and whether they have the longevity to actually become established players in the ecosystem, or if they're actually going to get acquired along the way," she said. "I think that's the more likely path.
The SambaNova Dance
Meanwhile, Liang remained upbeat about the company he cofounded in 2017, which he expects to play a key role in what's shaping up to be a long battle.
"You don't win overnight," he said. "You just got to stay in it for a long time. And as we've seen, not everybody can stay in it for a long time."
In a way, the battle over AI is like a protracted dance, where players must go with the flow. And that's what SambaNova is all about, Liang said.
Liang, who grew up in Brazil, recalled how in deciding what to call the startup, he "wanted to find a name that was Brazilian in nature."
His team settled on "SambaNova" which Liang described as "very, very, perfectly matched" for the neural networks they're building.
"Samba is a dance," he said. "It flows. The data flows. We ended up liking that name."
Liang also draws lessons from the dot-com era when his tech career took off and when another chip behemoth seemed invincible.
During that time "no one thought they could disrupt Intel," he said. "Intel was 90-plus of everything that everybody was running. Well, it's not the reality anymore."
"The best tech wins," he said. "I think it's as simple as that: the best tech wins. I don't think it's about brand. You have to produce the best technology at scale. People want power efficiency, cost efficiency, performance. If you do those things, the best tech wins. And that's really what we're optimizing for."