Nvidia is a “technology company,” not a “consumer” or “enterprise” company, or so CEO Jensen Huang insists. What does he mean, exactly? Doesn’t Nvidia want consumers to spend hundreds or thousands of dollars on the new, expensive RTX 50-series GPUs? Doesn’t it want more companies to buy its AI training chips? Nvidia is the kind of company with a lot of fingers in a lot of pies. To hear Huang tell it, if the crust of those pies is the company’s chips, then AI is the filling.
“Our technology influence is going to impact the future of consumer platforms,” Huang—clad in his typical black jacket and the warm bosom of AI hype—said in a Q&A with reporters a day after his blowout opening CES keynote. But how does a company like Nvidia fund all those epic AI experiments? The H100 AI training chips turned Nvidia into a tech powerhouse over the past two years, with a few stumbles along the way. But Amazon and other companies are trying to build alternatives that cut into Nvidia’s near-monopoly. What happens if competition cuts the spending spree short?
“We’re going to respond to customers wherever they are,” Huang said. Part of that is helping companies build “agentic AI,” AKA multiple AI models working together to complete complex tasks. That includes several AI toolkits meant to throw businesses a bone. The H100 made Nvidia big and RTX keeps gamers coming back, but the company wants its new $3,000 “Project Digits” AI processing hub to open up “a whole new universe” for those who can use it. Who will use it? Nvidia said it’s a tool for researchers, scientists, and maybe students—or at least those who stumble across $3,000 at the bottom of the $1.50 cup of instant ramen they’re eating for dinner for the fifth night in a row.
Nvidia made sure you knew about the RTX 5090’s 3,352 TOPS of AI performance. Then Huang’s company dropped details on several software initiatives—both gaming and non-gaming related. None of those declarations were more confusing than the company’s “world foundation” AI models. The models are meant to train on real-world environments, which could help autonomous vehicles or robots learn to navigate their surroundings. It’s a lot of future tech, and Huang admitted he could have done a better job articulating it to a crowd that had mostly come to see cool new GPUs.
“[The world foundation model] understands things like friction, inertia, grabbing, object presence, and elements, geometric and spatial understanding,” he said. “You know, the things that children know. They understand the physical world in a way that language models didn’t know.”
Huang opened CES 2025 on Jan. 6 with a keynote that packed the Michelob Ultra Arena in Las Vegas’ Mandalay Bay casino. A huge portion of the crowd was certainly gamers who’d come to see the latest RTX 50-series cards in the flesh, but more were there to see how a company as lucrative as Nvidia moves forward. RTX and Project Digits drew hollers and shouts from the crowd. But when Huang spent half his time talking about his world foundation models, the audience didn’t seem nearly as enthused.
It points to how awkward AI messaging can be, especially for a company that owes much of its popularity to the attentive population of PC gamers. There has been so much talk about AI that it’s easy to forget Nvidia was in this game years before ChatGPT came on the scene. Nvidia’s in-game AI upscaling tech, DLSS, has been around for close to six years, improving all the time, and it’s now one of the best AI upscalers in games, though limited by its exclusivity to Nvidia’s cards. It was good before the advent of generative AI. Now, Nvidia promises Transformer models will further enhance upscaling and ray reconstruction.
To top it off, the touted multi-frame generation could grant up to four times the performance on 50-series GPUs, at least in games that support it. That’s a boon for those who can afford a new RTX 50-series card; the RTX 5090 tops the lineup at $2,000, yet the gamers who would benefit most from frame gen are the ones who can only afford a lower-end GPU. Huang declined to offer any hints about an RTX 5050 or 5060, joking, “We announced four cards, and you want more?”
The world foundation model is still a prototype, like much of the new AI software Nvidia put on display. The real questions are when it will be ready for prime time and who will end up using it. Nvidia showed off oddball AI NPCs, in-game chatbots, AI nurses, and an audio generator last year. This year, it wants to bloom with its world foundation models, plus a host of AI “microservices,” including a weird animated talking head that’s supposed to serve as your PC’s always-on assistant. Perhaps some of these will stick. In the cases where Nvidia hopes AI will stand in for nurses or audio engineers, we hope it doesn’t.
Huang considers Nvidia “a small company” with 32,000 employees worldwide. That’s less than half the staff Meta has, but there’s nothing small about its position in AI training chips, which gives it outsized influence over the tech industry. The more people use AI, the more people will need to buy its AI-specific GPUs, plus any of its other AI software. And if everybody buys their own at-home AI processing chip, they don’t have to rely on outside data centers and external chatbots. Nvidia, like every other tech company, just needs to find a use for AI beyond replacing all our jobs.