Nvidia CEO Jensen Huang predicted that one day we'll have a billion cars on the road, and they're all going to be robotic vehicles.
It sounds like science fiction, but as Huang has said before, "I am science fiction." He made the comments in a conference call with analysts about Nvidia's FYQ4 earnings for the quarter ending January 26, 2025. (Here's our full report on the earnings.) Nvidia's stock is currently down half a percent to $130.72 a share in after-hours trading.
Colette Kress, EVP and CFO, said on the conference call that the data center business was up 93% from a year ago and 16% sequentially as the Blackwell ramp commenced and Hopper chip sales also grew. Blackwell sales exceeded Nvidia's expectations, she said.
"This is the fastest product ramp in our company's history, unprecedented in its speed and scale," said Kress. "Blackwell production is in full gear across multiple configurations, and we're increasing supply and expanding customer adoption. Our Q4 data center compute revenue jumped 18% sequentially and over 2x year on year. Customers are racing to scale infrastructure to train the next generation of cutting edge models and unlock the next level of AI capabilities."
With Blackwell, it will be common for these clusters to start with 100,000 graphics processing units (GPUs) or more, Kress said. Shipments have already started for several infrastructures of this size. Post-training and model customization are fueling demand for Nvidia infrastructure and software as developers and enterprises use techniques such as fine-tuning, reinforcement learning and distillation to tailor models. Hugging Face alone hosts over 90,000 derivatives created from the Llama foundation model.
The scale of post-training and model customization is massive and can collectively demand orders of magnitude more compute than pre-training, Kress said. And inference demand is accelerating, driven by test-time scaling and new reasoning models like OpenAI o3, DeepSeek and more. Kress said she expected China sales to be up sequentially, and Huang said China is expected to be the same percentage as in Q4. It's about half of what it was before export controls were introduced by the Biden administration.
Nvidia has driven a 200-times reduction in inference costs in just the last two years, Kress said. She also said that as AI expands beyond the digital world, Nvidia infrastructure and software platforms are increasingly being adopted to power robotics and physical AI development. On top of that, Nvidia's automotive vertical revenue is expected to grow as well.

Regarding CES, she noted the Nvidia Cosmos World Foundation Model platform was unveiled there, and robotics and automotive companies, including Uber, were among the first to adopt it.
From a geographic perspective, growth in data center revenue was strongest in the U.S., driven by the initial Blackwell ramp-up. Countries across the globe are building out their AI ecosystems, and demand for compute infrastructure is surging. France's 200 billion euro AI investment and the EU's 200 billion euro investment initiatives offer a glimpse into the build-out set to redefine global AI infrastructure in the coming years.
Kress said that as a percentage of total data center revenue, data center sales in China remained well below levels seen before the onset of export controls. Absent any change in regulations, Nvidia believes that China shipments for data center solutions will remain at roughly the same level.
"We will continue to comply with export controls while serving our customers," Kress said.
Gaming and AI PCs

Kress noted that gaming revenue of $2.5 billion decreased 22% sequentially and 11% year on year.
For the full year, revenue of $11.4 billion increased 9% year on year, and demand remained strong throughout the holidays. But Kress said Q4 shipments were impacted by supply constraints.
"We expect strong sequential growth in Q1 as supply increases. The new GeForce RTX 50 series desktop and laptop GPUs are here, built for gamers, creators and developers," Kress said.
The RTX 50 Series graphics cards use the Blackwell architecture, fifth-generation Tensor cores and fourth-generation RT cores. The DLSS 4 software boosts frame rates up to eight times the previous generation by generating additional frames from each rendered frame.
Automotive revenue was a record $570 million, up 27% sequentially and up 103% year on year. For the full year, revenue of $1.7 billion increased 55% year on year. Strong growth was driven by the continued ramp in autonomous vehicles, including cars and robotics.

At CES, Nvidia announced that Toyota, the world's largest automaker, will build its next-generation vehicles on Nvidia Orin running the safety-certified Nvidia Drive OS. Kress said Nvidia saw higher engineering development costs in the quarter as more chips moved into production.
Nvidia expects FYQ1 revenue to be $43 billion, with sequential growth in data center revenue for both compute and networking.
Nvidia's next big event is the annual GTC conference starting March 17 in San Jose, California, where Huang will deliver a keynote on March 18.
Asked about the blurring line between training and inference, Huang said there are "multiple scaling laws" now, including the pre-training scaling law, post-training scaling using reinforcement learning, and test-time compute or reasoning scaling. These techniques are just at the beginning and will change over time.
"We run every model. We are great at training. The vast majority of our compute today is actually inference. And Blackwell, with the idea of reasoning models in mind, and when you look at training, is many times more performant," he said. "But what's really amazing is for long thinking, test-time scaling reasoning, AI models were tens of times faster, 25 times higher throughput."
He noted he's more enthusiastic today than he was at CES, and he said 1.5 million components will go into each of the Blackwell-based racks. He said the work wasn't easy, but all of the Blackwell partners were doing good work. During the Blackwell ramp, gross margins will be in the low-70s percentage range.

"At this point, we are focusing on expediting our manufacturing to make sure that we can provide" Blackwell chips to customers as soon as possible, Kress said. There is an opportunity to improve gross margins over time to the mid-70s later this year.
Huang noted that the vast majority of software is going to be based on machine learning and accelerated computing. He said the number of AI startups is still quite vibrant, and that agentic AI for the enterprise is on the rise. He noted physical AI for robotics and sovereign AI for different regions are on the rise as well.
Blackwell Ultra is expected in the second half of the year as "the next train," Huang said. He noted the first Blackwell had a "hiccup" that cost a couple of months, and it is now fully recovered.
He pointed to the core advantage that Nvidia has over rivals, saying that the software stack is "incredibly hard" and the company builds its stack from end to end, including the architecture and the ecosystem that sits on top of the architecture.
Asked about geographic differences, Huang answered, "The takeaway is that AI is software. It's modern software. It's incredible modern software, and AI has gone mainstream. AI is used in delivery services everywhere, shopping services everywhere. And so I think it is fairly safe to say that AI has gone mainstream, that it's being integrated into every application. This is now a software tool that can address a much larger part of the world's GDP than any time in history. We're just in the beginnings."