At CES 2026, Nvidia unveiled Alpamayo, a new family of open-source artificial intelligence models, simulation tools, and datasets for training physical robots and vehicles, with a focus on helping autonomous vehicles reason through complex driving situations.
“ChatGPT’s moment for physical AI is here – when machines begin to understand, reason and act in the real world,” said Nvidia CEO Jensen Huang. “Alpamayo brings logic to autonomous vehicles, allowing them to think through rare scenarios, drive safely in complex environments and explain their driving decisions.”
At the core of Nvidia’s new family is Alpamayo 1, a 10-billion-parameter vision-language-action (VLA) model that enables an AV to reason more like a human, so it can work through complex situations, such as how to navigate a stoplight at a busy intersection, without prior experience.
“It does this by breaking down problems into steps, thinking through every possibility, and then choosing the safest path,” Ali Kani, Nvidia’s vice president of automotive, said Monday during a press conference.
Or as Huang put it during his keynote address on Monday: “Not only does [Alpamayo] receive the sensor input and actuate the steering, brakes and acceleration, it also explains what action it is going to take. It tells you what action it is going to take, the reasons why that action occurred. And then, of course, the track.”
The underlying code of Alpamayo 1 is available on Hugging Face. Developers can tune Alpamayo into smaller, faster versions for vehicle development, use it to train simpler driving systems, or build tools on top of it, such as auto-labeling systems that automatically tag video data or evaluators that check whether a car has made an intelligent decision.
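Nvidia hasn’t detailed the loading or fine-tuning API here, but the “tune it into smaller, faster versions” workflow generally amounts to knowledge distillation. The sketch below is a minimal, hypothetical illustration: the Hugging Face repo id, model classes, and feature format are assumptions, with toy networks standing in for Alpamayo 1 and its smaller student.

```python
# Hypothetical sketch of distilling a large driving model into a smaller one.
# The repo id, model classes, and data format are placeholders, not Nvidia's
# published API.
import torch
import torch.nn.functional as F
# from huggingface_hub import snapshot_download
# checkpoint_dir = snapshot_download("nvidia/alpamayo-1")  # placeholder repo id

# Stand-in teacher/student policies mapping per-frame features to
# driving actions (steer, brake, throttle).
teacher = torch.nn.Sequential(torch.nn.Linear(512, 1024), torch.nn.ReLU(),
                              torch.nn.Linear(1024, 3))
student = torch.nn.Sequential(torch.nn.Linear(512, 128), torch.nn.ReLU(),
                              torch.nn.Linear(128, 3))
teacher.eval()
optimizer = torch.optim.AdamW(student.parameters(), lr=1e-4)

def distill_step(features: torch.Tensor) -> float:
    """One distillation step: the student learns to mimic the teacher's actions."""
    with torch.no_grad():
        target = teacher(features)
    loss = F.mse_loss(student(features), target)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

print(distill_step(torch.randn(32, 512)))  # toy batch of sensor embeddings
```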
“They can also use Cosmos to generate synthetic data and then train and test the Alpamayo-based AV application on the combination of real and synthetic data,” Kani said. Cosmos is Nvidia’s brand of generative world models, artificial intelligence systems that create a representation of a physical environment so they can make predictions and take action.
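The article doesn’t specify the training stack, but the real-plus-synthetic mix Kani describes is straightforward to express with standard tooling. A minimal sketch, assuming a PyTorch pipeline with toy tensors standing in for logged drives and Cosmos-generated rollouts:

```python
# Toy illustration of training on a mix of real and synthetic driving data.
import torch
from torch.utils.data import TensorDataset, ConcatDataset, DataLoader

real = TensorDataset(torch.randn(1_000, 512), torch.randn(1_000, 3))       # logged drives
synthetic = TensorDataset(torch.randn(4_000, 512), torch.randn(4_000, 3))  # world-model rollouts

loader = DataLoader(ConcatDataset([real, synthetic]), batch_size=64, shuffle=True)
for features, actions in loader:
    pass  # feed each mixed batch to the Alpamayo-based AV application under training or test
```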
As part of the Alpamayo release, Nvidia is also releasing an open dataset with more than 1,700 hours of driving data collected across a range of geographies and conditions, covering rare and complex real-world scenarios. The company is additionally launching AlpaSim, an open-source simulation framework for validating autonomous driving systems. Available on GitHub, AlpaSim is designed to recreate real-world driving conditions, from sensors to motion, so developers can safely test systems at scale.
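AlpaSim’s actual interfaces aren’t described in the article; as a rough, purely illustrative sketch of the kind of closed-loop validation such a framework enables, the simulator and policy below are hypothetical stand-ins with toy dynamics, not AlpaSim’s API.

```python
# Illustrative closed-loop validation skeleton. The simulator, policy, and
# pass/fail check are toy stand-ins, not AlpaSim's real classes.
from dataclasses import dataclass

@dataclass
class Observation:
    speed: float          # ego speed, m/s
    lead_distance: float  # gap to the vehicle ahead, m

class ToySimulator:
    """Minimal stand-in for a sensors-to-motion simulator."""
    def __init__(self) -> None:
        self.obs = Observation(speed=15.0, lead_distance=40.0)

    def step(self, throttle: float, brake: float) -> Observation:
        self.obs.speed = max(0.0, self.obs.speed + throttle - brake)
        self.obs.lead_distance -= 0.1 * self.obs.speed  # lead car is slower than ego
        return self.obs

def policy(obs: Observation) -> tuple[float, float]:
    """Placeholder driving policy: brake hard once the gap closes."""
    return (0.0, 3.0) if obs.lead_distance < 25.0 else (0.5, 0.0)

sim = ToySimulator()
obs = sim.obs
for _ in range(100):
    obs = sim.step(*policy(obs))
    assert obs.lead_distance > 0.0, "scenario failed: collision"
print("scenario passed")
```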
