Nvidia launches Alpamayo, open AI models that allow autonomous vehicles to ‘think like a human’

At CES 2026, Nvidia launched Alpamayo, a new family of open-source AI models, simulation tools, and datasets designed for training physical robots and vehicles. This initiative aims to help usher in a new era where autonomous vehicles can reason through complex driving situations.

Nvidia CEO Jensen Huang declared this development the ChatGPT moment for physical AI, a time when machines begin to understand, reason, and act in the real world. He stated that Alpamayo brings reasoning to autonomous vehicles, allowing them to think through rare scenarios, drive safely in complex environments, and even explain their driving decisions.

The core of this new family is Alpamayo 1, a 10-billion-parameter model. It is a chain-of-thought, reasoning-based vision-language-action (VLA) model that enables an autonomous vehicle to think more like a human. This allows it to solve complex edge cases, such as navigating a traffic-light outage at a busy intersection, without prior experience of that specific situation.
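
For readers who work with these models, the sketch below illustrates the idea in code: a vision-language-action model returns an explanation alongside its planned action. Everything here, from the class names to the canned reasoning trace, is an illustrative stand-in rather than Alpamayo’s actual interface.

```python
from dataclasses import dataclass, field

@dataclass
class DrivingObservation:
    """Hypothetical sensor snapshot; real inputs would be camera tensors."""
    scene_description: str
    ego_speed_mps: float

@dataclass
class DrivingDecision:
    reasoning: str                                  # chain-of-thought trace
    waypoints: list = field(default_factory=list)   # planned (x, y) points

class StubReasoningVLA:
    """Stand-in for a chain-of-thought VLA model: it emits an explanation
    alongside the action, which is the property the article highlights."""
    def __call__(self, obs: DrivingObservation) -> DrivingDecision:
        reasoning = (
            f"Observed: {obs.scene_description}. Signals are dark, so treat "
            "the intersection as an all-way stop; yield by arrival order, "
            "then proceed at low speed."
        )
        return DrivingDecision(reasoning=reasoning,
                               waypoints=[(0.0, 0.0), (2.0, 0.1), (6.0, 0.3)])

model = StubReasoningVLA()
decision = model(DrivingObservation("traffic light outage at a busy "
                                    "four-way intersection", ego_speed_mps=4.2))
print(decision.reasoning)
```

The point of the pattern is the paired output: the trajectory the vehicle will follow and a human-readable account of why, which is what lets the system explain its driving decisions.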

Ali Kani, Nvidia’s vice president of automotive, explained that the model works by breaking a problem down into steps, reasoning through every possibility, and then selecting the safest path. Huang further elaborated that Alpamayo not only processes sensor input to control the vehicle but also reasons about and explains the actions it is about to take.
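
Kani’s description maps onto a familiar plan-and-select loop. The minimal sketch below reconstructs that idea under broad assumptions; the candidate maneuvers and risk scores are invented for illustration and say nothing about Nvidia’s implementation.

```python
# Illustrative reconstruction of "reason through every possibility, then
# select the safest path"; the maneuvers and risk scores are made up.
candidate_maneuvers = {
    "proceed at current speed": 0.9,   # risk in [0, 1]; lower is safer
    "slow and creep forward":   0.3,
    "stop and yield":           0.1,
}

def select_safest(candidates: dict[str, float]) -> str:
    # Step 1: break the situation into discrete options (the dict keys).
    # Step 2: reason about each option, here reduced to a scalar risk.
    # Step 3: choose the option that minimizes risk.
    return min(candidates, key=candidates.get)

print(select_safest(candidate_maneuvers))  # -> "stop and yield"
```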

The underlying code for Alpamayo 1 is publicly available. Developers can fine-tune it into smaller versions for vehicle development, use it to train simpler driving systems, or build tools on top of it. These tools could include auto-labeling systems for video data or evaluators to check a vehicle’s decisions.
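
Fine-tuning a large model “into smaller versions” is commonly done with knowledge distillation, where a compact student network learns to match a large teacher’s outputs. The sketch below shows one response-based distillation step in PyTorch; the two linear layers are placeholders for a real teacher and student, and nothing here comes from Alpamayo’s codebase.

```python
import torch
import torch.nn.functional as F

# Placeholder networks; in practice the teacher would be the large
# Alpamayo checkpoint and the student a compact in-vehicle model.
teacher = torch.nn.Linear(128, 16).eval()   # frozen large model (stand-in)
student = torch.nn.Linear(128, 16)          # small model being trained
optimizer = torch.optim.AdamW(student.parameters(), lr=1e-4)

def distill_step(features: torch.Tensor, temperature: float = 2.0) -> float:
    """One response-based distillation step: the student matches the
    teacher's softened output distribution."""
    with torch.no_grad():
        teacher_logits = teacher(features)
    student_logits = student(features)
    loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

print(distill_step(torch.randn(8, 128)))  # one step on a dummy batch
```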

Developers can also use Nvidia’s Cosmos, a family of generative world models, to create synthetic data. They can then train and test their Alpamayo-based applications on a combination of real and synthetic datasets.
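
One common way to combine the two sources is to draw each training batch from both pools at a fixed ratio, so that rare scenarios synthesized in a world model appear as often as needed. The sketch below assumes hypothetical file lists for real logs and Cosmos-generated clips; the paths and mix ratio are illustrative only.

```python
import random

# Hypothetical corpora: file paths to real logs and synthetic clips.
real_clips = [f"real/clip_{i:04d}.mp4" for i in range(900)]
synthetic_clips = [f"cosmos/clip_{i:04d}.mp4" for i in range(2100)]

def mixed_batches(real, synthetic, batch_size=32, synth_fraction=0.5):
    """Yield training batches drawn from both pools at a fixed ratio."""
    n_synth = int(batch_size * synth_fraction)
    while True:
        batch = random.sample(synthetic, n_synth) \
              + random.sample(real, batch_size - n_synth)
        random.shuffle(batch)
        yield batch

batches = mixed_batches(real_clips, synthetic_clips)
print(next(batches)[:4])  # peek at one mixed batch
```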

As part of the Alpamayo rollout, Nvidia is releasing an open dataset with over 1,700 hours of driving data. This data was collected across various geographies and conditions and covers rare and complex real-world scenarios.

The company is additionally launching AlpaSim, an open-source simulation framework for validating autonomous driving systems. Available on GitHub, AlpaSim is designed to recreate real-world driving conditions, from sensors to traffic, so developers can safely test their systems at scale.
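
AlpaSim’s actual interface is documented in its GitHub repository; the sketch below only shows the general shape of closed-loop validation, with a stub simulator standing in for the real framework so the example runs on its own.

```python
class StubSimulator:
    """Stand-in for a driving simulator; AlpaSim's real API may differ."""
    def __init__(self, scenario: str):
        self.scenario, self.t = scenario, 0
    def reset(self):
        self.t = 0
        return {"speed_mps": 0.0}
    def step(self, action: str):
        self.t += 1
        obs = {"speed_mps": min(10.0, self.t * 0.5)}
        collided = False                 # a real simulator checks geometry
        done = self.t >= 100 or collided
        return obs, collided, done

def validate(policy, scenarios: list[str]) -> float:
    """Run the driving policy closed-loop and report the pass rate."""
    passed = 0
    for name in scenarios:
        sim = StubSimulator(name)
        obs, collided, done = sim.reset(), False, False
        while not done:
            obs, collided, done = sim.step(policy(obs))
        passed += not collided
    return passed / len(scenarios)

rate = validate(lambda obs: "maintain", ["rainy_merge", "signal_outage"])
print(f"pass rate: {rate:.0%}")
```

Running many such scenarios in batch, rather than one at a time on a test track, is what lets developers validate systems at the scale the framework is built for.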