NVIDIA Physical AI: Scaling Robotics with Omniverse and Digital Twins
NVIDIA has introduced a suite of tools and models designed to transition physical AI from isolated use cases to sophisticated enterprise workloads. During the NVIDIA GTC event, the company showcased a framework where robots, vehicles, and factories are integrated into broader industrial operations through the use of digital twins and frontier AI models.
Central to this development are new frontier models for physical AI, specifically NVIDIA Cosmos 3, NVIDIA Isaac GR00T N1.7, and NVIDIA Alpamayo 1.5. These models are supported by the NVIDIA Physical AI Data Factory Blueprint, an open reference architecture that allows developers to transform compute into high-quality training data.
The data factory blueprint is built on NVIDIA OSMO and NVIDIA Cosmos open world foundation models. It unifies data curation, augmentation, and evaluation into a single pipeline, which enables the generation of diverse datasets from limited real-world inputs. This approach aims to solve the scalability issues associated with real-world data, which is often unpredictable and fragmented.
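The curation-augmentation-evaluation flow described above can be sketched as a minimal pipeline. The function names and data shapes below are hypothetical illustrations of the pattern, not the actual Cosmos or OSMO APIs:

```python
import random

def curate(samples, min_quality=0.5):
    """Keep only real-world samples whose quality score meets a threshold."""
    return [s for s in samples if s["quality"] >= min_quality]

def augment(samples, variants_per_sample=3, rng=None):
    """Generate synthetic variants of each curated sample
    (a stand-in for a world-foundation-model rollout)."""
    rng = rng or random.Random(0)
    out = []
    for s in samples:
        out.append(s)
        for i in range(variants_per_sample):
            variant = dict(s)
            variant["id"] = f'{s["id"]}-aug{i}'
            variant["quality"] = min(1.0, s["quality"] + rng.uniform(-0.1, 0.1))
            out.append(variant)
    return out

def evaluate(samples):
    """Summarize the dataset before it is handed to training."""
    return {
        "count": len(samples),
        "mean_quality": sum(s["quality"] for s in samples) / len(samples),
    }

# A small, fragmented real-world dataset grows into a larger curated one.
raw = [
    {"id": "clip-1", "quality": 0.9},
    {"id": "clip-2", "quality": 0.3},  # dropped during curation
    {"id": "clip-3", "quality": 0.7},
]
dataset = augment(curate(raw))
report = evaluate(dataset)
print(report["count"])  # 2 kept samples x (1 original + 3 variants) = 8
```

The point of unifying the three stages in one pipeline is that each synthetic variant inherits the provenance and quality score of its real-world seed, so evaluation can gate the whole dataset at once.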
Industrial Simulation and Digital Twins
NVIDIA also released the Omniverse DSX Blueprint, a reference architecture that unifies simulation across all layers of an AI factory. This single digital twin allows operators to optimize performance and efficiency for power grids, network loads, thermals, and mechanical systems before physical hardware is installed.
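The "validate before you install" idea can be illustrated with a toy budget check in a simple digital-twin model: confirm that a planned layout stays within the facility's power and cooling envelopes before any hardware ships. All names and figures below are hypothetical, not part of the DSX Blueprint itself:

```python
# Hypothetical facility envelope and planned rack layout.
FACILITY = {"power_kw": 500.0, "cooling_kw": 450.0}

racks = [
    {"name": "rack-A", "power_kw": 120.0, "heat_kw": 110.0},
    {"name": "rack-B", "power_kw": 120.0, "heat_kw": 110.0},
    {"name": "rack-C", "power_kw": 120.0, "heat_kw": 110.0},
]

def validate(racks, facility):
    """Check aggregate power draw and heat load against facility budgets."""
    total_power = sum(r["power_kw"] for r in racks)
    total_heat = sum(r["heat_kw"] for r in racks)
    return {
        "power_ok": total_power <= facility["power_kw"],
        "cooling_ok": total_heat <= facility["cooling_kw"],
        "power_headroom_kw": facility["power_kw"] - total_power,
    }

report = validate(racks, FACILITY)
print(report)  # both budgets fit, with 140 kW of power headroom
```

A real digital twin models transient network loads, thermals, and mechanical behavior rather than static sums, but the payoff is the same: layout mistakes are caught in simulation instead of on the facility floor.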

For large-scale facility management, the NVIDIA Mega Omniverse Blueprint provides a reference architecture for designing and testing robot fleets and AI agents. This allows enterprises to validate operations in a physically accurate digital twin before deploying robots on the factory floor.
KION, in collaboration with Siemens and Accenture, is utilizing this blueprint to create warehouse digital twins. These simulations are used to train and test fleets of NVIDIA Jetson-based autonomous forklifts for GXO, a contract logistics provider.
The Role of OpenUSD and Agentic Frameworks
The scalability of these systems relies on OpenUSD, a common scene-description language. OpenUSD allows teams to integrate computer-aided design (CAD) data, real-world telemetry, and simulation assets into a shared, physically accurate view of the world.
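The core idea, several sources contributing opinions about the same scene objects, with a defined strength ordering, can be sketched in a few lines. This is a deliberately simplified stand-in for USD's layered composition, not the real pxr/OpenUSD API:

```python
def compose(*layers):
    """Merge layers into one scene; earlier layers are stronger,
    mirroring USD's strength-ordered composition."""
    scene = {}
    for layer in reversed(layers):  # apply weakest first, strongest last
        for prim_path, attrs in layer.items():
            scene.setdefault(prim_path, {}).update(attrs)
    return scene

cad_layer = {  # geometry authored from CAD data
    "/Factory/Robot1": {"mesh": "arm.obj", "pose": (0.0, 0.0, 0.0)},
}
telemetry_layer = {  # live pose override from real-world telemetry
    "/Factory/Robot1": {"pose": (1.2, 0.0, 0.4)},
}

# Telemetry is listed first (stronger), so its pose wins over the CAD default,
# while the CAD mesh survives untouched.
scene = compose(telemetry_layer, cad_layer)
print(scene["/Factory/Robot1"])
```

Non-destructive override is what lets CAD, telemetry, and simulation teams each own a layer without editing one another's data, which is why OpenUSD scales across the workflows the article describes.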
To move from design to deployment, developers use the NVIDIA Omniverse Kit software development kit and NVIDIA Isaac Sim to optimize 3D data for real-time rendering and collaborative workflows. Companies such as Fauna Robotics and FANUC are employing this CAD-to-OpenUSD workflow to accelerate the validation of robotic systems.
Open-source agentic frameworks such as OpenClaw are extending the AI stack into operations. OpenClaw uses tools, memory, and messaging interfaces to orchestrate workflows and execute tasks autonomously on dedicated machines.
“With NVIDIA and the broader ecosystem, we’re building the claws and guardrails that let anyone create powerful, secure AI assistants,” said Peter Steinberger, creator of OpenClaw.
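The tools-memory-messaging pattern described above can be shown with a minimal agent loop: the agent invokes a tool, records the result in memory, and reports over a message channel. The names here are hypothetical illustrations, not OpenClaw's actual API:

```python
def check_inventory(item):
    """A trivial example tool backed by a hypothetical stock table."""
    stock = {"forklift-battery": 4}
    return stock.get(item, 0)

TOOLS = {"check_inventory": check_inventory}

class Agent:
    def __init__(self, tools):
        self.tools = tools
        self.memory = []   # record of past tool calls and results
        self.outbox = []   # messages for other systems or agents

    def run(self, tool_name, *args):
        result = self.tools[tool_name](*args)
        self.memory.append((tool_name, args, result))
        self.outbox.append(f"{tool_name}{args} -> {result}")
        return result

agent = Agent(TOOLS)
count = agent.run("check_inventory", "forklift-battery")
print(count)  # 4
```

Keeping tools, memory, and messaging as separate components is what allows such a framework to swap in new tools or transports without changing the agent loop itself.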
Ecosystem Integration and Adoption
Several industrial robot manufacturers are integrating NVIDIA Omniverse libraries and Isaac simulation frameworks to validate production lines. These include ABB Robotics, KUKA, Yaskawa, and FANUC, which collectively have a global installed base of more than 2 million robots. These companies have also integrated NVIDIA Jetson modules into their controllers for real-time AI inference.
The Physical AI Data Factory Blueprint is being adopted by a variety of developers to speed up autonomous vehicle programs and vision AI agents, including:
- FieldAI and Skild AI, who are using NVIDIA Cosmos world models for data generation.
- Hexagon Robotics and Linker Vision.
- Milestone Systems and Teradyne Robotics.
Cloud platforms Microsoft Azure and Nebius are the first to offer the data factory blueprint, providing the compute infrastructure necessary to turn these blueprints into data production engines.
“Together with cloud leaders, we’re providing a new kind of agentic engine that transforms compute into the high-quality data required to bring the next generation of autonomous systems and robots to life,” said Rev Lebaredian, vice president of Omniverse and simulation technologies at NVIDIA.
Generalist AI is also utilizing NVIDIA Cosmos to explore the generation of synthetic data, which is intended to help robots become proficient in tasks ranging from food delivery to supply chain monitoring.
