
The development of intelligent vehicles requires massive volumes of training data reflecting the real-world diversity vehicles encounter on the road. Sensor simulation addresses this challenge by rendering physics-based sensor data in virtual environments. Building on this physically based rendering, World Foundation Models (WFMs) bring new capabilities to sensor simulation, amplifying variation in lighting, weather conditions, and geolocation.
Equipped with these capabilities, developers can conduct large-scale training, testing, and validation of intelligent vehicles without exposing them to rare or hazardous real-world scenarios. The accuracy and diversity of simulated sensor data and environmental interactions are critical to the development of physical AI.
Run Physically Accurate Intelligent Vehicle Simulations at Scale
Developers can start building an intelligent vehicle simulation workflow by following these steps:
Reconstruct Real-World Data in a Digital Twin and Amplify Data Variability
NVIDIA NuRec provides APIs and tools for neural reconstruction and rendering, enabling developers to convert sensor data into high-fidelity 3D digital twins, simulate new events, and re-render datasets from new perspectives.
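Re-rendering a reconstruction from a new perspective amounts to supplying a camera pose that was never driven in the original recording. NuRec's actual API is not shown here; the sketch below is a minimal, self-contained helper (pure Python, hypothetical function name) that computes a laterally shifted camera pose, the kind of novel viewpoint, such as an adjacent lane position, that a digital twin can be re-rendered from.

```python
import math

def shifted_camera_pose(x, y, yaw_deg, lateral_offset_m):
    """Offset a camera pose sideways, perpendicular to its heading.

    Re-rendering a reconstructed scene from poses like this is one way
    to synthesize views the original sensor never captured, e.g. the
    same drive seen from the neighboring lane. Illustrative only; not
    the NuRec API.
    """
    yaw = math.radians(yaw_deg)
    # Unit vector perpendicular to the heading (pointing left).
    nx, ny = -math.sin(yaw), math.cos(yaw)
    return (x + lateral_offset_m * nx, y + lateral_offset_m * ny, yaw_deg)

# Ego camera heading east (yaw 0): shift 3.5 m into the left lane.
print(shifted_camera_pose(10.0, 5.0, 0.0, 3.5))  # (10.0, 8.5, 0.0)
```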
Cosmos Transfer-1 generates new lighting, weather, and terrain based on photorealistic and structural data inputs, transforming a single driving scene into hundreds of scenarios. Developers can use prompts and sensor data as inputs to create different variants of existing scenes.
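The "single scene into hundreds of scenarios" multiplication is combinatorial: each condition axis a prompt can vary multiplies the variant count. The sketch below illustrates that expansion with plain Python; the axis names and prompt format are illustrative assumptions, not the Cosmos Transfer-1 interface.

```python
from itertools import product

# Hypothetical condition axes a WFM prompt could vary; the values are
# illustrative, not the Cosmos Transfer-1 API.
LIGHTING = ["dawn", "midday", "dusk", "night"]
WEATHER = ["clear", "rain", "fog", "snow"]
TERRAIN = ["urban", "highway", "rural"]

def scene_variants(scene_id):
    """Expand one recorded scene into prompt strings covering every
    lighting/weather/terrain combination."""
    return [
        f"{scene_id}: {light} lighting, {weather} weather, {terrain} terrain"
        for light, weather, terrain in product(LIGHTING, WEATHER, TERRAIN)
    ]

variants = scene_variants("scene_0042")
print(len(variants))  # 4 * 4 * 3 = 48 variants from a single scene
```

Adding one more axis, such as four traffic densities, would take the same scene to 192 variants, which is how a modest set of condition axes reaches hundreds of scenarios per drive.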
Both NuRec and Cosmos Transfer-1 are integrated with CARLA—the leading open-source intelligent vehicle simulator. This integration allows developers to use ray tracing to generate sensor data from Gaussian-based reconstructions, and leverage the Cosmos WFM to add scene diversity.
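Within CARLA itself, environmental diversity is exposed through weather parameters. The sketch below builds a grid of presets whose field names mirror CARLA's `carla.WeatherParameters`; actually applying one requires a running CARLA server (`carla.Client("localhost", 2000)`, then `world.set_weather(...)`), so here only the parameter grid is constructed.

```python
def weather_grid():
    """Build a sweep of weather presets for scenario variation.

    Keys mirror carla.WeatherParameters fields; applying a preset to a
    live simulation would use world.set_weather(carla.WeatherParameters(**p)).
    """
    presets = []
    for cloudiness in (0.0, 50.0, 100.0):          # percent cloud cover
        for precipitation in (0.0, 60.0):          # percent rain intensity
            for sun_altitude_angle in (-10.0, 45.0):  # below horizon = night
                presets.append({
                    "cloudiness": cloudiness,
                    "precipitation": precipitation,
                    "sun_altitude_angle": sun_altitude_angle,
                })
    return presets

print(len(weather_grid()))  # 3 * 2 * 2 = 12 weather presets
```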