NVIDIA has introduced a cloud-based system for testing autonomous vehicles using photorealistic simulation – creating a safer, more scalable method for bringing self-driving cars to the roads.
Speaking at the opening keynote of GTC 2018, NVIDIA founder and CEO Jensen Huang announced NVIDIA DRIVE Constellation, a computing platform based on two different servers.
The first server runs NVIDIA DRIVE Sim software to simulate a self-driving vehicle’s sensors, such as cameras, lidar and radar. The second contains a powerful NVIDIA DRIVE Pegasus™ AI car computer that runs the complete autonomous vehicle software stack and processes the simulated data as if it were coming from the sensors of a car driving on the road.
The simulation server is powered by NVIDIA GPUs, each generating a stream of simulated sensor data that feeds into DRIVE Pegasus for processing.
Driving commands from DRIVE Pegasus are fed back to the simulator, completing the digital feedback loop. This “hardware-in-the-loop” cycle, which occurs 30 times a second, is used to validate that algorithms and software running on Pegasus are operating the simulated vehicle correctly.
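The closed loop described above can be sketched in miniature. This is a minimal illustration only: the class and function names are hypothetical stand-ins for the two servers, not part of any NVIDIA API, and the simple kinematics exist just to make the 30 Hz render → process → command cycle concrete.

```python
from dataclasses import dataclass

@dataclass
class SensorFrame:
    """Placeholder for the simulated camera, lidar, and radar streams."""
    camera: bytes
    lidar: bytes
    radar: bytes

class Simulator:
    """Plays the role of the DRIVE Sim server: renders sensor data
    and advances the virtual vehicle when a command comes back."""
    def __init__(self):
        self.state = {"x": 0.0, "speed": 0.0}

    def render(self) -> SensorFrame:
        return SensorFrame(camera=b"...", lidar=b"...", radar=b"...")

    def apply(self, command: dict, dt: float) -> None:
        # Advance the simulated vehicle using the stack's driving command.
        self.state["speed"] = command["throttle"] * 10.0
        self.state["x"] += self.state["speed"] * dt

class VehicleStack:
    """Plays the role of the DRIVE Pegasus server: consumes sensor
    frames and emits driving commands (trivially constant here)."""
    def process(self, frame: SensorFrame) -> dict:
        return {"throttle": 0.5, "steering": 0.0}

def run_hil_loop(sim: Simulator, stack: VehicleStack,
                 hz: int = 30, steps: int = 30) -> dict:
    """One second of the hardware-in-the-loop cycle at 30 Hz."""
    dt = 1.0 / hz
    for _ in range(steps):
        frame = sim.render()            # simulator -> sensor data
        command = stack.process(frame)  # stack -> driving command
        sim.apply(command, dt)          # command closes the loop
    return sim.state

final_state = run_hil_loop(Simulator(), VehicleStack())
```

In the real system each leg of this loop is a hardware boundary between the two servers; the point of the sketch is only the cycle's shape: sensor data out, driving commands back, repeated every frame.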
DRIVE Sim software generates photoreal data streams to create a vast range of testing environments. It can simulate different weather conditions, such as rainstorms and snowstorms; blinding glare at different times of the day, or limited visibility at night; and many different types of road surfaces and terrain. Dangerous situations can be scripted in simulation to test the autonomous car's ability to react, without ever putting anyone in harm's way.
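A scripted test scenario of this kind might be described declaratively. The structure below is purely illustrative: the keys, event names, and values are assumptions for this sketch and do not reflect the actual DRIVE Sim scenario format, which is not published here.

```python
# Hypothetical scenario description: weather, lighting, surface, and a
# sequence of timed hazards staged safely in simulation.
scenario = {
    "weather": "snowstorm",
    "time_of_day": "dusk",        # low sun angle -> camera glare
    "road_surface": "ice",
    "scripted_events": [
        {"t": 4.0, "event": "pedestrian_crosses", "distance_m": 25},
        {"t": 9.5, "event": "lead_vehicle_hard_brake", "decel_mps2": 8.0},
    ],
}

def validate(scenario: dict) -> bool:
    """Minimal sanity check: scripted events must be time-ordered."""
    times = [e["t"] for e in scenario["scripted_events"]]
    return times == sorted(times)
```

Describing hazards as timed, repeatable events is what makes simulation safer than road testing: the same dangerous situation can be replayed thousands of times, under varied weather and lighting, without risk to anyone.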
DRIVE Constellation will be available to early access partners in the third quarter of 2018.