Using Nvidia’s Drive PX2, the Robocar can reach speeds of 199mph, all controlled by Artificial Intelligence
Autonomous vehicles have slowly been entering society, and these cars have to contend with pedestrians, road work, traffic signals, weather and other unpredictable situations. Autonomous vehicles use artificial intelligence algorithms and data fed from a plethora of sensors and cameras to make intelligent decisions about how to manage the road ahead.
During Nvidia’s GTC 2016 keynote address, the company unveiled a new Formula E (auto racing that uses only electric-powered cars) event dubbed Roborace. At Mobile World Congress 2017, then CEO of Roborace Denis Sverdlov provided more details on Robocar, which aims to create “an emotional connection to driverless cars and bring humans and robots closer together to define our future.”
The organisers want to see 10 teams compete with 20 driverless full-size cars (1,000kg, 4.8m long and 2m wide) using Nvidia’s Drive PX 2 supercomputers. While the cars in the race will all be identical, each team has the freedom to develop its own software algorithms, which should result in different tactics being deployed on the tarmac.
According to the company:
“Roborace’s open A.I. platform allows companies to develop their own driverless software and push the limits in an extreme and safe environment. The series is designed to be a competition of intelligence so all teams will use the same “Robocar” to ensure all efforts will be focused on advancing the software for everyday road cars to adopt.”
For nearly two years, the company has been evaluating various technologies to power the impressive car.
Chief Designer Dan Simon, who has previously worked on vehicles for films such as Tron: Legacy and Oblivion, along with the 2011 HRT Formula One car, told Motorsport: “We’re living in a time where the once separated worlds of the automobile and artificial intelligence collide with unstoppable force.”
“My goal was to create a vehicle that takes full advantage of the unusual opportunities of having no driver without ever compromising on beauty,” explained Simon in a Top Gear post. “Racing engineers and aerodynamicists have worked with me from the beginning to strike that balance.”
DevBot is the second iteration from Roborace and the core development platform. The car has four 135kW electric motors, which provide all-wheel drive, and utilises Nvidia’s Drive PX 2, a multi-chip configuration with four high-performance AI processors delivering 24 trillion deep learning operations per second (TOPS), intended to enable Level 5 autonomous driving.
The NVIDIA DRIVE platform combines deep learning, sensor fusion, and surround vision to change the driving experience.
DevBot makes use of five LiDAR sensors, six AI cameras and 18 ultrasonic sensors, as well as GNSS positioning, to reach speeds of 199mph (320kph). This allows algorithms to accurately understand the full 360-degree environment around the car to produce a robust representation, including static and dynamic objects. Use of deep neural networks for the detection and classification of objects dramatically increases the accuracy of the fused sensor data.
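The fusion step described above can be sketched in a few lines. The following is a minimal, illustrative example, not Roborace’s actual stack: detections from different sensors that agree in bearing and range are clustered and averaged, weighted by sensor confidence, so that corroborated objects survive with higher confidence. All class and field names here are assumptions.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    """One object detection in the car's frame (metres, degrees)."""
    sensor: str          # e.g. "lidar_front", "camera_3" (illustrative names)
    bearing_deg: float   # 0 = straight ahead
    range_m: float
    confidence: float    # 0.0 .. 1.0

def fuse_detections(detections, bearing_tol_deg=5.0, range_tol_m=2.0):
    """Greedily cluster detections that agree in bearing and range,
    then average each cluster weighted by sensor confidence.
    Returns one fused (bearing, range, confidence) tuple per object."""
    clusters = []
    for det in sorted(detections, key=lambda d: -d.confidence):
        for cluster in clusters:
            ref = cluster[0]
            if (abs(det.bearing_deg - ref.bearing_deg) <= bearing_tol_deg
                    and abs(det.range_m - ref.range_m) <= range_tol_m):
                cluster.append(det)
                break
        else:
            clusters.append([det])
    fused = []
    for cluster in clusters:
        w = sum(d.confidence for d in cluster)
        fused.append((
            sum(d.bearing_deg * d.confidence for d in cluster) / w,
            sum(d.range_m * d.confidence for d in cluster) / w,
            min(1.0, w),  # agreement across sensors raises confidence
        ))
    return fused
```

Feeding in a LiDAR return and a camera detection of the same car, plus an unrelated ultrasonic return, yields two fused objects, with the corroborated one at full confidence.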
How is the engine performing? Are the tires gripping? What about the brakes? Acceleration? Vibrations? Telemetry? Fuel? What about the wind, rain, and temperature? All this data must be computed, assessed and managed, and algorithm-driven software can process it many times faster than a human could.
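To make that concrete, here is a toy sketch of what one pass over a telemetry sample might look like. The field names, thresholds and flags are all assumptions for illustration, not Roborace’s real schema: the code derives a wheel-slip ratio from wheel versus ground speed and turns raw readings into actionable flags.

```python
def slip_ratio(wheel_speed_mps, ground_speed_mps):
    """Longitudinal slip: 0 = perfect grip, >0 = wheelspin, <0 = lock-up."""
    if ground_speed_mps <= 0:
        return 0.0
    return (wheel_speed_mps - ground_speed_mps) / ground_speed_mps

def assess_telemetry(sample, slip_limit=0.1):
    """Turn one raw telemetry sample into actionable flags.
    The `sample` keys are hypothetical, chosen for this sketch."""
    flags = []
    slip = slip_ratio(sample["wheel_speed_mps"], sample["ground_speed_mps"])
    if abs(slip) > slip_limit:
        flags.append("reduce_torque" if slip > 0 else "release_brakes")
    if sample["motor_temp_c"] > 90.0:
        flags.append("derate_motor")
    if sample["battery_pct"] < 10.0:
        flags.append("energy_save")
    return flags
```

A sample with the wheels spinning 20% faster than the car is actually moving would come back flagged `reduce_torque`; the point is that this loop runs hundreds of times per second, far beyond human reaction.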
Roborace faces many technical challenges, centred on both the hardware and the software. At racing speeds, the sensors need to deliver accurate readings in a fraction of the time available to an ordinary road-going autonomous vehicle.
According to an article in Scientific American, Huei Peng, a University of Michigan mechanical engineering professor, notes that “stationary LiDAR can easily figure out the absolute position of every reflecting object, but on a fast-moving vehicle the software must account for how the world will appear blurred due to the distance traveled between measurements.”
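The correction Peng describes is ego-motion compensation: re-expressing every return in the frame the car occupied at the start of the scan. The sketch below assumes a 100ms scan and straight-line travel; the function name and point format are illustrative.

```python
def motion_compensate(points, speed_mps, scan_period_s=0.1):
    """Re-express LiDAR returns in the frame the car occupied at the
    start of the scan. Each point is (x, y, t_frac): x forward, y left,
    in metres, with t_frac in [0, 1] the fraction of the scan elapsed
    when that return arrived. Assumes straight-line motion along +x."""
    corrected = []
    for x, y, t_frac in points:
        dx = speed_mps * scan_period_s * t_frac  # how far the car has moved
        corrected.append((x + dx, y))            # the car closed dx on the object
    return corrected
```

At Robocar speeds (about 89m/s) a stationary object measured at the end of a 100ms scan appears almost nine metres closer than it was at the start of the scan, which is exactly the “blur” the software has to undo.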
The algorithms interpreting the data generated by the cameras and sensors run in real time, but because the car is travelling at high speed, the distance covered while those computations complete means the car may react too late and end up in the wall.
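The scale of the problem is easy to quantify with a one-line calculation (the function name is ours):

```python
def blind_distance_m(speed_kph, latency_s):
    """Metres travelled before the control loop can act on what it saw."""
    return speed_kph / 3.6 * latency_s

# At Robocar's 320kph top speed, a tenth of a second of
# perception-plus-planning latency costs almost nine metres of track:
blind_distance_m(320, 0.1)  # ~8.9 m
```

A road car at 50kph with the same latency travels well under a metre and a half in that time, which is why latency budgets that are acceptable on the road are not acceptable on the track.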
In an article posted by The Engineer, Bryn Balcombe explained that “The primary focus is on the ability of the AI Drivers to perceive and act within the dynamic environments that we create, and if an AI Driver is more accurate in perception, it has a better chance of taking the correct actions.”
If the software and hardware developers manage to figure this out, the developers of Roborace want to licence the technology to conventional autonomous car manufacturers. Chris Gerdes, a Stanford University professor of mechanical engineering, said in a telephone conversation with Scientific American, “By studying racing, we can learn a lot about how you would control a car through adverse conditions that you might get in everyday driving.”
In February 2017, Roborace held its first ‘real race’ on the Formula E circuit at the Buenos Aires ePrix, with the two cars travelling at a cautious pace and a top speed of 115mph. Unfortunately, one of the cars crashed and was withdrawn. The event emphasised that driverless technology is still a developing field, one that could struggle to compete with the high speeds of Formula One. Other test tracks have included Michelin’s testing ground in Ladoux and the Silverstone Stowe Circuit, and further test runs are planned at additional Formula E events, including New York.
Roborace is still in development as a series, but the goal is to have several teams racing against each other, each with the same car design, but writing their own software.
Artificial intelligence is constantly improving. No matter how sophisticated it becomes, however, we’re not at the point where AI can be as interesting, engaging, or as impulsive as a human.
Traditional car racing has always combined engineering and performance with a driver’s skill and personality; Roborace will certainly change that status quo. As a throw-down of technology with high-tech motor-geeks at the helm, it becomes a contest of programming skill and machine learning engines, which changes the fabric of racing. Despite describing the Robocars as “fascinating,” Mercedes-AMG’s Team Principal, Toto Wolff, highlighted the importance of a human driver over AI as the key to exciting racing: “Formula 1 is an engineering world championship and a driving world championship. We like the gladiators in the car. With that bit missing, it’s something completely different.”
Despite widespread concerns and cynics, there are benefits to a driverless race series. Formula 1 world champion Nico Rosberg said in an interview with TechRadar, “It’s the drivers that are the heroes. It’s not about the cars, it’s about the drivers.”
“For the road it’s an important concept for the future that needs attention. Even I would like the occasional driverless car because sometimes I just don’t want to be driving.”
Roborace is all about computer programmers whose code unleashes speed, efficiency and precision in order to take the chequered flag.
To quote Roborace’s new CEO Lucas di Grassi: “Motorsport will have to split into the pure sport, and a technology-driven series, and we want to be this technology-driven series.”
There’s no official date for when Robocars will be ready to actually race. The organisers’ optimistic plan to field 20 driverless, fully autonomous cars in a race in 2018 feels highly unlikely at this point.
It is hard to predict how successful Roborace will be, but I can certainly say that this ‘sport’ will find a niche and a legion of fans; besides, it should be fun to see how AI-powered racing matures over the decades.
Resources & Citations: