AI in self-driving cars – sci-fi no longer
Intelligent machines powered by artificial intelligence (AI) – computers that can learn, reason and interact with people and the surrounding world – are no longer science fiction. Thanks to a new computing model called deep learning, running on powerful graphics processing units (GPUs), AI is transforming industries from consumer cloud services to healthcare to factories and cities.
Many of these applications are already in place, providing new services to millions of people around the world. However, no industry is poised for as significant a change as the $10 trillion transportation industry. The automotive market is next: advanced self-driving vehicles hold the promise of dramatically safer driving and new mobility services.
Creating a safe, reliable self-driving car, however, is an immense challenge. It is simply not possible to hand-program a computer to safely handle the nearly infinite variety of situations a vehicle encounters on the road. AI offers a way forward: vehicles can be trained from past experience, enabling them to learn much faster and with higher accuracy than hand-coded software.
Ensuring that a car can react correctly in a fraction of a second to changing circumstances requires interpreting the torrent of data rushing at it from a vast range of sensors, such as cameras, radar, lidar and ultrasonics. First and foremost, this requires a massive amount of computational horsepower.
The first step in making use of this data is perception – that is, understanding what the sensors pick up and the cameras see. This is best achieved through an AI technique called deep learning, which is used to train cars to recognize different types of objects. Much like a child learns through experience, AI systems can be trained to recognize cars, trucks, pedestrians, bikers, trees, unexpected deer and the like by feeding deep neural networks millions of images of objects and scenes.
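The training idea can be sketched in a few lines. The snippet below is a deliberately tiny stand-in for the deep neural networks described above: a single logistic "neuron" learns to separate two made-up object classes. The features, class statistics and model are all invented for this sketch; real systems train deep convolutional networks on millions of camera images. The point is only the workflow of learning from labelled examples.

```python
import math
import random

random.seed(0)

# Invented toy features instead of camera pixels: height/width ratio,
# speed in m/s, and footprint in m^2 for two object classes.
def make_example(label):
    if label == 1:  # "pedestrian": tall, slow, small
        x = [random.gauss(2.5, 0.3), random.gauss(1.2, 0.4), random.gauss(0.5, 0.1)]
    else:           # "vehicle": wide, fast, large
        x = [random.gauss(0.6, 0.2), random.gauss(12.0, 3.0), random.gauss(4.0, 0.8)]
    return x, label

data = [make_example(random.choice([0, 1])) for _ in range(200)]

w, b, lr = [0.0, 0.0, 0.0], 0.0, 0.01

def predict(x):
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    z = max(-60.0, min(60.0, z))          # keep exp() in a safe range
    return 1.0 / (1.0 + math.exp(-z))     # probability of class 1

# Stochastic gradient descent on the log-loss: each labelled example
# nudges the weights, and the classifier gradually learns the classes.
for _ in range(50):
    for x, y in data:
        err = predict(x) - y
        for i in range(3):
            w[i] -= lr * err * x[i]
        b -= lr * err

accuracy = sum((predict(x) > 0.5) == (y == 1) for x, y in data) / len(data)
print(f"training accuracy: {accuracy:.2f}")
```

The same feed-examples-and-adjust-weights loop, scaled up to deep networks and GPU-sized datasets, is what lets a car learn to recognize objects it was never explicitly programmed to handle.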
Another critical aspect is reasoning – enabling the self-driving car not only to understand its environment but also to anticipate how the next moments play out, so that it can proceed on a safe, comfortable path forward.
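A minimal stand-in for this anticipation step, assuming each tracked object's position and velocity have already been estimated by perception, is a constant-velocity extrapolation. Production systems use learned prediction models; this sketch (with invented coordinates) only illustrates the idea of reasoning about where objects will be, not just where they are:

```python
# Each tracked object carries a position and velocity estimate from
# perception. Extrapolating a moment ahead lets the planner avoid
# paths that will soon be occupied.
def predict_position(pos, vel, dt):
    """Constant-velocity prediction: pos and vel are (x, y), dt in seconds."""
    return (pos[0] + vel[0] * dt, pos[1] + vel[1] * dt)

# A pedestrian at (10, 2) walking 1.5 m/s in +y will be at (10, 5) in 2 s.
future = predict_position((10.0, 2.0), (0.0, 1.5), 2.0)
print(future)  # (10.0, 5.0)
```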
A third aspect required for self-driving cars is a highly detailed, or HD, map. In order to drive with precision, the car needs to know exactly where it is at all times. By comparing what its sensors perceive against the known map, a car can determine its location with centimeter accuracy. Any differences between the map stored in the cloud and the real world are communicated back to the cloud to keep the HD map up to date. Again, AI plays a major role in creating and maintaining these highly detailed mapping data sets.
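To make the map-matching idea concrete, here is a deliberately simplified sketch; the coordinates and the assumption of a known heading are invented for illustration. If perception reports landmarks (say, sign posts) at known offsets from the vehicle, and the HD map stores those landmarks' global coordinates, then the vehicle's global position is the translation that best aligns the two sets, which for a known heading is just the average of the per-landmark differences:

```python
# Least-squares localization against an HD map, reduced to its simplest
# form: with heading known, the best-fit vehicle position is the mean of
# (map position - observed offset) over matched landmarks.
def localize(map_points, observed_offsets):
    n = len(map_points)
    x = sum(m[0] - o[0] for m, o in zip(map_points, observed_offsets)) / n
    y = sum(m[1] - o[1] for m, o in zip(map_points, observed_offsets)) / n
    return (x, y)

map_pts = [(105.0, 210.0), (120.0, 205.0), (98.0, 230.0)]  # global coordinates
offsets = [(5.0, 10.0), (20.0, 5.0), (-2.0, 30.0)]         # relative to the car
print(localize(map_pts, offsets))  # (100.0, 200.0)
```

Real localization must also estimate orientation and cope with noisy, mismatched landmarks, but the core idea is the same: the map and the sensor view over-determine the car's pose, and the surplus is what flags map changes worth reporting back to the cloud.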
Once we know where our car is located, what the topology of the roadway looks like, and are tracking all the objects moving in the scene (cars, trucks, pedestrians, bicyclists), we can plan a safe path forward. This aspect of the self-driving pipeline also incorporates AI, as we can train neural networks to understand and anticipate human behavior.
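The planning step can be sketched with a toy search. The snippet below, an illustrative stand-in rather than any production method, runs a breadth-first search over a small occupancy grid in which cells occupied by tracked objects are blocked; real planners additionally optimize for smoothness and comfort and account for the predicted motion of other road users:

```python
from collections import deque

# Breadth-first search over a coarse occupancy grid: 1 = occupied by a
# tracked object, 0 = free. Returns the shortest collision-free path
# from start to goal as a list of (row, col) cells, or None.
def plan(grid, start, goal):
    rows, cols = len(grid), len(grid[0])
    queue = deque([(start, [start])])
    seen = {start}
    while queue:
        (r, c), path = queue.popleft()
        if (r, c) == goal:
            return path
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0 \
                    and (nr, nc) not in seen:
                seen.add((nr, nc))
                queue.append(((nr, nc), path + [(nr, nc)]))
    return None  # no collision-free path exists

grid = [
    [0, 0, 0, 0],
    [1, 1, 0, 1],   # a row of obstacles with a single gap
    [0, 0, 0, 0],
]
path = plan(grid, (0, 0), (2, 0))
print(path)
```

The planner routes through the one free gap in the obstacle row. Swapping the static grid for one built from predicted object positions is what connects this step to the anticipation described earlier.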
Collaboration between Bosch and NVIDIA helps advance self-driving car technology for the global auto market
To achieve the goal of deploying advanced AI self-driving computers in every part of the globe, Bosch and NVIDIA will collaborate to develop AI self-driving computers for production cars, based on NVIDIA’s DRIVE PX 2 AI platform and software that integrates Bosch radar and other sensors. Building on NVIDIA’s advanced AI technologies, Bosch adds 130 years of automotive experience.
AI technology for self-driving cars
“So far, driver support technology has been developed evolutionarily, i.e. module by module (e.g. lane keeping, distance keeping),” adds Michael Fausten, VP Vehicle Systems Engineering and Automated Driving at Bosch. He is convinced that the rapidly maturing AI technology has the potential to revolutionize the automotive industry by providing a holistic autonomous driving platform: “AI and deep learning are core to this, and a central feature will be the capability to continually and safely update new algorithms and models on the cars, even in the field. A great advantage of AI is that the system gets continuously smarter with each additional experience. As new experience is gathered, a new neural network can be loaded into the car, giving it new capabilities and higher performance.”
But this is not all – we also have to look at the infrastructure and processes required to make this work on a global level. This is where a partnership like the one between Bosch and NVIDIA adds great value. In the coming months, we need to work on aspects such as:
- How to develop and integrate the best sensor technologies as input feeds to AI?
- How to connect AI on the car with car systems and intelligent solutions in the cloud?
- How to deal with the massive data volumes we need to capture for each new autonomous car model during the test phases?
- How to leverage this data to ensure validation in different regulatory environments?
- How to manage the impact of autonomous vehicle fleets on the existing service and repair infrastructure?
These questions will have to be answered step by step as we gain experience with these new technologies. A first milestone for us will be Bosch ConnectedWorld 2017, where Jen-Hsun Huang, NVIDIA Co-Founder and CEO, will discuss our joint experience thus far!