NVIDIA has announced a new AI computing platform that the company claims will make the factory production of fully autonomous vehicles possible.
Codenamed Pegasus, the system compresses four AI processors into a device small enough to carry, which the company says meets all the computational requirements of a self-driving car. NVIDIA plans to harness this power initially to create a new class of automated taxis.
“The Pegasus is the world’s first computer designed for production deployment of robotaxis,” NVIDIA’s founder and CEO Jensen Huang said at GTC Europe 2017 in Munich, the home of BMW.
Driverless cars are normally categorised by one of six levels of sophistication, from level zero to level five.
Level-zero vehicles rely on the full human control that’s been used since cars first became available to the public more than a century ago. Level one adds automation to a specific function, such as acceleration, while still requiring someone behind the wheel. Level two combines two or more automated functions, such as lane-centering with adaptive cruise control, though the driver must stay engaged and ready to take over.
Level three means that the system can monitor the surrounding environment and make certain decisions entirely independently, for example, when to overtake. Level-four vehicles can handle the entire driving task without intervention, but only within limited conditions, such as a defined area or favourable weather.
But level five is the ultimate goal: a fully autonomous vehicle without steering wheels, pedals or mirrors. NVIDIA claims Pegasus will make this objective a reality.
These vehicles require enormous computational power.
A vast number of radars, lidars, sensors and high-resolution, 360-degree surround cameras are needed to accurately track surroundings, localise the vehicle and plan and control the journey. Together they generate a quantity of information equivalent to a small data centre.