As autonomous driving technology makes vehicles increasingly independent of their human drivers, advanced driver assistance systems (ADAS) will no longer be able to rely on today's distributed processors and smart sensors, according to a new report.
The report, from ABI Research, forecasts that 13 million vehicles with centralized ADAS platforms will ship in 2025.
As computers take over more vehicle functions -- such as acceleration, braking and steering -- and as more sensors and machine vision systems (cameras) are added, automakers will need to adopt new platforms built on powerful, centralized processors and high-speed, low-latency networking.
"The new centralized ADAS architectures will unify sensing, processing, and actuation to deliver integrated decision-making for smooth path planning and effective collision avoidance," said ABI Research analyst James Hodgson.
System-on-chip (SoC) makers and other auto industry suppliers have already begun announcing centralized autonomous driving platforms with common themes. These include vastly more powerful processing platforms averaging between 8 and 12 teraflops (TFLOPS) -- orders of magnitude beyond the typical smart sensor currently deployed in ADAS, Hodgson said.
NXP, for example, announced its BlueBox platform; Mobileye released its EyeQ5 SoC; and NVIDIA announced its Drive PX 2, a liquid-cooled supercomputer for cars with 2.5 teraflops of processing power.
Delphi Automotive and Mobileye also announced a partnership to co-develop what they described as the market's first turnkey fully autonomous driving system for automakers.
Mobileye also has separate self-driving technology partnerships with both BMW and Intel to produce self-driving cars by 2021.
Mobileye has "partnerships here, there and everywhere. It's quite curious, and I'm not sure what they're attempting to achieve," Hodgson said. "I always take supply agreements with a grain of salt because without [carmakers buying in] it only will go so far."
In some ways, the processing capabilities of self-driving cars will depend on outside network communications. For example, vehicle-to-vehicle (V2V) and vehicle-to-infrastructure (V2I) communications will provide insight into what other autonomous vehicles plan to do next as well as enable cars to see around the next corner.
While autonomous technology has come a long way in a relatively short time, issues remain to be addressed. These include the vehicle's underlying telematics infrastructure, a challenge carmakers have recognized.
One of the earliest examples of a centralized autonomous driving system from a carmaker was Audi's trunk-filling zFAS controller, which Audi developed with NVIDIA and Delphi and released last year, Hodgson said.
"We are fast approaching the end of what can be achieved in automation within the confines of legacy architectures," Hodgson said. "Vendors across the ecosystem need to take this time to plan accordingly in order to appropriately manage the industry transition toward centralized ADAS architectures."
This story, "Why distributed networks and smart sensors won’t survive in self-driving cars" was originally published by Computerworld.