As autonomous vehicles continue to advance, they need better sensors to navigate complex environments and avoid accidents. These sensors must collect and interpret data accurately in real time and withstand the harsh conditions vehicles encounter on the road. Yole Group, a company recognized for its expertise in the analysis of markets, technological developments, and supply chains, discusses the road to autonomy and the improvements required in sensors and system architecture.
Each newly implemented function directly adds to the need for more sensors of a wider variety. More ADAS cameras are needed for accurate detection and classification of objects all around the car, as well as for lane and traffic-sign detection. Radars are key to reliably detecting any object around the car, and LiDARs are increasingly added to sharpen the positioning of detected objects and improve real-time mapping accuracy.
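To make this complementarity concrete, here is a minimal late-fusion sketch in Python. Everything in it is illustrative: the detection fields, class labels, and angular matching threshold are assumptions rather than any vendor's actual interface, and a production stack would also bring LiDAR points into the association step.

```python
from dataclasses import dataclass

# Hypothetical, simplified sensor outputs: real ADAS stacks use far richer
# representations (covariances, timestamps, tracker IDs, 3D boxes).

@dataclass
class CameraDetection:      # the camera classifies *what* an object is
    azimuth_deg: float
    label: str              # e.g., "pedestrian", "vehicle", "traffic_sign"

@dataclass
class RadarDetection:       # the radar measures *where* it is and how fast it moves
    azimuth_deg: float
    range_m: float
    velocity_mps: float     # radial velocity from Doppler

def late_fusion(cams, radars, max_azimuth_gap_deg=2.0):
    """Associate camera labels with radar range/velocity by angular proximity."""
    fused = []
    for cam in cams:
        closest = min(radars,
                      key=lambda r: abs(r.azimuth_deg - cam.azimuth_deg),
                      default=None)
        if closest and abs(closest.azimuth_deg - cam.azimuth_deg) <= max_azimuth_gap_deg:
            fused.append((cam.label, closest.range_m, closest.velocity_mps))
    return fused

# Example: a pedestrian classified by the camera, positioned by the radar.
print(late_fusion([CameraDetection(10.1, "pedestrian")],
                  [RadarDetection(10.0, 25.4, -1.2)]))
```

In a real pipeline, association would rely on tracking over time rather than a single angular gate, with LiDAR returns refining the fused position estimate.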
Adrien Sanchez, Technology & Market Analyst, Computing & Software, at Yole Intelligence, stated, “The path toward car autonomy is certainly ongoing, with new functions added regularly. It started in the 2010s with basic functionalities such as Adaptive Cruise Control (ACC) and Advanced Emergency Braking (AEB); it is still in progress with the addition of functionalities such as Highway Pilot; and we expect it to continue in the coming years with functions such as City Pilot, where a car can be fully autonomous in a specific area.”
More Sensors, More Data, and More Software Have a Direct Consequence: More Centralization
Pierrick Boulay, Senior Technology & Market Analyst in the Photonics and Sensing Division at Yole Intelligence, commented, “On top of this growing number of sensors, which also tend to have higher and higher resolution, software complexity is increasing sharply. Autonomous driving in an open world is a very difficult problem, and reaching a level of safety high enough to convince people to put their lives in the hands of a machine is incredibly complex.”
As we have seen, this has led to a multiplication of sensors and a growing number of software layers for a more accurate understanding of the environment, as well as to the introduction of redundancy to prevent crashes caused by system failure. To handle this growing amount of data and pipeline complexity, the required computing power has increased dramatically. This has a direct impact on car architecture, shifting from a decentralized architecture with many small MCUs to a centralized one with a few powerful processors in an ADAS domain controller.
Centralization is certainly the next step, driven by the need for sensor fusion. With the growing number of sensors, nobody wants to keep a model with one processor per sensor. The only remaining question is: what will be the pace of the transformation?
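As a rough picture of what the centralized model implies, the sketch below has several sensor drivers stream timestamped frames to a single domain-controller loop instead of each running on its own MCU. The sensor names, frame rates, and queue-based transport are assumptions chosen for readability, not a description of any production E/E architecture.

```python
import queue
import threading
import time

frames = queue.Queue()  # shared transport toward the domain controller

def sensor(name, period_s):
    """Stand-in for a sensor driver streaming raw frames to the controller."""
    for i in range(3):
        frames.put((time.time(), name, f"frame-{i}"))
        time.sleep(period_s)

def domain_controller(n_expected):
    """One powerful processor: all streams converge and are fused in one place."""
    for _ in range(n_expected):
        ts, name, payload = frames.get()
        # In a real stack, perception, fusion, and planning would run here.
        print(f"{ts:.3f} fused input from {name}: {payload}")

threads = [threading.Thread(target=sensor, args=(n, p))
           for n, p in [("camera", 0.033), ("radar", 0.05), ("lidar", 0.1)]]
for t in threads:
    t.start()
domain_controller(n_expected=9)   # 3 sensors x 3 frames each
for t in threads:
    t.join()
```

The design point the sketch makes is that fusion logic lives in one process with a global view of all streams, which is exactly what a per-sensor-processor architecture cannot offer.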
Video creation using smartphones is at an all-time high due to the short-video craze. The emergence of TikTok, the favored social media of the younger generation, has been quickly copied by large incumbents, resulting in YouTube Shorts and Facebook Reels. This demand for high-quality video hardware was temporarily over-served during the post-lockdown period of 2021, and the first three quarters of 2022 therefore saw slightly weaker demand. We have seen similar but even more dramatic patterns with laptops and tablets, in which cameras played a central role during remote work and school teleconferencing.
Another market experiencing explosive growth right now is automotive CIS. The Covid-19 era marked a turning point in consumer behavior, with demand shifting to Connected, Autonomous, Shared, and Electric (CASE) vehicles loaded with semiconductor-based features. Overall, the appetite for cameras remains high, but the dominance of the weakened smartphone market translates into the disappointing -0.7% CIS growth expected for 2022.
An Evolving Mission, Big Consequences
Cédric Malaquin, Team Lead Analyst of the RF activity within the Power & Wireless Division at Yole Intelligence, stated, “At present, radars are used as intelligent sensors with enough processing capability to output a classified object list, though one limited in the number of targets. This approach enables basic ADAS functionalities such as AEB and ACC to be deployed. As use cases grow in complexity (think of Automatic Lane Change (ALC)), along with car-rating scenarios, the mission of radar sensors is evolving.”
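For a concrete picture of the status quo described in the quote, here is a hypothetical sketch of such a classified object list. The field names, class set, and target cap are illustrative only; actual smart-radar interfaces differ by vendor.

```python
from dataclasses import dataclass
from enum import Enum

class ObjectClass(Enum):
    VEHICLE = 1
    PEDESTRIAN = 2
    BICYCLE = 3
    UNKNOWN = 4

@dataclass
class RadarObject:
    object_id: int
    range_m: float              # distance to the target
    azimuth_deg: float          # bearing relative to boresight
    velocity_mps: float         # radial velocity from Doppler
    object_class: ObjectClass

MAX_TRACKED_OBJECTS = 32        # illustrative cap on the output list size

def to_object_list(tracked: list[RadarObject]) -> list[RadarObject]:
    """Truncate to the sensor's limited output capacity."""
    return tracked[:MAX_TRACKED_OBJECTS]
```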
This is no longer about providing range and velocity for a small number of objects. Radar sensors are evolving to perceive the entire scene around the car. The goal is free-space mapping by radar alone, for obvious reasons of redundancy. With such a sensor, OEMs will have access to path planning at any time, in any driving scenario. Centralization seems an obvious way to bridge the gap, as it resonates with resource optimization. But it is also a massive change in architecture, raising multiple questions such as the partitioning of radar signal modulation, data processing, data transport, and even data fusion.
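To illustrate the idea of radar-only free-space mapping, the sketch below rasterizes raw radar returns into a 2D occupancy grid that a path planner could query. The grid size, resolution, and detection format are assumptions, and the sketch omits the ray casting a real system would use to mark cells between the sensor and each return as observed-free.

```python
import math

GRID = 40          # cells per side
RES = 0.5          # metres per cell; the grid spans 20 m x 20 m ahead of the car

def free_space_grid(detections):
    """detections: iterable of (range_m, azimuth_deg) radar returns."""
    grid = [[0] * GRID for _ in range(GRID)]          # 0 = free, 1 = occupied
    for rng, az in detections:
        x = rng * math.cos(math.radians(az))          # forward distance
        y = rng * math.sin(math.radians(az))          # lateral offset
        col = int(x / RES)
        row = int(y / RES) + GRID // 2                # centre the car laterally
        if 0 <= row < GRID and 0 <= col < GRID:
            grid[row][col] = 1
    return grid

# A planner could then check candidate trajectories cell by cell:
g = free_space_grid([(5.0, 0.0), (8.2, 12.5), (3.1, -20.0)])
print(sum(map(sum, g)), "occupied cells")
```

The partitioning question raised above is visible even here: this rasterization could run at the edge, inside the radar sensor, or centrally, in the domain controller on raw detections, with very different data-transport loads in each case.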
Meanwhile, as these questions are resolved, edge processing still has room to evolve beyond its current capabilities. In any case, the importance of software in radar sensing is growing, and multiple industry players are positioning themselves for one approach or the other. It will be interesting to track how this industry evolves over the next few years.