Figure 1: Vision cameras, radar, and lidar systems will be key
components for AVs.
No single type of sensor can fulfill all functionality requirements in all weather and light conditions. Therefore, most companies are pursuing a combination of these sensors to create a 360-degree view around the vehicle. Lidar uses light in the form of a pulsed laser to measure ranges with extreme accuracy. Radar is used for the detection and tracking of objects—for example, determining the velocity and direction of a vehicle, as well as the range and angle to objects. And while radar doesn't provide the granularity of lidar, it works well in adverse weather conditions.
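The ranging principles mentioned above can be sketched in a few lines. This is an illustrative example, not from the source: lidar range follows from the round-trip time of a light pulse, and radar radial velocity follows from the Doppler shift (the 77 GHz carrier used below is a common automotive radar band, assumed here for illustration).

```python
# Speed of light in m/s
C = 299_792_458.0

def lidar_range(round_trip_time_s: float) -> float:
    """Time-of-flight ranging: distance = c * t / 2
    (the pulse travels to the target and back)."""
    return C * round_trip_time_s / 2.0

def radar_radial_velocity(doppler_shift_hz: float, carrier_freq_hz: float) -> float:
    """Doppler relation: v = c * f_d / (2 * f_c),
    positive for a target closing on the sensor."""
    return C * doppler_shift_hz / (2.0 * carrier_freq_hz)

# A ~667 ns echo corresponds to roughly 100 m of range.
print(lidar_range(667e-9))                  # ~99.98 m
# A 5.1 kHz Doppler shift at 77 GHz is roughly 9.9 m/s of closing speed.
print(radar_radial_velocity(5.1e3, 77e9))   # ~9.93 m/s
```

These two relations are why lidar resolution is limited mainly by timing precision, while radar directly measures relative velocity—something a single lidar return cannot provide.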
Figure 2: Thermal simulation of a rotating lidar mounted on a
vehicle. The vehicle is moving at 10 m/s and the ambient temperature is 25°C.
Design goals for developing automotive lidar systems, radars, and cameras largely center on reducing size and cost, without sacrificing the resolution and range requirements necessary to support various levels of vehicle autonomy. Additionally, when integrated, these sensors must function reliably in an automotive environment.
Challenges in Sensor Modeling
Self-driving cars will require a broad spectrum of sensors to serve as their eyes and ears. Today, a regular vehicle (non-autonomous, with perhaps a handful of driver-assist features) has anywhere from 60 to 100 onboard sensors, and that number will
only increase as cars get “smarter.” Perhaps that is why BCC
Research predicts that the global market for automotive sensors
will reach $43.3 billion by 2021, up from $26.3 billion in 2016.
Among these sensors, three types stand out as must-haves for
autonomous driving—lidar (light detection and ranging), radar,
and image cameras (Figure 1).