
  • 2022-09-23 10:10:12

Imaging Radar: One Sensor Controls All

There is still some confusion in the industry about the different roles of the three main sensors (camera, radar, and LIDAR) in automobiles, and how they each meet the sensing needs of advanced driver assistance systems (ADAS) and autonomous driving.


Recently, I had an interesting discussion with a friend of mine who knew I was working on TI millimeter wave (mmWave) sensors for radar in ADAS systems and autonomous vehicles (AVs).


Every time he reads an article about how self-driving cars behave in different driving situations (such as obstacle detection), he takes the time to tease me about it. One of our conversations went as follows:


Matt: "If that car had LIDAR on it, it would have been easy to identify objects in the middle of the lane."


Me: "I still don't agree with that."


Matt: "What?! Why don't you agree? That car has a camera sensor and a radar sensor, but the ADAS system still completely fails to sense the car in the middle of the lane."


Me: "When reading about these recent events, you notice that if the camera is constantly exposed to harsh light and other factors, it can prevent it from seeing objects on the road. They are very sensitive to high-contrast light and low visibility conditions, such as heavy fog, rain and snow. In this case, the radar sensor may indeed identify the target.”


Matt: "Nevertheless, we still encounter different situations that these ADAS and AV systems seem to struggle with. So what's the problem?"


Me: "The ADAS decision-making system seems to rely on the camera as the primary sensor to determine if the target is really there, or if it's a false alarm."


Matt: "Then car radar and cameras can't be trusted. So LIDAR is the only reliable sensor. Am I right?"


Me: "Not exactly. Although LIDAR is not as sensitive to visibility as a camera, it is sensitive to weather conditions such as fog, rain, snow, etc. In addition, LIDAR's cost is also high, which may cause it to initially only be used in relatively high-end 4 Level 5 and Level 5 autonomous vehicles.”


Matt: "That's it! There is no single sensor that makes an autonomous vehicle truly reliable. We have to use a combination of all three, but that also means that autonomous vehicles are going to be very expensive."


Me: "You're only partially right. Level 4 and 5 autonomous vehicles may need three sensors: cameras, LIDAR, and radar to provide a highly reliable and fully autonomous driving experience. However, for more vehicles that require partially autonomous driving and Imaging radars using TI mmWave sensors enable high performance, cost-effectiveness, and widespread adoption of ADAS capabilities for economical Class 2 and 3 vehicles that are already in mass production.”


So, what is imaging radar?


Imaging radar, as I explained to Matt, is a subset of radar, so named because of its high angular resolution, which provides sharp images.


Imaging radar is enabled by a sensor configuration in which multiple low-power TI mmWave sensors are cascaded together and operate synchronously as a single unit. The cascade provides many receive and transmit channels, which significantly improve angular resolution and radar range performance. When mmWave sensors are cascaded, their integrated phase shifters can be used for beamforming, extending the detection range to 400 meters. Figure 1 shows the cascaded mmWave sensors and their antennas on the evaluation module.



Figure 1: An imaging radar evaluation module with four cascaded TI mmWave sensors
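
As a rough illustration of why cascading helps, the following sketch computes the number of MIMO virtual channels a cascade synthesizes and the approximate boresight angular resolution of a uniform linear virtual array. The specific numbers (3 transmit and 4 receive channels per chip, an 86-element azimuth virtual array at half-wavelength spacing) are illustrative assumptions, not figures taken from this article.

import math

def virtual_channels(n_tx: int, n_rx: int) -> int:
    """A MIMO radar synthesizes n_tx * n_rx virtual receive channels."""
    return n_tx * n_rx

def ula_resolution_deg(n_elements: int, spacing_wavelengths: float = 0.5) -> float:
    """Approximate boresight angular resolution of a uniform linear array:
    theta ~ lambda / (N * d), with element spacing d given in wavelengths."""
    return math.degrees(1.0 / (n_elements * spacing_wavelengths))

# Assumed example: four cascaded chips, each with 3 TX and 4 RX channels.
n_virtual = virtual_channels(n_tx=4 * 3, n_rx=4 * 4)
print(f"virtual channels: {n_virtual}")  # 192

# If, say, 86 of those virtual elements form the azimuth array at lambda/2 spacing:
print(f"azimuth resolution ~ {ula_resolution_deg(86):.1f} degrees")  # ~1.3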

Millimeter Wave Technology for Imaging Radar


Until recently, a typical radar sensor was rarely considered as the primary sensor in a vehicle, mainly because of its limited angular resolution performance.


Angular resolution refers to the ability to distinguish objects within the same range and at the same relative velocity.
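
To make that definition concrete, here is a small sketch (not from the article) that converts an angular resolution into the minimum lateral spacing two objects at the same range and relative velocity must have before the radar can separate them; the 4-degree figure for a conventional front radar is a rough assumed value.

import math

def min_separation_m(range_m: float, angular_resolution_deg: float) -> float:
    """Smallest lateral spacing (meters) at which two targets at the same
    range and relative velocity can still be resolved in angle."""
    return range_m * math.radians(angular_resolution_deg)

# Assumed comparison at 100 m: a conventional front radar (~4 degrees)
# versus an imaging radar (~1 degree).
for res_deg in (4.0, 1.0):
    sep = min_separation_m(100.0, res_deg)
    print(f"{res_deg:.0f} deg resolution -> targets must be ~{sep:.1f} m apart at 100 m")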


A common use case that highlights the benefit of imaging radar sensors is the ability to identify stationary objects at high resolution. Typical mmWave sensors offer high velocity and range resolution and can easily identify and distinguish moving objects, but their ability to resolve static objects is very limited.


For example, for a sensor to "see" a stopped vehicle in the middle of a lane and distinguish it from a light pole or a fence, the sensor needs sufficient angular resolution in both elevation and azimuth.


Figure 2 shows a car stalled in a tunnel with smoke continuously rising from it. The vehicle is parked about 100 meters away, and the tunnel is 3 meters high.


Figure 2: The front radar of an oncoming vehicle requires high enough angular resolution to distinguish the tunnel from the stopped vehicle. Millimeter-wave sensors can penetrate visibility impairments such as smoke.

Figure 3: How mmWave sensors use multiple-input multiple-output (MIMO) radar to achieve high elevation resolution.


In order to identify a vehicle in the tunnel shown in Figure 2, sensors need to distinguish it from the tunnel roof and walls.


Classifying this scene requires the following elevation and azimuth resolutions:


Elevation angle = arctan(2 m / 100 m) = 1.14 degrees


Azimuth angle = arctan(3.5 m / 100 m) = 2 degrees


where 2 m is the height of the tunnel minus the height of the vehicle, 100 m is the distance between the oncoming vehicle carrying the imaging radar and the vehicle parked in the tunnel, and 3.5 m is the lateral distance between the parked vehicle and the tunnel wall.
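
A quick way to check these numbers is to compute the angle each clearance subtends when viewed from 100 meters; the short sketch below reproduces the two values above (within rounding).

import math

def subtended_angle_deg(separation_m: float, range_m: float) -> float:
    """Angle (degrees) subtended by a separation seen from a given range."""
    return math.degrees(math.atan2(separation_m, range_m))

# Tunnel scene from the article: 2 m of clearance between the car and the
# tunnel ceiling, and 3.5 m of clearance to the tunnel wall, seen from 100 m.
print(f"required elevation resolution: {subtended_angle_deg(2.0, 100.0):.2f} deg")  # ~1.15
print(f"required azimuth resolution:   {subtended_angle_deg(3.5, 100.0):.2f} deg")  # ~2.00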


Relying on optical sensors can be challenging in certain weather and visibility conditions. Smoke, fog, inclement weather, and high-contrast lighting are all challenging visibility situations that inhibit passive and active optical sensors such as cameras and LIDAR, so these sensors may fail to recognize objects. TI mmWave sensors, however, maintain strong performance in poor weather and low-visibility conditions.


Imaging radar sensors are currently the only sensors that maintain robust performance in all weather and visibility conditions while achieving an angular resolution of 1 degree in both azimuth and elevation (or even finer when super-resolution algorithms are applied).


Conclusion

Imaging radars with TI mmWave sensors are highly flexible, capable of sensing and classifying objects in the near field at very high resolution while tracking targets in the far field beyond 400 meters. This cost-effective, high-resolution imaging radar system enables Level 2 and Level 3 ADAS applications as well as high-end Level 4 and Level 5 autonomous vehicles, and can serve as the primary sensor in the vehicle.