

A few years ago, I was freelancing for a company that asked me if I could help them decide on the sensor suite for their last-mile delivery shuttle. Their question? Camera vs LiDAR vs RADAR: which set of sensors to pick?

At the time (2020), what made the most sense was to combine all 3. "These sensors are complementary," I would reply. "The LiDAR is the most accurate sensor to measure a distance, the camera is mandatory for scene understanding, and the RADAR can see through objects and allows for direct velocity measurement." In most articles, courses, or content related to autonomous tech sensors, the same answer has been given: use all 3 sensors! Except for Elon Musk, who really doesn't want anything other than the camera in a Tesla, the answer has been universally approved.

But the world has changed. In the last few years, these sensors have evolved. If the LiDAR used to be weak in velocity estimation, and if the RADAR used to be very noisy, that's no longer the case. This is mainly due to 2 emerging technologies: the "FMCW LiDAR" or "4D LiDAR", and the "Imaging RADAR" or "4D RADAR".

In this article, I want to describe the evolution of LiDAR and RADAR systems, and share with you some of the implications for the autonomous driving field. Let's begin with the LiDAR:

LiDAR and RADAR sensors before 2020

I have to say that nothing crazy happened in 2020 (at least on this topic), but it's a new decade, so let's use it as a reference.

The LiDAR pre-2020

First, the LiDAR (Light Detection And Ranging). In the 2010 decade, when we talked about LiDARs, what people meant was ToF LiDAR systems (Time Of Flight LiDAR). These are sensors that send a light pulse into the environment and measure the time it takes for the wave to come back. By measuring that round-trip time, we use the "time of flight" principle to measure the distance of any object.
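To make the "time of flight" idea concrete, here is a minimal sketch (mine, not from the article) that converts a measured round-trip time into a distance, assuming the pulse travels at the speed of light:

```python
SPEED_OF_LIGHT = 299_792_458  # m/s

def tof_distance(round_trip_time_s: float) -> float:
    """Distance to the obstacle from a time-of-flight measurement.

    The pulse travels to the obstacle and back, so we divide by 2.
    """
    return SPEED_OF_LIGHT * round_trip_time_s / 2

# Example: a pulse that returns after 200 nanoseconds
print(tof_distance(200e-9))  # ~30 m
```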

Here is an example:

How a ToF LiDAR works

Depending on the number of vertical layers your LiDAR has, you can have a 2D or a 3D LiDAR that creates a point cloud of the environment. Here is a comparison below:

2D vs 3D LiDAR Point Clouds

The advantage of these sensors is the accuracy of the measurements: we have laser-like accuracy. Many Computer Vision systems are indeed trained using ToF LiDAR labels. But the drawback is that if you want to measure a velocity, you need to compute the difference between 2 consecutive timestamps.
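To illustrate that drawback, here is a hedged sketch of how a velocity would have to be estimated from two consecutive ToF range measurements (the function and variable names are mine, not the article's):

```python
def velocity_from_two_frames(d1_m: float, d2_m: float, dt_s: float) -> float:
    """Radial velocity estimated by differencing two consecutive range measurements.

    Any noise in d1 or d2 is amplified by the division by dt, which is why
    ToF LiDARs are comparatively weak at velocity estimation.
    """
    return (d2_m - d1_m) / dt_s

# Example: the object moved from 30.0 m to 29.2 m between two scans 0.1 s apart
print(velocity_from_two_frames(30.0, 29.2, 0.1))  # -8.0 m/s (closing in)
```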

Now, let's see the RADAR (Radio Detection And Ranging):

You probably saw the news about how Tesla removed the RADAR from their cars and now rely entirely on cameras and algorithms such as Occupancy Networks or HydraNets. One of the reasons they gave was that the RADAR's performance was so low that it was negatively affecting the Sensor Fusion algorithm.

Let's begin with the basics: RADAR stands for Radio Detection And Ranging. It works by emitting electromagnetic (EM) waves that reflect when they meet an obstacle. And unlike cameras or LiDARs, it can work under any weather condition, and even see underneath obstacles.
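The "direct velocity measurement" mentioned earlier typically comes from the Doppler effect: the reflected wave comes back frequency-shifted in proportion to the obstacle's radial velocity. Here is a minimal sketch, assuming a monostatic radar and an example 77 GHz carrier frequency (these values are my assumptions, not the article's):

```python
SPEED_OF_LIGHT = 299_792_458  # m/s

def radial_velocity(doppler_shift_hz: float, carrier_hz: float = 77e9) -> float:
    """Radial velocity from the measured Doppler shift of a reflected radar wave.

    For a monostatic radar, f_d = 2 * v_r * f_c / c, hence v_r = f_d * c / (2 * f_c).
    """
    return doppler_shift_hz * SPEED_OF_LIGHT / (2 * carrier_hz)

# Example: a Doppler shift of ~5.1 kHz at 77 GHz corresponds to roughly 10 m/s
print(radial_velocity(5.1e3))  # ~9.93 m/s
```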
