Lidar: The smartest sensor on a Self Driving Car
2014 Detroit Auto Show. Copyright MotorTrend.


Several companies are racing to be the first to bring a fully autonomous self-driving car to market. While their goals are the same, the approaches they are taking differ significantly. Level-3 autonomy (conditional automation) is fairly common now. Level-3 cars still require a safety driver (a human) to be present and vigilant while the car handles functions like steering, accelerating, and braking. The key capability required to move from level-3 to level-5 (full automation) is the ability to monitor the driving environment and react safely to changes, so that the car drives itself more safely than a human would. This is a pretty big deal! The key metric here is safety. Self-driving cars have to be a lot safer than the average human driver. Safety is priority #1.

One key difference is whether lidar sensors are needed at all. Self-driving car prototypes from Google, Uber, Ford, and several others make heavy use of lidar sensors. Tesla, on the other hand, has claimed that lidar doesn't make sense in a car context and that a combination of camera, radar, and sonar sensors would suffice. This is pretty controversial, because if the claim were true, the gen-2 hardware shipping with today's Tesla Model S (and Model X) would be sufficient for Tesla to produce a fully autonomous car, putting Tesla at the forefront. Autonomous capabilities could then be released gradually via over-the-air updates to Autopilot.

In this post

In this post, I'll go over what lidar is, how it works, and how it contributes to a car's self-driving capabilities. I'll also cover the reasons for the debate over whether it is needed at all. Some parts are a bit technical. Don't worry if your high-school physics is a bit fuzzy; I'll keep the treatment lightweight and brief so you can get through those parts quickly without missing much.

What is Lidar?

Lidar stands for Light Detection and Ranging. A lidar device uses a pulsed laser beam to determine the distance and speed of nearby objects. A pulsed laser differs from a normal laser in that it emits short bursts of light rather than a continuous beam. The pulses emitted by the lidar device reflect (and scatter) off objects and return to the device. The return time and the shift in wavelength are used to determine the distance and speed of the object being scanned. If you drive and have ever gotten a speeding ticket, you are likely already aware of the use of radar and laser (lidar) devices by traffic police (see Figures 1(a) and 1(b)).
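To make the ranging idea concrete, here is a minimal Python sketch of the arithmetic: distance from the round-trip time of a pulse, and relative speed from the wavelength shift of the return. The numbers and function names are illustrative, not from any real lidar SDK, and real units combine many such measurements with a lot of filtering.

```python
# Back-of-envelope lidar ranging math (illustrative only).
C = 299_792_458.0  # speed of light in m/s

def distance_from_round_trip(round_trip_time_s: float) -> float:
    """Distance to the target: the pulse travels out and back, so halve the path."""
    return C * round_trip_time_s / 2.0

def speed_from_doppler(emitted_wavelength_m: float, returned_wavelength_m: float) -> float:
    """Relative (radial) speed from the wavelength shift of the returned light.
    Positive means the target is moving away (wavelength stretched)."""
    shift = returned_wavelength_m - emitted_wavelength_m
    return C * shift / emitted_wavelength_m

# A return that arrives 200 nanoseconds after the pulse was fired:
print(distance_from_round_trip(200e-9))                  # ~30 meters
# A 905 nm pulse that comes back very slightly compressed:
print(speed_from_doppler(905e-9, 905e-9 * (1 - 1e-7)))   # ~ -30 m/s (closing)
```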

Capabilities of Lidar sensors

Lidar is very versatile and can be used to observe a wide variety of targets: metallic and non-metallic objects, rocks, rain, and clouds. It can also produce very high resolutions, down to about one foot on the ground from an aircraft miles above it. One of the first uses of lidar was mapping the surface of the moon during the Apollo 15 mission in 1971.

Police lidars operate in the infrared range. Their well-known limitations include interference when the Sun is directly behind the car being scanned, and heavy fog, which renders them useless. When the target isn't moving directly toward the lidar device, the speed estimate needs to be corrected for the angle of travel (the cosine error effect). For example, an object moving in a circle around the lidar device stays at a constant distance, so the lidar reads it as stationary.
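A small sketch of the cosine error, under the simple assumption that the lidar only measures the velocity component along its beam: the raw reading under-reports the true speed by a factor of cos(θ), and at 90 degrees (the circling example above) the reading drops to zero.

```python
import math

def radial_speed(true_speed: float, angle_deg: float) -> float:
    """Speed the lidar actually measures: only the component along the beam."""
    return true_speed * math.cos(math.radians(angle_deg))

def corrected_speed(measured_speed: float, angle_deg: float) -> float:
    """Recover the true speed from the measured one, given the angle between the beam
    and the direction of travel. Unreliable as the angle approaches 90 degrees."""
    return measured_speed / math.cos(math.radians(angle_deg))

print(radial_speed(100.0, 10))   # ~98.5 -> a car doing 100 reads ~98.5 at a 10-degree angle
print(radial_speed(100.0, 90))   # ~0    -> motion exactly across the beam reads as stationary
```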

Figure 1(a). A Texas police officer operating a hand-held lidar gun. (by Loadmaster (David R. Tribble), CC BY-SA 3.0)

Figure 1(b). An Army Sergeant using a radar gun (by Master Sgt. Lek Mateo, Public Domain)

In self-driving cars, lidar sensors are used in conjunction with radar and sonar sensors. The word lidar itself was coined as a combination of light and radar, since lidar is akin to a radar that uses laser light instead of radio waves. Radar stands for RAdio Detection And Ranging; it operates similarly but uses radio waves instead of lasers. Sonar works by emitting sound waves and listening for the echoes; bats and dolphins use sound waves in a similar way for echolocation. Sonar sensors are used for very short-range sensing of objects like the curb of the road, especially while auto-parking.

So, what kinds of Lidars are used in self-driving cars?

Lidar sensors have been part of self-driving designs from the very beginning. In 2007, five of the six teams that finished the DARPA Urban Challenge used lidar sensors, and one entry used lidar alone. Most of them used Velodyne's HDL-64E lidar sensor, which costs around $75,000 (see Figure 2). More recently, Velodyne's spinning HDL-32E has also become quite common, gaining popularity during 2012-2015; it costs around $30,000. Both models use a spinning head that shoots out a bank of lasers with a range of up to 300 feet. The spin produces a 360-degree horizontal view. The HDL-64E uses 64 lasers, while the HDL-32E uses only 32; each laser produces an independent channel of range data. Together, the lasers produce a fairly wide vertical field of view, which for the HDL-64E is about 30 degrees. The cost of the device is roughly proportional to the number of lasers used. These devices operate in the near-infrared spectrum, which is beyond the visual range and so cannot be seen by the human eye. Their power levels are also regulated so they cannot cause eye damage (Class 1 lasers).

Figure 2. The three common lidar sensors made by Velodyne: HDL-64E, HDL-32E, and VLP-16.

How does the sensor data generated by lidar look?

Each spinning lidar sensor produces a 360-degree point cloud surrounding the self-driving car. The figures below (Figure 3) show the 3-D point cloud, comprising 64 scan lines of data, produced by an HDL-64E sensor mounted on top of a car. The HDL-64E produces 2.2 million points per second and has a vertical field of view of about 27 degrees. Sixty-four scan lines are sufficient not only to detect the presence of nearby objects but also to easily discern the shapes of cars, trees, humans, and so on. In both figures, you can see the circular dead zone at the center (it shows up in black). You'll also notice that windows do not reflect or scatter light back to the lidar and as a result are not sensed (they appear black). Like cameras that rely on visible light, lidar data also contains shadows cast by nearby objects (regions of missing data), which results in the occlusion of objects behind them. The range of these lidar systems is about 300 feet, beyond which the returned laser signal is too weak to sense; this produces a horizon beyond which no data is returned. Overall, you can clearly see that you need more than just lidar data to get a complete view of nearby objects.

Figure 3(a). A rendering of the 3-D point cloud generated by the Velodyne HDL-64E lidar system. The "64" in the name refers to the number of spinning lasers, each of which produces a scan line of range data, shown here as contour lines. You can clearly see three parked cars and a barely discernible fourth car [copyright: Velodyne].

Figure 3(b). A rendering of the 3-D point cloud generated by the Velodyne HDL-64E lidar system. In this figure, you can clearly see a human with their arms spread out. The resolution is high enough to make out the contours of their head, torso, arms, and legs [copyright: Velodyne].
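To get a feel for how these scan lines become the 3-D views above, here is a hedged sketch of the usual conversion from a single return (range, rotation angle, laser elevation angle) to an x, y, z point, plus a back-of-envelope point budget per revolution. The 10 Hz spin rate is an assumption about a typical setting, not a figure from this post.

```python
import math

def lidar_return_to_xyz(range_m: float, azimuth_deg: float, elevation_deg: float):
    """Convert one lidar return to Cartesian coordinates in the sensor frame.
    azimuth: rotation angle of the spinning head; elevation: fixed angle of that laser."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    horizontal = range_m * math.cos(el)      # projection onto the ground plane
    return (horizontal * math.cos(az),       # x: forward
            horizontal * math.sin(az),       # y: left
            range_m * math.sin(el))          # z: up

# Back-of-envelope point budget for an HDL-64E-class sensor (assumed 10 Hz spin rate):
points_per_second = 2_200_000
lasers = 64
revolutions_per_second = 10  # assumption, not from this post
points_per_laser_per_rev = points_per_second / lasers / revolutions_per_second
print(points_per_laser_per_rev)        # ~3,400 returns per scan line per revolution
print(360 / points_per_laser_per_rev)  # ~0.1 degree horizontal spacing between returns
```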

In addition to nearby objects that block laser light, fog, rain, and hail also cause lidar blind spots. In these weather conditions, moisture in the air scatters and absorbs the laser light, rendering lidar largely ineffective. Radar does much better in foggy conditions; for example, aircraft use radar to see through even moderate to dense clouds, rain, and hail.

The cost of Lidar sensors

The biggest drawback of lidar sensors is their price. Cost has been the primary driver of the debate over whether lidar is needed to make self-driving cars a reality. Seventy-five thousand dollars for an HDL-64E, or even thirty thousand for an HDL-32E, is too expensive for a mass-produced self-driving car. The longevity of these sensors is also limited by their mechanical components, especially the apparatus that continuously spins them to produce a 360-degree view. Both issues are being actively worked on and are expected to be solved soon.

For example, Velodyne's latest sensor, the VLP-16 Solid State Hybrid Ultra Lidar Puck, has no rotating parts, costs only $8,000, can still sense objects up to 300 feet away, and comes in a much more compact form factor (see Figure 2). The price drop comes mainly from reducing the number of lasers to 16 (from 64 in the HDL-64E). As a result, the number of points generated per second drops to 300k (from 2.2 million for the HDL-64E). Range (300 feet) and vertical field of view (still about 30 degrees) are unaffected. Though it has lower resolution, the roughly 10x drop in price dramatically improves the affordability of self-driving cars. It is cheap enough that some test setups appear to be experimenting with two or four of these pucks.
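As a rough sense of what the price drop costs in resolution, here is a small comparison using only the figures quoted above; the even spacing of scan lines is a simplifying assumption that real units only approximate.

```python
# Rough vertical resolution comparison, using the figures quoted in this post.
sensors = {
    "HDL-64E": {"lasers": 64, "vertical_fov_deg": 27.0},
    "VLP-16":  {"lasers": 16, "vertical_fov_deg": 30.0},
}

for name, s in sensors.items():
    # Spacing between adjacent scan lines, assuming roughly even spacing across the field of view.
    spacing = s["vertical_fov_deg"] / (s["lasers"] - 1)
    print(f"{name}: ~{spacing:.1f} degrees between scan lines")

# At 300 feet, 2 degrees of spacing puts adjacent scan lines roughly 10 feet apart vertically,
# so a distant pedestrian may be crossed by only one or two scan lines.
```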

Who makes Lidar sensors?

While I have focused mostly on Velodyne's sensors in this post, there are other providers of lidar sensors for self-driving cars. Three notable ones are Delphi, ZF, and Infineon. Delphi has partnered with Quanergy to produce low-cost lidar systems; Quanergy already makes a sensor, the Quanergy M8, that has 8 lasers and produces a 360-degree view at 420k points/second. Similarly, ZF has partnered with Ibeo Automotive to add lidar to its suite of products. Ibeo produces its own line of lidar devices; the Ibeo Alasca XT, for example, has 4 lasers and can sense objects as far as 600 feet away.

Figure 4(a). Quanergy's M8 lidar sensor with 8 lasers (copyright: Quanergy).

Figure 4(b). Ibeo Automotive's Alasca XT lidar sensor with 4 lasers (copyright: Ibeo Automotive).

Unlike the Velodyne puck, with its reduced number of scan lines and static construction, Infineon has opted to use a micro-electro-mechanical system (MEMS) from Innoluce in its lidar products. The device consists of a tiny oval mirror, just 3 mm by 4 mm, mounted on a bed of silicon. Actuators driven at the mirror's resonant frequency oscillate it from side to side, sweeping a reflected laser beam across the scene. The mirror-based design conserves power better than a flash-based approach. It has a low scan rate of 5k points/second with a range of 800 feet. However, it is tiny, built as a solid-state chip designed for mass production, and is expected to cost between $100 and $250.
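A loose sketch of how a resonantly driven mirror sweeps the beam: the mirror angle follows a sine wave, so the beam azimuth does too, and returns sampled at a fixed rate bunch up near the edges of the sweep. The resonant frequency and sweep angle below are illustrative assumptions, not Innoluce specifications.

```python
import math

def mems_beam_azimuth(t_s: float, resonant_hz: float = 2_000.0, half_sweep_deg: float = 30.0) -> float:
    """Beam azimuth at time t for a resonantly oscillating MEMS mirror.
    The optical beam deflects by twice the mirror angle; that factor is folded into half_sweep_deg."""
    return half_sweep_deg * math.sin(2 * math.pi * resonant_hz * t_s)

# Sample the beam direction a few times within one oscillation period (0.5 ms at 2 kHz).
for i in range(5):
    t = i * 0.0001
    print(f"t={t * 1000:.1f} ms  azimuth={mems_beam_azimuth(t):+.1f} deg")
```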

Ford and Baidu recently invested $150M in Velodyne to accelerate its roadmap for solid-state lidar devices. With multiple providers and increased investment in improving lidar sensor technology, the future looks bright for self-driving cars that rely on lidar sensors.

Safety Cocoon

As we march toward a self-driving car capable of level-5 autonomy, safety is the #1 concern, and it will remain a top concern for decades after we get there. I expect the number and variety of sensors onboard a self-driving car to only increase with time, while continuous hardware innovation makes them smaller and cheaper. While today the debate may be about whether to include a lidar sensor at all, cars of the future will have multiple miniature lidars, radars, sonars, and digital cameras. Together, they will form a tight "safety cocoon" around the self-driving car, making it a very safe mode of transportation.
