An Introduction to LIDAR: The Key Self-Driving Car Sensor

We ❤️ LIDAR at Voyage

Oliver Cameron
Voyage
10 min read · May 9, 2017

At Voyage we recently shared the news of Homer, our first self-driving taxi. Homer is outfitted with a whole range of sensors to aid in understanding and navigating the world, key to which is LIDAR (short for light detection and ranging). In this post you’ll learn more about LIDAR, its origins in the self-driving car space, and how it stacks up against other sensors. Enjoy!

Super Powers

LIDAR enables a self-driving car (or any robot) to observe the world with a few special super powers:

  • Continuous 360 degrees of visibility – Imagine if your human eyes allowed you to see in all directions all of the time
  • Insanely accurate depth information – Imagine if, instead of guessing, you could always know the precise distance (to an accuracy of ±2cm) of objects in relation to you

If you’ve seen a self-driving car before, you’ve probably seen a LIDAR sensor. It’s typically the bulky box mounted on the roof that spins continuously, as seen below on Uber and Baidu self-driving cars.

One of the most popular LIDAR sensors on the market is the high-powered Velodyne HDL-64E, as seen below mounted on Homer.

How Does LIDAR Work?

How does a sensor with 360-degree vision and accurate depth information work? Simply put: a LIDAR sensor continually fires off beams of laser light, then measures how long the light takes to return to the sensor.

By firing off millions of beams of light per second, a LIDAR sensor’s measurements enable a visualization of the world that is truly 3D. You can infer the precise position of any object around you (out to around 60m, depending on the sensor).
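To make that concrete, here’s a minimal time-of-flight sketch in Python (a toy illustration, not any particular sensor’s firmware): the sensor clocks the round trip of a laser pulse, and the distance is half the round-trip time multiplied by the speed of light.

```python
# Toy time-of-flight calculation: distance = (speed of light x round trip) / 2.
SPEED_OF_LIGHT_M_S = 299_792_458  # metres per second

def distance_from_round_trip(round_trip_seconds: float) -> float:
    """One-way distance in metres implied by a laser pulse's round trip."""
    return SPEED_OF_LIGHT_M_S * round_trip_seconds / 2

# A pulse that returns after ~400 nanoseconds hit something ~60m away:
print(distance_from_round_trip(400e-9))  # ~59.96
```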

A Brief History of LIDAR

To understand why there’s so much support behind LIDAR today, it’s important to look at other ranging technologies with similar goals.

Sonar

The original depth-sensing robot was the humble bat (50 million years old!). A bat (or a dolphin, among others) achieves some of the same capabilities as LIDAR using echolocation, otherwise known as Sonar (sound navigation and ranging). Instead of measuring light beams like LIDAR, Sonar measures distance using sound waves.

After 50 million years of biological exclusivity, World War I brought the first major deployment of man-made Sonar sensors, with the advent of submarine warfare. Sonar works excellently in water, where sound travels far better than light or radio waves (more on that in a second). Sonar sensors are in active use on cars today, primarily in the form of parking sensors. These short-range (~5m) sensors are a cheap way to know just how far that wall is behind your car. Sonar hasn’t been proven to work at the kinds of ranges a self-driving car demands (60m+).

In this instance, the Bat is the sender/receiver

Radar

Radar (radio detection and ranging), much like Sonar, was another technology developed during an infamous World War (WW2, this time). Instead of using light or sound waves, it utilizes radio waves to measure distance. We make use of a lot of Radar (using Delphi sensors) on Homer, and it’s a tried-and-tested method that can accurately detect and track objects as far as 200m away.

Radar has few downsides. It performs well in extreme weather conditions and is available at an affordable price point. Radar is heavily used not only for detecting objects, but for tracking them too (e.g. understanding how fast a car is going and in which direction). Radar doesn’t necessarily give you the granularity of LIDAR, but Radar and LIDAR are very complementary, and it’s definitely not either/or.
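That speed tracking falls out of the Doppler effect: a target moving toward the radar compresses the reflected wave, and the frequency shift maps directly to closing speed. Here’s a hedged sketch (the 77GHz carrier is typical of automotive radar in general, not a documented spec of Homer’s Delphi units):

```python
# Toy Doppler calculation: closing speed from the measured frequency shift.
SPEED_OF_LIGHT_M_S = 299_792_458  # metres per second

def radial_velocity_m_s(doppler_shift_hz: float, carrier_hz: float = 77e9) -> float:
    """Closing speed implied by a Doppler shift, for a given radar carrier."""
    return doppler_shift_hz * SPEED_OF_LIGHT_M_S / (2 * carrier_hz)

# A 15.4 kHz shift on a 77 GHz carrier means the target closes at ~30 m/s:
print(radial_velocity_m_s(15_400))  # ~30.0
```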

A radar installed on Homer

LIDAR

LIDAR was born in the 1960s, just after the advent of the laser. During the Apollo 15 mission in 1971, astronauts used a laser altimeter to map the surface of the moon, giving the public a first glimpse of what LIDAR could do.

Clementine

Before LIDAR was even considered for automotive and self-driving use, one of its popular use-cases was archaeology. LIDAR provides a ton of value for mapping large swaths of land, and both archaeology and agriculture benefitted tremendously from it.

An aerially-captured LIDAR map

“When Lidar was first used at Angamuco we had no idea how large the area was that included buildings and structures, if it was even a city,” team member Professor Steve Leisz told the BBC. Perhaps more surprisingly, the team also found a ball court for a Mesoamerican game called pok-ta-pok, and pyramids, including one that Fisher had walked within 10m of the previous year. “That was a complete surprise,” said Leisz. — Lidar archaeology shines a light on hidden sites

It wasn’t until the 2000s that LIDAR was first utilized on cars, where it was made famous by Stanley (and later, Junior) in the 2005 DARPA Grand Challenge.

Stanley (left) utilized SICK LIDAR sensors, Junior (right) used Velodyne sensors

Stanley, the winner of the 2005 DARPA Grand Challenge, made use of five SICK LIDAR sensors mounted on the roof, in addition to a military-grade GPS, gyroscopes, accelerometers and a forward-facing camera looking out 80m+. All of this was powered by six 1.6GHz Pentium Linux PCs sitting in the trunk.

The fundamental challenge with the SICK LIDARs (which powered a significant portion of the 2005 challenge vehicles) is that each laser scan is essentially a cut made by a single plane, so you had to be methodical about how you pointed them. Many teams mounted them on tilting stages in order to “sweep” a segment of space, as sketched below. In simple terms: the SICK was a 2D LIDAR (a few beams of light in one direction) vs. the modern 3D LIDARs (tons of beams of light in all directions) we know today.
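Here’s a rough sketch of that tilting-stage trick, assuming a stage that pitches the scan plane between sweeps (all names and angles are illustrative, not Stanley’s actual code): each 2D scan is a fan of (bearing, range) pairs, and rotating the fan by the tilt angle lifts it into 3D.

```python
import numpy as np

def scan_to_3d(bearings_rad, ranges_m, tilt_rad):
    """Lift one planar (bearing, range) scan into 3D given the stage tilt."""
    x = ranges_m * np.cos(bearings_rad)  # forward, within the scan plane
    y = ranges_m * np.sin(bearings_rad)  # left, within the scan plane
    # Rotate the scan plane about the lateral axis by the tilt angle:
    return np.stack([x * np.cos(tilt_rad),
                     y,
                     x * np.sin(tilt_rad)], axis=-1)

# Sweeping the tilt between scans accumulates a full 3D point cloud:
cloud = np.concatenate([
    scan_to_3d(np.linspace(-np.pi/2, np.pi/2, 181), np.full(181, 10.0), tilt)
    for tilt in np.linspace(-0.2, 0.2, 20)
])
print(cloud.shape)  # (3620, 3)
```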

Enter Velodyne

Velodyne has long been the market leader in LIDAR, though it didn’t start out that way. Velodyne began life as an audio company in 1983, specializing in low-frequency sound and subwoofer technology. The subwoofers contained custom sensors, DSPs and custom DSP control algorithms. Velodyne became the LIDAR company we know today around the same time as Stanley’s debut. Velodyne founders David and Bruce Hall first entered the 2004 DARPA competition as Team DAD (Digital Audio Drive). For the second race in 2005, David Hall invented and patented the 3D laser-based real-time system that laid the foundation for Velodyne’s LIDAR products today. By the third DARPA challenge in 2007, the majority of teams used this technology as the basis of their perception systems. David Hall’s invention is now in the Smithsonian as a foundational breakthrough enabling autonomous driving.

Team DAD in 2005

The first Velodyne LIDAR scanner was about 30 inches in diameter and weighed close to 100 pounds. Choosing to commercialize the LIDAR scanner instead of competing in subsequent challenge events, Velodyne was able to dramatically reduce the sensor’s size and weight while also improving performance. Velodyne’s HDL-64E LIDAR sensor was the primary means of terrain map construction and obstacle detection for all the top DARPA Urban Challenge teams in 2007, and was used by five of the six finishing teams, including the winning and second-place teams. Some teams relied exclusively on the LIDAR for the information about the environment used to navigate an autonomous vehicle through a simulated urban environment. — Wikipedia

Traction of LIDAR in Self-Driving Cars

Why did LIDAR take off with self-driving cars? In a word: mapping. LIDAR allows you to generate huge 3D maps (its original application!), within which a car or robot can then navigate predictably. By using a LIDAR to map and navigate an environment, you can know ahead of time the bounds of a lane, or that there is a stop sign or traffic light 500m ahead. This kind of predictability is exactly what a technology like self-driving cars requires, and has been a big reason for the progress over the last 5 years.
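The core of map-building is simpler than it sounds once each scan’s pose is known (from GPS/IMU or scan matching): transform every scan into a common world frame and accumulate the points. A minimal sketch, with made-up function names and without the registration and loop-closure machinery a production pipeline needs:

```python
import numpy as np

def build_map(scans, poses):
    """Accumulate registered scans into one world-frame point cloud.

    scans: list of (N, 3) arrays of points in the sensor frame.
    poses: list of (R, t) pairs, R a 3x3 rotation and t a 3-vector,
           mapping sensor coordinates into the world frame.
    """
    return np.concatenate([pts @ R.T + t for pts, (R, t) in zip(scans, poses)])

def voxel_downsample(points, voxel_m=0.1):
    """Keep one point per voxel so the map stays a manageable size."""
    keys = np.floor(points / voxel_m).astype(np.int64)
    _, first = np.unique(keys, axis=0, return_index=True)
    return points[first]
```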

Object Detection

As LIDARs have become higher-resolution and longer-range, a new use-case has emerged in object detection and tracking. Not only can a LIDAR map tell you precisely where you are in the world and help you navigate it, but the sensor can also detect and track obstacles like cars, pedestrians and, according to Waymo, football helmets.

Modern LIDAR enables you to differentiate between a person on a bike and a person walking, and even to tell at what speed and in which direction they are moving.
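A toy version of that detection-and-tracking loop, assuming an off-the-shelf clusterer (this is a teaching sketch, not Waymo’s or Voyage’s pipeline): group nearby points into obstacles, then difference matched cluster centroids across frames to get speed and direction.

```python
import numpy as np
from sklearn.cluster import DBSCAN  # density-based clustering

def detect_obstacles(points, eps_m=0.5, min_points=10):
    """Cluster a point cloud into obstacles; return each cluster's centroid."""
    labels = DBSCAN(eps=eps_m, min_samples=min_points).fit_predict(points)
    return [points[labels == k].mean(axis=0) for k in set(labels) if k != -1]

def velocity_m_s(centroid_prev, centroid_now, dt_s=0.1):
    """Speed-and-direction vector from two matched detections dt_s apart."""
    return (centroid_now - centroid_prev) / dt_s
```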

A Google car

The combination of amazing navigation, predictability and high-resolution object tracking has meant that LIDAR is the key sensor in self-driving cars today, and it’s hard to see that domination changing. Unless…

Camera-Powered Cars

There are a number of startups out there approaching the problem of self-driving cars using purely cameras (and perhaps radar), with no LIDAR in sight. Tesla is the biggest company of the bunch, and Elon Musk has repeatedly pushed the idea that if humans can perceive and navigate the world using just eyes, ears and a brain, then why can’t a car? I’m certain that this approach will achieve amazing results, especially as other talented teams, including Comma and AutoX, work toward this goal.

It’s important to note that Tesla has an interesting constraint that may have factored into their decision: scale. Tesla hopes to ship 500k cars a year very soon, and can’t wait for LIDAR to come down in cost (or be manufactured in volume) tomorrow; it needed to happen yesterday!

The Future of LIDAR

The industry is marching ahead with a real focus on two fronts: decreasing cost, and increasing resolution and range.

Cost Decrease

Solid-state LIDAR opens up the potential of powerful sub-$1k LIDAR units; today a unit can cost as much as $80k. LeddarTech is one of the leaders in this early market.

Here’s what Velodyne has to say about solid-state:

Solid state, fixed sensors are driven by the idea that you want an embeddable sensor with the smallest size at the lowest possible cost. Naturally, that also means that you have a smaller field of view. Velodyne supports both fixed and surround view sensors. The fixed sensors are miniaturized to be embedded. From a cost standpoint, both contain lenses, lasers and detectors. The lowest cost system is actually via surround view sensors because rotation reuses the lens, lasers and detectors across the field of view, versus using additional sensors each containing individual lenses, lasers and detectors. This reuse is both the most economical, as well as the most powerful, as it reduces the error associated with merging different points of view in real-time — something that really counts when the vehicle is moving at speed.

Resolution and Range Increase

The huge jump in the number of applications for LIDAR has brought with it a flood of talented founders and teams starting companies in the space. Higher resolution output and increased tracking range (200m in some cases) will provide better object recognition and tracking, and are among the key differentiators for sensors from startups like Luminar.

At Voyage, we’ve placed a bet on LIDAR. We love all the benefits that it brings, and believe the ecosystem will take care of bringing down the cost just in time for when we need to scale our autonomous taxi service. If you’re a LIDAR startup and want to test your sensors, we’d love to be one of your first customers. Reach out on our website!

I’ve started a weekly email newsletter covering all things deep learning and self-driving cars: I call it Transmission, sign up today!

