Innovations in sensing technology that won’t break the bank promise to help robotically piloted vehicles understand their surroundings.
Leland Teschler | Executive Editor
Look at one of the prototype autonomous vehicles cruising the highways and you may see a spinning cylinder perched atop the roof. This cylinder houses a light detection and ranging (lidar) sensor. Initially, lidar sensors were used to generate digital maps for navigation software. Now they are a critical part of the plan for how future generations of autonomous vehicles will sense what’s happening around them to prevent collisions.
Lidar units basically bounce a laser beam off a target and use the return time to measure distance. This time-of-flight (TOF) measurement can resolve the dimensions of objects as far as 300 m away to within a few centimeters, and it can provide this information in milliseconds.
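The arithmetic behind a TOF measurement is simple: the measured round-trip time, multiplied by the speed of light and halved, gives the range. A minimal sketch (the 2-µs round-trip figure is just an illustrative value):

```python
C = 299_792_458.0  # speed of light, m/s

def tof_distance(round_trip_s: float) -> float:
    """Convert a lidar round-trip time to a one-way distance in meters."""
    return C * round_trip_s / 2.0

# A target roughly 300 m away returns the pulse in about 2 microseconds.
print(tof_distance(2.0e-6))  # ~299.8 m
```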
One problem: Lidar units have historically been expensive. For example, consider the Lidar units that sat atop entries in the 2005 Darpa Grand Challenge, the event credited with launching autonomous vehicle technology. Developed by industry pioneer Velodyne LiDAR, their cost was in the $10,000 range largely because they used a precision rotating platform to move their laser beams and detectors across the surroundings.
Clearly, this kind of price point won’t work for mass-produced autonomous vehicles. So today, manufacturers are devising lidar units that are more economical than those guiding the Darpa Challenge cars. Lidar designs targeting mass-produced autonomous vehicles have been announced costing a few hundred dollars or less.
One way manufacturers are reducing lidar cost is by eliminating the need for a precision spinning platform as found in Darpa Challenge lidars. The spinning optical platform approach has the advantage that it can create a 360° field of view (FOV). Lidars employing spinning platforms are still in use but now tend to mainly show up in aerial mapping or in inspecting vegetation as carried out by quadcopter drones.
But many lidar units now on the drawing boards for automotive use don’t employ a 360° FOV. Instead, they look out in one direction with an FOV of perhaps 100° horizontally. To get a 360° coverage area, autonomous cars will likely carry three or four of these limited-view lidar units.
There are several ways of realizing limited-FOV lidar. One technique, sometimes dubbed flash lidar, captures an entire scene at once with a single array of laser diodes. It generally uses a 2D array of pixels analogous to the image sensor in an ordinary digital camera, but each pixel additionally records the time the laser pulse takes to bounce back to the sensor. So each pixel registers the depth and location, as well as the reflective intensity, of the return it sees. A high-speed processor calculates the physical range of the objects in front of the camera. In lidar parlance, the resulting information is called a 3D point-cloud frame, and it is generated at video rates, generally up to 60 frames/sec.
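As a rough illustration of how a flash-lidar frame becomes a point cloud, the sketch below converts a toy grid of per-pixel round-trip times into (x, y, z, intensity) points. The FOV figures, grid size, and sample values are invented for illustration, not taken from any particular sensor:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def frame_to_point_cloud(tof_frame, intensities, h_fov_deg=100.0, v_fov_deg=25.0):
    """Turn a 2D grid of per-pixel round-trip times into (x, y, z, intensity) points.

    tof_frame: list of rows, each a list of round-trip times in seconds
               (None where the pixel detected no return).
    """
    rows, cols = len(tof_frame), len(tof_frame[0])
    points = []
    for r in range(rows):
        for col in range(cols):
            t = tof_frame[r][col]
            if t is None:
                continue  # no return for this pixel
            rng = C * t / 2.0  # one-way range, m
            # Map pixel indices to viewing angles across the FOV.
            az = math.radians((col / (cols - 1) - 0.5) * h_fov_deg)
            el = math.radians((r / (rows - 1) - 0.5) * v_fov_deg)
            x = rng * math.cos(el) * math.sin(az)
            y = rng * math.cos(el) * math.cos(az)
            z = rng * math.sin(el)
            points.append((x, y, z, intensities[r][col]))
    return points

# A 2x2 frame: three returns near 150 m, one pixel with no return.
tof = [[1.0e-6, None], [1.0e-6, 1.0e-6]]
inten = [[0.8, 0.0], [0.5, 0.9]]
cloud = frame_to_point_cloud(tof, inten)
print(len(cloud))  # 3 points
```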
Another technique for generating limited-FOV lidar uses a MEMS micromirror to steer a laser beam across a scene in a raster pattern. An example of such a system is the one devised by LeddarTech. It uses a pulsed laser diode whose beam bounces off a MEMS micromirror that oscillates rapidly on a single axis at a time. The micromirror sends the beam to a diffuser lens, which doubles the angle of orientation of the beam and diffuses the laser pulse so it hits targets across the vertical FOV. A photodiode array detects the backscattered light coming from the targets. Meanwhile, the laser diode pulses are synchronized with the movement of the micromirror in a way that scans the horizontal FOV in multiple lines, raster style. The detector array segments each vertical signal into multiple individual measurements to build a 3D matrix representing the targets in the FOV.
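The synchronization described above can be sketched in a few lines: if the mirror’s mechanical angle is a known function of time, firing the laser at a fixed pulse rate and sampling that function at each pulse gives the beam direction for every measurement. The oscillation frequency, pulse rate, and angle limits below are invented for illustration:

```python
import math

def mirror_angle(t, freq_hz=1000.0, max_deg=25.0):
    """Mechanical angle of a resonant MEMS mirror oscillating on one axis."""
    return max_deg * math.sin(2.0 * math.pi * freq_hz * t)

def scan_angles(pulse_rate_hz=100_000.0, n_pulses=100):
    """Beam angles for a train of laser pulses synchronized to the mirror.

    The optics double the optical angle relative to the mirror's
    mechanical angle, as the article describes for the diffuser lens.
    """
    return [2.0 * mirror_angle(i / pulse_rate_hz) for i in range(n_pulses)]

angles = scan_angles()
print(max(angles), min(angles))  # sweeps roughly +50 to -50 degrees
```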
GM buys a lidar maker
Unfortunately, the time-of-flight measurements implemented by traditional lidar units have their share of issues. For example, simple TOF measurements are prone to interference from other signal sources, and the interference worsens with distance because of the weaker signals involved. And the useful range of TOF lidars depends on how well they detect the relatively faint reflected signals. Making the lidar photodetectors more sensitive also makes them more susceptible to interfering signals.
Difficulties inherent in ordinary lidar may be one reason General Motors recently acquired Strobe Inc., a small California startup developing a sub-$100 solid-state lidar for self-driving cars. Strobe’s approach to lidar differs from that of other manufacturers. It produces brief chirps of frequency-modulated (FM) laser light in the style of chirped radar, where the frequency within each chirp varies linearly. Detectors measure the phase and frequency of the echoing chirp. This gives information not only about the distance of targets but also their relative velocity. Moreover, the returns are said to be less susceptible to interference (because interfering signals are generally not modulated) and can be detected with photodetectors that needn’t be super sensitive.
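The standard FMCW relations show how a chirped return can encode both quantities: mixing the echo with the outgoing chirp produces a beat frequency proportional to the round-trip delay (hence range), while target motion adds a Doppler shift that moves the up-chirp and down-chirp beats in opposite directions. The sketch below uses invented parameters, not Strobe’s actual figures:

```python
C = 299_792_458.0       # speed of light, m/s
WAVELENGTH = 1.55e-6    # assumed laser wavelength, m
BANDWIDTH = 1.0e9       # chirp bandwidth, Hz (illustrative)
CHIRP_TIME = 10.0e-6    # chirp duration, s (illustrative)
SLOPE = BANDWIDTH / CHIRP_TIME  # chirp slope, Hz/s

def beat_frequencies(range_m, velocity_mps):
    """Beat frequencies on the up- and down-chirps of a triangular FMCW waveform."""
    f_range = SLOPE * (2.0 * range_m / C)        # delay-induced beat
    f_doppler = 2.0 * velocity_mps / WAVELENGTH  # Doppler shift (closing target)
    return f_range - f_doppler, f_range + f_doppler

def solve_range_velocity(f_up, f_down):
    """Recover range and relative velocity from the two measured beats."""
    f_range = (f_up + f_down) / 2.0
    f_doppler = (f_down - f_up) / 2.0
    return C * f_range / (2.0 * SLOPE), f_doppler * WAVELENGTH / 2.0

f_up, f_down = beat_frequencies(100.0, 20.0)  # 100-m target closing at 20 m/s
print(solve_range_velocity(f_up, f_down))     # recovers ~ (100.0, 20.0)
```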
The idea of FM chirp-based lidar isn’t new, but its performance has always depended on several competing factors: the linewidth of the emitting laser, the range of frequencies within the chirp, the linearity of the frequency change during each chirp, and the reproducibility of individual chirps. Improving one of these factors tends to make the others worse. Moreover, FM lidar systems developed to date have generally relied on relatively large laser sources and on a carefully modulated, low-noise local oscillator, with the FM provided by a relatively large interferometer. All in all, these setups have been complicated and bulky.
The Strobe lidar gets around the bulkiness problem by using a technique devised by another company, Oewaves Inc., founded by one of Strobe’s principals. The technique employs a “whispering gallery mode” optical resonator (i.e., a resonant optical cavity) that reduces the laser’s linewidth via optical feedback. The “whispering gallery” refers to a type of wave that can travel around a concave surface. Though the idea originated with sound waves in cathedrals, it applies equally to light waves circulating with little attenuation inside tiny glass spheres or toruses.
You can get a general sense of how the Strobe lidar works by reviewing the Oewaves patent. As the patent describes, light from the laser couples into the whispering gallery mode optical resonator and then couples back out as a returning counterpropagating wave that has a frequency equal to that of the optical resonator’s standing wave frequency. This returning wave gets injected into the laser and has the effect of locking it to the resonator frequency. It also reduces variations in the amplitude of the laser light (relative intensity noise, or RIN) which can degrade FM lidar performance.
So far so good. What’s noteworthy is that the technique also provides a way of frequency-modulating the optical properties of the whispering gallery mode resonator, and it is this modulation that yields the highly linear and reproducible optical chirps the lidar system needs.
As is often the case with patent applications, several of the technical details in Oewaves’ lidar scheme are only vaguely described. For example, regarding the FM technique, it only says that a transducer (via electrodes, resistive heater, and/or piezoelectric device) can alter an optical property (for example, the refractive index) of the whispering gallery mode optical resonator. In any event, Oewaves says all these components, even the spherical or torus-shaped resonator, can reside on a single substrate.
The patent also says the linewidth of the optical-injection-locked laser can be less than 100 Hz in some cases. This is important because the narrow linewidth helps maintain light-frequency reproducibility from one chirp to the next. Such a laser source can provide linear chirps with large bandwidths of 15 GHz or more, which can make for lidar able to resolve distances down to less than a centimeter in some cases.
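That resolution figure follows from the standard FMCW relation between chirp bandwidth B and range resolution, ΔR = c/2B. Plugging in the 15 GHz cited above:

```python
C = 299_792_458.0  # speed of light, m/s

def range_resolution(bandwidth_hz: float) -> float:
    """Theoretical FMCW range resolution for a given chirp bandwidth."""
    return C / (2.0 * bandwidth_hz)

print(range_resolution(15e9))  # ~0.01 m, i.e. about 1 cm
```

Bandwidths beyond 15 GHz push the theoretical resolution below a centimeter, consistent with the patent’s claim.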
It looks as though we won’t have long to wait to see lidar with these capabilities. Strobe has said it expects to produce its first commercial product next spring.
Marta Hall, President, Velodyne Lidar Inc., says
Hello!
The paragraph quoted in your article is faulty. The spinning approach to Lidar used in cars today, used in Google cars for 8 years and by Caterpillar for 8 years, will continue far into the future. It is the Lidar of choice for nearly all test fleets for autonomous driving today. The “low cost” TOF flash and other Lidars mentioned in your article have issues to solve before they will SAFELY be used for autonomy. Their outputs must be seamlessly stitched together, a process that introduces errors and distortion. What is the cost of that processing? What is its accuracy given the stitching issues? What is the cost of computation, and do you need a computer the size of your car trunk to handle it? Has it been tested for accuracy? Has it been tested for range? Has even one been on the road to drive a car autonomously?
Compared to spinning Lidar, which is shipping to customers, these unproven concepts are a hope and a dream to make money. The spinning Lidar is patented; otherwise makers of these cheap Lidars would spin too. The “cheap” Lidar may prove to be very expensive and not nearly as good. Also, spinning Lidar is coming down in price to meet demand not just for autonomy but for advanced safety (ADAS) solutions. If you investigate the subject, you will see spinning Lidar showing up in all autonomous cars. Other Lidar solutions for driving are far from proved out. They may be suitable for parking assist.
“One way manufacturers are reducing lidar cost is by eliminating the need for a precision spinning platform as found in Darpa Challenge lidars. The spinning optical platform approach has the advantage that it can create a 360° field of view (FOV). Lidars employing spinning platforms are still in use but now tend to mainly show up in aerial mapping or in inspecting vegetation as carried out by quadcopter drones.”