LiDAR for Autonomous System Design

Publisher: 少年不识愁滋味 | Last updated: 2023-05-25 | Source: elecfans

The promise of a fully autonomous tomorrow no longer seems like a pipe dream. Today, questions surrounding autonomy focus on the underlying technology and the advances needed to make autonomy a reality. Light Detection and Ranging (LiDAR) has become one of the most discussed technologies supporting the shift toward autonomous applications, but many questions remain. LiDAR systems with ranges greater than 100 m and 0.1° angular resolution continue to dominate the headlines in autonomous driving technology.


However, not all autonomous applications require this level of performance. Valet parking assistance and street sweeping are two such examples. Many depth sensing technologies can support these applications, including radio detection and ranging (radar), stereo vision, ultrasonic detection and ranging, and LiDAR, but each sensor offers unique trade-offs between performance, form factor, and cost. Ultrasonic devices are the most affordable but are limited in range, resolution, and reliability. Radar has greatly improved in range and reliability, but it has angular resolution limitations, while stereo vision can carry significant computational overhead and suffer accuracy limitations if not properly calibrated. Thoughtful LiDAR system design helps bridge these gaps with accurate depth sensing, fine angular resolution, and low-complexity processing, even at long ranges. LiDAR systems are often perceived as bulky and costly, but this is not necessarily the case.


LiDAR system design begins with identifying the smallest object the system needs to detect, the reflectivity of that object, and the distance at which that object may appear. This defines the angular resolution of the system. From this, the minimum required signal-to-noise ratio (SNR) can be calculated: the detection criterion needed to achieve the required true and false positive/negative rates when deciding whether an object is present.


Understanding the perception context and the amount of information required to make the appropriate design trade-offs allows for the development of an optimal solution relative to cost and performance. For example, consider a self-driving car traveling on a road at 100 km/h (~62 mph) versus an autonomous robot driving around a pedestrian space or warehouse at 6 km/h. In the high-speed case, not only is there a vehicle traveling at 100 km/h to consider, but there may also be another vehicle traveling in the opposite direction at the same speed. For the perception system, this is equivalent to an object approaching at a relative speed of 200 km/h. For a LiDAR sensor that detects objects at a maximum distance of 200 m, the vehicles will close more than 25% of the distance between them in just one second. It should be emphasized that the speed of the vehicle (or the nonlinear closing speed of the object), the stopping distance, and the dynamics involved in performing an evasive maneuver are complexities unique to each situation. In general, higher-speed applications require longer-range LiDAR systems.
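The closing-speed arithmetic above can be sketched in a few lines. This is a minimal illustration of the article's numbers, not part of any ADI reference design; the function name is my own.

```python
def closing_fraction(range_m: float, rel_speed_kmh: float, dt_s: float = 1.0) -> float:
    """Fraction of the sensing range closed in dt_s seconds at a given relative speed."""
    rel_speed_ms = rel_speed_kmh / 3.6  # km/h -> m/s
    return rel_speed_ms * dt_s / range_m

# Two cars at 100 km/h each, head-on: 200 km/h relative over a 200 m sensing range.
print(f"{closing_fraction(200.0, 200.0):.0%}")  # ~28% of the gap closed in one second
```

At 200 km/h relative speed the gap shrinks by roughly 56 m every second, which is why high-speed scenarios push toward maximum-range sensing.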


Resolution is another important characteristic in LiDAR system design. Fine angular resolution enables a LiDAR system to receive multiple pixels of return signals from a single object. As shown in Figure 1, an angular resolution of 1° translates to pixels 3.5 m on a side at a range of 200 m. Pixels of this size are larger than many objects that need to be detected, which presents several challenges. First, spatial averaging is often used to improve SNR and detectability, but with only one pixel per object, this is not an option. Second, even if an object is detected, its size cannot be estimated: a piece of road debris, an animal, a traffic sign, and a motorcycle are all typically smaller than 3.5 m. In contrast, a system with 0.1° angular resolution has pixels ten times smaller and should register roughly five adjacent returns across the width of an average car at 200 m. Since most cars are wider than motorcycles, this makes it possible to distinguish between the two, which a 1° system cannot do.
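The pixel sizes quoted above follow from simple trigonometry. Below is a hedged sketch of that calculation (function name is illustrative, not from the article):

```python
import math

def pixel_footprint_m(range_m: float, ang_res_deg: float) -> float:
    """Linear size subtended by one angular-resolution cell at a given range."""
    return 2 * range_m * math.tan(math.radians(ang_res_deg) / 2)

print(round(pixel_footprint_m(200, 1.0), 2))  # ~3.49 m per side at 1° and 200 m
print(round(pixel_footprint_m(200, 0.1), 2))  # ~0.35 m at 0.1°, ten times smaller
```

A 1.8 m-wide car thus spans about five 0.35 m pixels in azimuth at 200 m, matching the "five adjacent returns" figure in the text.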


Detecting whether an object is safe to drive over requires finer resolution in elevation than in azimuth. Now imagine how different the requirements might be for an autonomous vacuum robot, since it moves slowly and needs to detect narrow, tall objects, such as table legs.


Once the driving distance and speed are defined, and the goals and subsequent performance requirements are determined, the architecture of the LiDAR system can be chosen (see the example LiDAR system in Figure 2). There are several options, such as scanning versus flash, or direct time-of-flight (ToF) versus waveform digitization, but their trade-offs are beyond the scope of this article.

(Figure 1. A lidar system with 32 vertical channels scans the environment horizontally with an angular resolution of 1°.)

(Figure 2. Discrete components of a lidar system.)

(Figure 3. Analog Devices AD-FMCLIDAR1-EBZ LIDAR development solution system architecture.)

Range or depth accuracy is related to the ADC sampling rate. Range accuracy enables the system to accurately know the distance of an object, which is critical in use cases that require close range movement, such as parking or warehouse logistics. In addition, the change in range over time can be used to calculate velocity, and this use case generally requires higher range accuracy. Using a simple thresholding algorithm such as direct ToF, the achievable distance accuracy for a 1 ns sampling period (i.e. using a 1 GSPS ADC) is 15 cm. This is calculated as c(dt/2), where c is the speed of light and dt is the ADC sampling period. However, given the inclusion of an ADC, more complex techniques such as interpolation can be used to improve range accuracy. As an estimate, distance accuracy can be improved by roughly the square root of the SNR. One of the highest performance algorithms for processing the data is a matched filter, which maximizes the SNR and then interpolates to produce the best distance accuracy.
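The two rules of thumb above, c·dt/2 for simple thresholding and the roughly √SNR improvement from interpolation, can be checked numerically. This is a sketch of the article's estimates; the SNR value used is an arbitrary example, and real matched-filter gains depend on the pulse shape.

```python
C = 299_792_458.0  # speed of light, m/s

def tof_range_resolution_m(sample_period_s: float) -> float:
    """Range quantization for direct ToF with simple thresholding: c * dt / 2."""
    return C * sample_period_s / 2

def interpolated_accuracy_m(base_m: float, snr: float) -> float:
    """Rule of thumb from the text: accuracy improves roughly by sqrt(SNR)."""
    return base_m / snr ** 0.5

base = tof_range_resolution_m(1e-9)                   # 1 ns period, i.e. 1 GSPS ADC
print(round(base, 3))                                 # ~0.15 m
print(round(interpolated_accuracy_m(base, 100), 4))   # ~0.015 m at an assumed SNR of 100
```

This is why digitizing the return waveform (rather than simple threshold timing) pays off: the same ADC sampling rate yields centimeter-level accuracy once interpolation or matched filtering is applied.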


The AD-FMCLIDAR1-EBZ is a high-performance LiDAR prototyping platform: a 905 nm pulsed direct ToF LiDAR development kit. The system enables rapid prototyping for robots, drones, agricultural and construction equipment, and ADAS/AV with a 1D static flash configuration. The components selected for this reference design target long-range pulsed LiDAR applications. The system is designed around a 905 nm laser source driven by the ADP3634, a high-speed dual 4 A MOSFET driver. It also includes a First Sensor 16-channel APD array powered by an LT8331 programmable power supply that generates the APD bias voltage. There are multiple LTC6561 4-channel, low noise, high bandwidth TIAs, and the AD9094 1 GSPS, 8-bit ADC, which consumes just 435 mW per channel. There will continue to be a need for increased bandwidth and sampling rate, which can raise the overall system frame rate and improve range accuracy. At the same time, minimizing power consumption is also important, because less heat dissipation simplifies the thermal/mechanical design and allows for smaller form factors.


Another tool that aids in lidar system design is the EVAL-ADAL6110-16, a highly configurable evaluation system that provides a simplified yet configurable 2D flash lidar depth sensor for applications requiring real-time (65 Hz) object detection/tracking, such as collision avoidance, altitude monitoring, and soft landing.


The optics used in the reference design achieve a field of view (FOV) of 37° (azimuth) by 5.7° (elevation). Using a 16-pixel linear array in azimuth, the pixel size at 20 m is comparable to that of an average adult, 0.8 m (azimuth) by 2 m (elevation). As mentioned previously, different applications may require different optical configurations. If the existing optics do not meet the application requirements, the printed circuit board can be easily removed from the housing and integrated into the new optical configuration.


The evaluation system is built around ADI's ADAL6110-16, a low power, 16-channel, integrated LiDAR signal processor (LSP). The device provides timing control for illuminating the region of interest, timing for sampling the received waveform, and the ability to digitize the captured waveform. By integrating the sensitive analog nodes, the ADAL6110-16 reduces the noise floor and allows the system to capture very weak signal returns; implementing the same signal chain with discrete components of similar design parameters would leave the design dominated by rms noise. In addition, the integrated signal chain allows LiDAR system designs to reduce size, weight, and power consumption.


The system software enables fast bring-up so users can take measurements and start exercising the odometry system quickly. The board is completely standalone, powered from a single 5 V supply via USB, and can also be easily integrated into autonomous systems via the provided Robot Operating System (ROS) driver. Users simply create a connector for the header to interface with their robot or vehicle and communicate via one of the four available protocols: SPI, USB, CAN, or RS-232. The reference design can also be modified for different receiver and transmitter technologies.


As mentioned previously, the receiver technology of the EVAL-ADAL6110-16 reference design can be modified to create different configurations, as shown in Figure 5 through Figure 7. The EVAL-ADAL6110-16 features a Hamamatsu S8558 16-element photodiode array. The pixel sizes at different distances shown in Table 1 are based on an effective pixel size of 0.8 mm × 2 mm and a 20 mm focal length lens. For example, if the same board were redesigned around individual photodiodes, such as the Osram SFH-2701, each with an active area of 0.6 mm × 0.6 mm, the pixel size at the same range would be very different, because the FOV changes with the detector geometry.
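The relationship between detector geometry, focal length, and FOV follows the pinhole model, FOV = 2·atan(d / 2f). The sketch below reproduces the reference design's elevation FOV from the stated 2 mm pixel height and 20 mm lens; the azimuth estimate assumes the 16 pixels sit edge to edge at a 0.8 mm pitch, which gives a value slightly under the quoted 37° (the difference plausibly comes from inter-pixel gaps or optics details not stated in the article).

```python
import math

def fov_deg(sensor_extent_mm: float, focal_length_mm: float) -> float:
    """Pinhole camera model: FOV = 2 * atan(extent / (2 * f))."""
    return math.degrees(2 * math.atan(sensor_extent_mm / (2 * focal_length_mm)))

# Hamamatsu S8558: 16 pixels at an assumed 0.8 mm pitch -> ~12.8 mm array width.
print(round(fov_deg(16 * 0.8, 20.0), 1))  # ~35.5° azimuth, near the quoted 37°
print(round(fov_deg(2.0, 20.0), 1))       # ~5.7° elevation, matching the article
```

Swapping in a 0.6 mm × 0.6 mm discrete photodiode such as the SFH-2701 shrinks each channel's instantaneous FOV accordingly, which is exactly the effect the paragraph above describes.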

