The promise of a fully autonomous tomorrow no longer seems like a pipe dream. Today, questions surrounding autonomy focus on the underlying technology and the advances needed to make it a reality. Light detection and ranging (LIDAR) has become one of the most discussed technologies supporting the shift toward autonomous applications, but many questions remain. LIDAR systems with ranges greater than 100 m and 0.1° angular resolution continue to dominate the headlines for autonomous driving.
However, not all autonomous applications require this level of performance; valet parking assistance and street sweeping are two such examples. Many depth-sensing technologies can support these applications, including radio detection and ranging (radar), stereo vision, ultrasonic detection and ranging, and LIDAR, and each offers a unique trade-off between performance, form factor, and cost. Ultrasonic sensors are the most affordable but are limited in range, resolution, and reliability. Radar has greatly improved in range and reliability but has limited angular resolution, while stereo vision can carry significant computational overhead and suffers accuracy limitations if not properly calibrated. Thoughtful LIDAR system design bridges these gaps with accurate depth sensing, fine angular resolution, and low-complexity processing, even at long range. LIDAR systems are often perceived as bulky and costly, but this is not necessarily the case.
LIDAR system design begins with identifying the smallest object the system needs to detect, that object's reflectivity, and the distance at which it must be detected. These factors define the required angular resolution. From there, the minimum signal-to-noise ratio (SNR) needed to meet the detection criterion, that is, the acceptable true/false positive and negative detection rates, can be calculated.
Understanding the perception context and the amount of information required to make the appropriate design trade-offs allows an optimal solution to be developed relative to cost and performance. For example, consider a self-driving car traveling on a road at 100 km/h (~62 mph) versus an autonomous robot moving around a pedestrian space or warehouse at 6 km/h. In the high-speed case, there is not only the vehicle traveling at 100 km/h to consider, but also another vehicle traveling in the opposite direction at the same speed. To the perception system, this is an object approaching at a relative speed of 200 km/h. For a LIDAR sensor with a maximum detection range of 200 m, that gap closes by more than a quarter (roughly 55 m) in just one second. The speed of the vehicle (or the nonlinear closing speed of the object), the stopping distance, and the dynamics of an evasive maneuver are complexities unique to each situation, but in general, higher-speed applications require longer-range LIDAR systems.
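As a quick sanity check on that closing-speed arithmetic, the short Python sketch below uses only the numbers from this scenario:

```python
# Closing-speed check for the head-on scenario described above.
SPEED_KMH = 100.0                   # each vehicle's speed
RELATIVE_KMH = 2 * SPEED_KMH        # head-on closing speed: 200 km/h
MAX_RANGE_M = 200.0                 # maximum LIDAR detection range

relative_ms = RELATIVE_KMH / 3.6    # ~55.6 m/s
closed_in_1s_m = relative_ms * 1.0  # distance closed in one second
fraction = closed_in_1s_m / MAX_RANGE_M

print(f"closing speed: {relative_ms:.1f} m/s")
print(f"gap closed in 1 s: {closed_in_1s_m:.1f} m ({fraction:.0%} of maximum range)")
```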
Resolution is another important characteristic in LIDAR system design. Fine angular resolution enables a LIDAR system to receive return signals in multiple pixels from a single object. As shown in Figure 1, an angular resolution of 1° translates to pixels 3.5 m on a side at a range of 200 m. Pixels of this size are larger than many of the objects that need to be detected, which presents several challenges. First, spatial averaging is often used to improve SNR and detectability, but with only one pixel per object this is not an option. Second, even if the object is detected, its size cannot be estimated: a piece of road debris, an animal, a traffic sign, and a motorcycle are all typically smaller than 3.5 m, and since most cars are wider than they are tall, such a system may be unable to distinguish a car from a motorcycle. In contrast, a system with 0.1° angular resolution has pixels 10 times smaller and should register roughly five adjacent returns across the width of an average car at 200 m, enabling both detection and a rough estimate of the object's size.
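These footprint figures follow from simple geometry. The minimal sketch below reproduces them; the 1.8 m car width is an assumed typical value, not a number taken from any particular system:

```python
import math

def pixel_footprint_m(angular_resolution_deg: float, range_m: float) -> float:
    """Approximate side length of one pixel's footprint at a given range."""
    return 2 * range_m * math.tan(math.radians(angular_resolution_deg) / 2)

CAR_WIDTH_M = 1.8  # assumed width of an average car

for res_deg in (1.0, 0.1):
    footprint = pixel_footprint_m(res_deg, 200.0)
    returns_across_car = CAR_WIDTH_M / footprint
    print(f"{res_deg}° at 200 m -> {footprint:.2f} m per pixel, "
          f"~{returns_across_car:.1f} returns across a {CAR_WIDTH_M} m wide car")
```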
Detecting whether an object is safe to drive over requires finer resolution in elevation than in azimuth. Now imagine how different the requirements might be for an autonomous vacuum robot, since it moves slowly and needs to detect narrow, tall objects, such as table legs.
Once the operating range and speed are defined and the resulting performance requirements are established, the LIDAR system architecture can be chosen (see the example LIDAR system in Figure 2). There are several options, such as scanning versus flash and direct time of flight (ToF) versus waveform digitization, but their trade-offs are beyond the scope of this article.
(Figure 1. A LIDAR system with 32 vertical channels scans the environment horizontally with an angular resolution of 1°.)
(Figure 2. Discrete components of a LIDAR system.)
(Figure 3. Analog Devices AD-FMCLIDAR1-EBZ LIDAR development solution system architecture.)
Range, or depth, accuracy is related to the ADC sampling rate. Range accuracy enables the system to know an object's distance precisely, which is critical in use cases that require close-range maneuvering, such as parking or warehouse logistics. In addition, the change in range over time can be used to calculate velocity, and this generally requires even higher range accuracy. With a simple thresholding algorithm and direct ToF, the achievable range accuracy for a 1 ns sampling period (i.e., a 1 GSPS ADC) is 15 cm, calculated as c × dt/2, where c is the speed of light and dt is the ADC sampling period. However, because the waveform is digitized by an ADC, more sophisticated techniques such as interpolation can be used to improve range accuracy; as an estimate, accuracy can be improved by roughly the square root of the SNR. One of the highest-performance approaches is a matched filter, which maximizes the SNR, followed by interpolation to produce the best range accuracy.
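To make the arithmetic concrete, the sketch below computes the raw single-sample range accuracy for a 1 GSPS ADC and then applies a matched filter with parabolic interpolation to a synthetic return. The Gaussian pulse shape, the SNR figure, the delay, and the noise level are illustrative assumptions, and NumPy correlation stands in for whatever pulse processing a production system actually uses:

```python
import numpy as np

C = 3.0e8    # speed of light, m/s
DT = 1.0e-9  # ADC sampling period for a 1 GSPS ADC, s

# Simple thresholding / direct ToF: one-sample range quantization
raw_accuracy_m = C * DT / 2
print(f"raw range accuracy: {raw_accuracy_m * 100:.0f} cm")  # 15 cm

# Rule of thumb from the text: interpolation improves accuracy by ~sqrt(SNR)
snr = 100.0  # illustrative assumption, not a measured figure
print(f"interpolated accuracy: ~{raw_accuracy_m / np.sqrt(snr) * 100:.1f} cm")

# Minimal matched-filter + parabolic-interpolation sketch on a synthetic return.
rng = np.random.default_rng(0)
pulse = np.exp(-0.5 * ((np.arange(16) - 8) / 2.0) ** 2)  # assumed template, peak at sample 8
true_delay = 123.4                                       # assumed fractional sample delay
t = np.arange(512, dtype=float)
rx = np.exp(-0.5 * ((t - true_delay) / 2.0) ** 2)        # received echo
rx += rng.normal(0.0, 0.05, rx.shape)                    # additive noise (assumed level)

corr = np.correlate(rx, pulse, mode="valid")             # matched filter output
k = int(np.argmax(corr))
# Parabolic interpolation around the correlation peak for sub-sample timing
delta = 0.5 * (corr[k - 1] - corr[k + 1]) / (corr[k - 1] - 2 * corr[k] + corr[k + 1])
est_delay = k + delta + 8                                # add back the template peak offset

print(f"estimated range: {est_delay * raw_accuracy_m:.3f} m "
      f"(true: {true_delay * raw_accuracy_m:.3f} m)")
```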
The AD-FMCLIDAR1-EBZ is a high-performance LIDAR prototyping platform: a 905 nm pulsed, direct ToF LIDAR development kit. The system enables rapid prototyping for robots, drones, agricultural and construction equipment, and ADAS/AV in a 1D static flash configuration. The components selected for this reference design target long-range pulsed LIDAR applications. The 905 nm laser source is driven by the ADP3634, a high-speed, dual 4 A MOSFET driver. The design also includes a First Sensor 16-channel APD array biased by the LT8331 programmable power supply, which generates the APD supply voltage. There are multiple LTC6561 low-noise, high-bandwidth, four-channel transimpedance amplifiers (TIAs) and the AD9094, a 1 GSPS, 8-bit ADC that consumes only 435 mW per channel. There will continue to be a need for greater bandwidth and higher sampling rates, which help increase the overall system frame rate and improve range accuracy. At the same time, minimizing power consumption is important because lower heat dissipation simplifies the thermal and mechanical design and allows smaller form factors.
Another tool that aids in LIDAR system design is the EVAL-ADAL6110-16, a highly configurable evaluation system that provides a simplified yet configurable 2D flash LIDAR depth sensor for applications requiring real-time (65 Hz) object detection/tracking, such as collision avoidance, altitude monitoring, and soft landing.
The optics used in the reference design provide a field of view (FOV) of 37° (azimuth) by 5.7° (elevation). With a 16-pixel linear array in azimuth, the pixel footprint at 20 m, 0.8 m (azimuth) by 2 m (elevation), is comparable in size to an average adult. As mentioned previously, different applications may require different optical configurations. If the existing optics do not meet the application's requirements, the printed circuit board can easily be removed from the housing and integrated into a new optical assembly.
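The quoted pixel footprint follows from dividing the azimuth FOV across the 16 elements, as the short sketch below shows (all numbers are the ones quoted in this section):

```python
import math

FOV_AZ_DEG, FOV_EL_DEG = 37.0, 5.7   # reference design field of view
N_PIXELS_AZ = 16                     # linear photodiode array, azimuth only

def footprint_m(angle_deg: float, range_m: float) -> float:
    """Linear extent subtended by an angle at a given range."""
    return 2 * range_m * math.tan(math.radians(angle_deg) / 2)

range_m = 20.0
az_per_pixel_deg = FOV_AZ_DEG / N_PIXELS_AZ   # ~2.3° per pixel
print(f"pixel footprint at {range_m:.0f} m: "
      f"{footprint_m(az_per_pixel_deg, range_m):.2f} m (azimuth) x "
      f"{footprint_m(FOV_EL_DEG, range_m):.2f} m (elevation)")
```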
The evaluation system is built around ADI's ADAL6110-16, a low-power, 16-channel, integrated LIDAR signal processor (LSP). The device provides the timing control to illuminate the region of interest, the timing to sample the received waveform, and the ability to digitize the captured waveform. The ADAL6110-16's integration of the sensitive analog nodes reduces the noise floor, allowing the system to capture very low-level signal returns; implementing the same signal chain with discrete components of similar design parameters would let rms noise dominate the design. In addition, the integrated signal chain reduces the size, weight, and power consumption of the LIDAR system design.
The system software enables fast bring-up, so users can quickly begin taking measurements and using them, for example in an odometry system. The system is completely standalone, powered by a single 5 V supply over USB, and can be easily integrated into autonomous systems through the provided Robot Operating System (ROS) drivers. Users simply create a connector for the header that interfaces with their robot or vehicle and communicate over one of the four available interfaces: SPI, USB, CAN, or RS-232.
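As a sketch of how such a ROS integration might look on the consuming side, the minimal ROS 1 node below subscribes to a point cloud topic; the topic name /adal6110_16/point_cloud is a placeholder rather than the name the supplied driver necessarily publishes:

```python
#!/usr/bin/env python
# Minimal ROS 1 subscriber sketch. The topic name below is a placeholder and
# should be replaced with whatever the provided EVAL-ADAL6110-16 driver publishes.
import rospy
from sensor_msgs.msg import PointCloud2
from sensor_msgs import point_cloud2

def callback(msg):
    # Count the returns in each frame; a real node would feed them into
    # obstacle detection or odometry rather than just logging the count.
    points = list(point_cloud2.read_points(msg, field_names=("x", "y", "z"), skip_nans=True))
    rospy.loginfo("received %d returns", len(points))

if __name__ == "__main__":
    rospy.init_node("lidar_listener")
    rospy.Subscriber("/adal6110_16/point_cloud", PointCloud2, callback)
    rospy.spin()
```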
The reference design can also be modified for different receiver and transmitter technologies. As shown in Figure 5 through Figure 7, the receiver of the EVAL-ADAL6110-16 can be changed to create different configurations. As shipped, the EVAL-ADAL6110-16 features a Hamamatsu S8558 16-element photodiode array; the pixel sizes at different distances shown in Table 1 are based on its effective pixel size of 0.8 mm × 2 mm and a 20 mm focal-length lens. If, for example, the same board were redesigned with individual photodiodes such as the Osram SFH-2701, each with an active area of 0.6 mm × 0.6 mm, the per-pixel FOV would change and the pixel size at a given range would be very different.
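One plausible way to reproduce Table 1-style numbers is the simple lens geometry below: the per-pixel field of view is roughly the detector element size divided by the focal length, so swapping detectors changes the footprint directly. The element dimensions are those quoted above; treating the optics as an ideal thin lens is an assumption:

```python
import math

FOCAL_LENGTH_MM = 20.0  # lens focal length from the reference design

def pixel_fov_deg(element_mm: float, focal_mm: float = FOCAL_LENGTH_MM) -> float:
    """Per-element field of view for an ideal lens of the given focal length."""
    return math.degrees(2 * math.atan(element_mm / (2 * focal_mm)))

def footprint_m(fov_deg: float, range_m: float) -> float:
    """Linear extent subtended by an angle at a given range."""
    return 2 * range_m * math.tan(math.radians(fov_deg) / 2)

# Hamamatsu S8558 element (0.8 mm x 2 mm) vs. Osram SFH-2701 (0.6 mm x 0.6 mm)
for name, (w_mm, h_mm) in {"S8558": (0.8, 2.0), "SFH-2701": (0.6, 0.6)}.items():
    fov_az, fov_el = pixel_fov_deg(w_mm), pixel_fov_deg(h_mm)
    print(f"{name}: {fov_az:.2f}° x {fov_el:.2f}° per pixel -> "
          f"{footprint_m(fov_az, 20.0):.2f} m x {footprint_m(fov_el, 20.0):.2f} m at 20 m")
```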