Humans are visual animals: we rely on our eyes to obtain visual information and to judge direction and distance. When your car helps you drive, what are its "eyes"?
The answer is onboard sensors. They continuously collect information about the environment and send it back to the smart car's "brain", the computing platform. There, perception algorithms accurately reproduce the surrounding environment, and decision algorithms then plan the vehicle's path based on that understanding.
Today we will talk about the most common sensors in the perception layer of intelligent driving systems. What are the differences between them? How can they complement each other?
Camera
Cameras are the most common automotive sensors. Installed around the car body, they capture images of the environment from multiple angles, and they have been commercialized and gradually popularized since the 1990s. They are also the sensor closest to the human eye and can obtain rich color and detail information, such as lane lines, signs, and traffic lights. However, their limitations are equally obvious: when dim light or backlighting impairs their "sight", the camera, like the human eye, cannot see clearly and may lose the target.
Backlight when exiting a tunnel may blind the camera
At the same time, the core of visual perception technology is analyzing this high-density information with software algorithms, that is, "identifying" objects and "estimating" their distance. When the system encounters "unknown" objects, such as irregularly shaped obstacles on the road, it may make wrong decisions because it cannot fully and accurately perceive them.
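To illustrate why a camera's distance output is only an "estimate", below is a minimal sketch of monocular ranging with a pinhole camera model. The focal length and the assumed object height are illustrative values chosen for this example, not parameters from any specific system, and the estimate is only as accurate as those priors.

```python
# Minimal sketch: monocular distance estimation with a pinhole camera model.
# All numbers below are illustrative assumptions, not values from the article.

def estimate_distance_m(focal_length_px: float,
                        real_height_m: float,
                        pixel_height_px: float) -> float:
    """Pinhole model: distance ~= focal_length * real_height / pixel_height."""
    return focal_length_px * real_height_m / pixel_height_px

# Example: an object assumed to be 1.5 m tall (a typical car) appears 60 px tall
# in an image from a camera with a 1200 px focal length.
print(estimate_distance_m(1200.0, 1.5, 60.0))  # -> 30.0 m, only as good as the height prior
```

If the height prior is wrong, say the "car" is actually a low trailer, the distance estimate is off in proportion, which is exactly the failure mode described above for unknown, irregular objects.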
For this reason, most camera-based intelligent driving solutions remain at the L2 stage, and many corner cases remain that cannot be solved well enough for L3-and-above autonomous driving functions. This is where other sensors are needed to "complement" the camera.
Millimeter wave radar
Millimeter-wave radar is an active sensor that uses the millimeter-wave band for ranging, detection, tracking, and imaging. It actively emits electromagnetic waves that penetrate smoke and dust and are almost unaffected by light and weather, helping the vehicle perceive surrounding objects in real time and providing relatively accurate distance and speed information.
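As a rough illustration of how a millimeter-wave radar turns its measurements into distance and speed, here is a minimal sketch assuming an FMCW (frequency-modulated continuous wave) design, which is common in automotive radar; the 77 GHz carrier, chirp bandwidth, chirp duration, and measured frequencies are all illustrative assumptions.

```python
# Minimal FMCW radar sketch: convert a measured beat frequency into range and a
# measured Doppler shift into radial speed. All parameters are illustrative.

C = 3.0e8  # speed of light, m/s

def fmcw_range_m(beat_freq_hz: float, chirp_bandwidth_hz: float, chirp_time_s: float) -> float:
    # Range: R = c * f_beat * T_chirp / (2 * B)
    return C * beat_freq_hz * chirp_time_s / (2.0 * chirp_bandwidth_hz)

def doppler_speed_mps(doppler_freq_hz: float, carrier_freq_hz: float) -> float:
    # Radial speed: v = f_d * wavelength / 2, with wavelength = c / f_carrier
    return doppler_freq_hz * (C / carrier_freq_hz) / 2.0

# Example: a 77 GHz radar with a 300 MHz chirp lasting 50 us measures a
# 200 kHz beat frequency and a 5 kHz Doppler shift.
print(fmcw_range_m(200e3, 300e6, 50e-6))  # ~5.0 m to the target
print(doppler_speed_mps(5e3, 77e9))       # ~9.7 m/s closing speed
```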
However, the perception accuracy of millimeter-wave radar is not ideal, and it lacks image-level imaging capability. Because millimeter-wave radar detects and tracks targets via reflection, diffuse reflection, and scattering off the target's surface, detection accuracy drops sharply for low-reflectivity targets such as pedestrians, animals, and bicycles, and static objects on the road may be filtered out as clutter.
In addition, 4D millimeter-wave radar is a type of millimeter-wave radar, not a new category of sensor. Compared with traditional 3D millimeter-wave radar, 4D radar adds height (elevation) information, but its resolution is still far behind that of LiDAR.
The 4D millimeter-wave radars currently on the market output about 1,000 points per frame, while a 128-line LiDAR can output up to hundreds of thousands of points per frame; the data output of the two differs by around two orders of magnitude.
LiDAR
LiDAR is also an active sensor. The most common ranging method, ToF (Time of Flight), determines distance and position by actively emitting laser pulses and measuring how long they take to travel to surrounding objects and back. By emitting millions of laser points per second, LiDAR obtains the three-dimensional position of each return point, clearly presenting the details of pedestrians, zebra crossings, vehicles, trees, and other objects and achieving image-level resolution. The denser the laser points, the higher the resolution and the more completely and clearly the real world can be reconstructed.
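The ToF principle mentioned above boils down to one relation: the measured time covers the round trip to the target and back, so distance = c × t / 2. A minimal sketch, with illustrative round-trip times:

```python
# Minimal sketch of LiDAR time-of-flight ranging: distance = c * t / 2,
# since the measured time covers the round trip. Times below are illustrative.

C = 3.0e8  # speed of light, m/s

def tof_distance_m(round_trip_time_s: float) -> float:
    return C * round_trip_time_s / 2.0

for t_ns in (100, 400, 1000):
    print(f"{t_ns} ns round trip -> {tof_distance_m(t_ns * 1e-9):.1f} m")
# 100 ns -> 15.0 m, 400 ns -> 60.0 m, 1000 ns -> 150.0 m
```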
Because it emits its own light, LiDAR is barely affected by changes in ambient light and can "see precisely" even in complete darkness. In addition, LiDAR directly measures an object's size and distance rather than relying on "guessing" as a camera does, so it is better at detecting small and irregular obstacles and at handling complex scenes such as close-range cut-ins, tunnels, and garages. However, LiDAR performance also degrades to some extent in extreme weather such as heavy rain, snow, and fog.
In summary, the three most common sensors on smart cars, namely cameras, millimeter-wave radars, and LiDARs, each have their own advantages and disadvantages. But when the three sensors are combined, they can play a greater role.
Cameras are passive sensors that can recognize rich color information but are strongly affected by light, with relatively low confidence in low-light environments. Millimeter-wave radar offers high confidence, but its low resolution means it can recognize fewer types of objects and cannot effectively sense pedestrians, bicycles, or smaller objects. LiDAR has excellent all-round strength in ranging capability, confidence, and perceptible object detail; perhaps its biggest drawback is that it is a bit "expensive".
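To make the idea of the sensors "complementing" each other concrete, here is a toy sketch that fuses distance estimates from the three sensors weighted by per-sensor confidence. Real perception stacks use far more sophisticated methods (for example Kalman filtering over tracked objects); the sensor names, weights, and numbers here are illustrative only.

```python
# Toy sketch of confidence-weighted fusion of distance estimates.
# Not a production algorithm; all values are illustrative.

def fuse_distance(measurements: dict) -> float:
    """measurements maps sensor name -> (distance_m, confidence in [0, 1])."""
    total = sum(conf for _, conf in measurements.values())
    return sum(dist * conf for dist, conf in measurements.values()) / total

# At night the camera's confidence drops, while radar and LiDAR keep theirs,
# so the fused estimate leans on the active sensors.
night_scene = {
    "camera": (32.0, 0.2),
    "radar":  (30.5, 0.8),
    "lidar":  (30.1, 0.9),
}
print(f"fused distance: {fuse_distance(night_scene):.1f} m")  # ~30.5 m
```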
There is no doubt, however, that the cost of LiDAR is falling rapidly, and more and more automakers have integrated LiDAR into mass-produced models to improve the safety and comfort of their intelligent driving systems. As chip technology evolves, the number of components in a LiDAR unit has been greatly reduced and automated production lines have become more efficient, which also lowers production costs. The price of LiDAR has dropped from the million-yuan level a few years ago to a few thousand yuan today, making it a piece of automotive "black technology" that ordinary consumers can also afford.
Finally, it comes down to user needs. Whether an intelligent driving system is "easy to use" is ultimately judged by its users. At the perception level, each sensor has its strengths and weaknesses; the key is to let them "perform their respective duties" and give full play to their respective advantages. As intelligent driving software continues to iterate, more of the hardware's potential will gradually be "unlocked", giving consumers a smoother and more comfortable intelligent driving experience.