LiDAR has recently become one of the most talked-about technologies in the automotive industry.
For a car to drive itself, it needs sensors such as cameras, millimeter-wave radar and LiDAR to take the place of human eyes and perceive the surrounding environment. By analyzing and processing this information, the car can decide for itself when to turn, change lanes or brake.
There has long been a dispute between two perception approaches for driverless cars. One is led by cameras, supplemented by low-cost components such as millimeter-wave radar, and relies on purely visual computing; its typical proponents are Tesla and Mobileye. The other is led by LiDAR, supplemented by cameras, millimeter-wave radar and other sensors; its typical proponents are Waymo, most mainstream autonomous-driving developers and robotaxi startups.
Tesla CEO Elon Musk has always insisted that LiDAR is "expensive and unnecessary", and its high price has indeed been a major obstacle to mass production. NIO, Li Auto and Xpeng, the three leading Chinese EV startups, have all invested heavily in autonomous driving, yet none of them had included LiDAR in their autonomous-driving sensor suites.
Xpeng Motors was the first to break ranks. It recently announced that, starting with models produced in 2021, it will upgrade its autonomous driving software and hardware and use LiDAR to improve performance. The Xpeng P7 currently carries 31 autonomous-driving sensors; on that basis, Xpeng will add LiDAR as a safety redundancy to "create the strongest hardware redundancy design in history."
Mass production of LiDAR for vehicles has become a new battleground in autonomous driving.
Huawei, which "does not build cars, but helps car companies build better cars", has already entered the field, with the short-term goal of developing a 100-line LiDAR. According to Huawei, its standard high-level autonomous driving solution uses three LiDARs, and as performance improves and demand grows, its intelligent driving platform can connect up to eight LiDARs, enough to support L4 autonomous driving.
Wang Jun, president of Huawei's Intelligent Automotive Solutions BU, said that Huawei plans to bring the cost of LiDAR down to US$200 or even US$100 in the future.
At the recent Internet Car Wuzhen Night Talk forum, Yu Liguo, vice general manager of BAIC New Energy and president of its ARCFOX BU, followed He Xiaopeng in calling out Tesla, saying that LiDAR-equipped vehicles will be delivered next year. ARCFOX and Huawei have jointly developed a new HBT model, which will carry three 96-line LiDARs and a Huawei chip with computing power of up to 352 TOPS (352 trillion operations per second).
Both He Xiaopeng and Yu Liguo said they will go head-to-head with Tesla in overseas markets: "We will meet in the international market."
Is Tesla's "latest neural network computer" more effective, or is the LiDAR that will soon be mass-produced in China more reliable? Now that LiDAR is going into new Chinese cars, will it become standard equipment on smart cars?
Which is better: image recognition or LiDAR?
Whether LiDAR is necessary has long been one of the focal points of debate in autonomous driving.
At present, the core sensors for autonomous driving are vehicle-mounted cameras, millimeter-wave radar and LiDAR. Cameras and millimeter-wave radar are the main sensors in ADAS systems, while LiDAR has become a must-have for most autonomous vehicles at L3 and above.
The main advantages of vehicle-mounted cameras are high resolution and low cost. Like the human eye, a camera can quickly capture a great deal of information, but it is also limited by its field of view and the environment: a typical monocular camera covers at most about 50°, its observation distance is limited, and its performance drops sharply at night and in bad weather such as rain and snow.
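As a rough numerical illustration of that field-of-view limit (a sketch of my own, assuming an idealized pinhole camera and the 50° horizontal field of view mentioned above), the width of road a camera can see at a given distance follows from simple trigonometry:

```python
import math

def lateral_coverage(distance_m: float, hfov_deg: float = 50.0) -> float:
    """Width of the strip visible at `distance_m` for a camera with
    horizontal field of view `hfov_deg` (idealized pinhole geometry)."""
    return 2.0 * distance_m * math.tan(math.radians(hfov_deg / 2.0))

for d in (10, 30, 50, 100):
    print(f"at {d:>3} m the camera sees a strip about {lateral_coverage(d):5.1f} m wide")
```

The geometry is only part of the story; as the article notes, the bigger problem is that image quality itself degrades at night and in rain or snow.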
In March 2018, an Uber self-driving test vehicle struck and killed a woman crossing the road in Arizona. A main reason was that the lighting conditions were poor at night and the stretch of road was in shadow, so the car failed to identify the pedestrian accurately.
LiDAR makes up for the camera's shortcomings in perceiving the environment. Its biggest advantage is that it can build a clear three-dimensional image of a target: distance is determined by measuring the time delay (or phase difference) of the returned laser signal, and the three-dimensional coordinates, reflectivity and texture of the large number of dense points collected on the object's surface are used to quickly reconstruct a 3D model of the target, together with line, surface and volume data, achieving environmental perception.
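To make the ranging principle above concrete, here is a minimal sketch (my own simplification, not the article's: it covers only pulsed time-of-flight ranging and the conversion of a single return into a Cartesian point; real sensors also use phase measurements, calibration and reflectivity handling):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def tof_range(round_trip_time_s: float) -> float:
    """Range from a time-of-flight measurement: the laser pulse travels
    to the target and back, so the one-way distance is c * t / 2."""
    return C * round_trip_time_s / 2.0

def return_to_xyz(range_m: float, azimuth_deg: float, elevation_deg: float):
    """Convert one laser return (range plus beam angles) into a 3D point,
    the basic step in building up a LiDAR point cloud."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = range_m * math.cos(el) * math.cos(az)
    y = range_m * math.cos(el) * math.sin(az)
    z = range_m * math.sin(el)
    return x, y, z

# Example: an echo received 400 ns after emission, from a beam pointing
# 10 degrees left of centre and 1 degree below the horizon.
r = tof_range(400e-9)  # roughly 60 m
print(r, return_to_xyz(r, 10.0, -1.0))
```

Repeating this for millions of returns per second is what yields the dense point cloud the article describes.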
Compared with cameras, LiDAR can reliably distinguish flat markings from three-dimensional objects, and offers stronger detection accuracy, richer information and a more direct perception of the outside world. However, LiDAR is a precision instrument: the leading suppliers have spent years refining it, and their mature products achieve very high accuracy, which makes LiDAR costly and expensive.
Tesla's autonomous driving solution insists on image recognition at its core, realizing high-level autonomous driving with cameras and ultrasonic sensors covering the entire vehicle plus a forward radar. The recently released rewrite of its Full Self-Driving (FSD) software further demonstrates its strengths in visual recognition.
Musk has repeatedly questioned the "redundancy" of the LiDAR approach. "It's like the human appendix," he once tweeted. In his view, humans drive safely by collecting information with their eyes and processing it with their brains, so autonomous driving should likewise be achievable through visual perception plus algorithmic decision-making.
In this regard, Wang Xiaobin, an automotive expert at the Human-Vehicle Relationship Laboratory of the School of Automotive Engineering at Tongji University, told Travel One that, from Musk's point of view, if Tesla can really make its cameras perceive and judge exactly like humans, that is, with sufficiently powerful cameras and algorithms, then in theory fully autonomous driving can be achieved with an image-recognition solution.
"The starting point is correct, but can machines really observe and think exactly like humans? The difficulty now is whether the camera can achieve the same performance as the human eye and whether the information obtained is sufficient to support judgment." Wang Xiaobin said: "When people see a pothole ahead or rain or snow, they will be more cautious and slow down accordingly. Although the camera can capture the pothole ahead, it is a question whether it can extract information and identify it 100%."
In fact, many R&D teams have tried to fool Tesla's cameras while testing its Autopilot system, and unfortunately these machines really are "easy to fool."
In February, a research paper from a team at an overseas university, "Phantom of the ADAS: Phantom Attacks on Driver-Assistance Systems", pointed out Tesla's flaws. In their tests, projecting a humanoid image onto the road could make a Tesla Model X slow down, and projecting fake lane markings on the ground made the Model X temporarily ignore the road's physical lane lines.
It is not hard to see that the camera's reliance on two-dimensional image recognition is largely what led Tesla's automated driving system into these wrong judgments.
Even more striking, another research team recently found that inserting a no-entry traffic sign into an advertising video for less than a second was enough for Tesla Autopilot to pick it up and decide to brake.
It is undeniable that technology has given cameras perception that in some respects exceeds the human eye; a person might not even notice an image that appears for less than a second. But technology has not given cameras the judgment to think about what they see: no one would take seriously a traffic sign that flashes past in a video advertisement.
Zhao Xin, director of the safety and quality engineering department at Hesai Technology, a domestic LiDAR manufacturer, told Travel One that LiDAR is an indispensable sensor for autonomous driving, especially at L4 and above. Its advantages are obvious: high resolution, high accuracy and strong resistance to interference. The more lines a LiDAR has, the higher its measurement accuracy and the greater the safety.
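A back-of-the-envelope sketch of why line count matters (the uniform-spacing assumption and the 25° vertical field of view are mine, not from the article):

```python
import math

def vertical_resolution_deg(num_lines: int, vertical_fov_deg: float = 25.0) -> float:
    """Approximate vertical angular spacing between beams, assuming the
    lines are spread uniformly across the vertical field of view."""
    return vertical_fov_deg / (num_lines - 1)

def gap_at_distance_m(num_lines: int, distance_m: float, vertical_fov_deg: float = 25.0) -> float:
    """Approximate vertical gap between adjacent scan lines at a given range."""
    return distance_m * math.tan(math.radians(vertical_resolution_deg(num_lines, vertical_fov_deg)))

for lines in (16, 64, 96, 100):
    print(f"{lines:>3}-line: ~{vertical_resolution_deg(lines):.3f} deg between beams, "
          f"~{gap_at_distance_m(lines, 50.0) * 100:.0f} cm gap at 50 m")
```

With few lines, a small obstacle at range can fall entirely between scan lines; denser beams leave fewer such blind gaps, which is one way to read the "more lines, more safety" claim.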
Changes along the industry chain also reflect the necessity of LiDAR. "There are more LiDAR manufacturers, and more carmakers are starting to use LiDAR; that may be more convincing than debating technical routes," said Zhao Xin.
LiDAR will be used in new domestic cars
In October, Huawei's rotating chairman Xu Zhijun laid out Huawei's ambitions for the automotive industry, making it clear that Huawei would build core sensors for smart cars, such as LiDAR and millimeter-wave radar, and create a new sensor ecosystem.
Huawei said its "advanced" ADS autonomous driving solution will appear on mass-produced vehicles in Q1 2022. The ADS system is built with L4-grade technology, using two to three automotive-grade 100-line hybrid solid-state LiDARs, more than a dozen cameras and six millimeter-wave radars, a sensor suite no less capable than that of an L4 robotaxi. This cost-conscious approach of building to L4 but deploying as L2 may be one route to bringing autonomous driving to market sooner.
At the same time, Xpeng Motors' LiDAR mass-production route has resonated with other new energy vehicle companies.