Recently, two pieces of news about whether Tesla will use radar as an autonomous driving sensor have attracted attention in the industry. One is that Tesla's US-market Model 3/Model Y have dropped the millimeter-wave radar; the other is that although Tesla insists on not using lidar, it has been road-testing Luminar's lidar system.
At present, the industry generally believes that radar + camera sensor fusion is a reasonable path to autonomous driving. As a leader in electric vehicles and autonomous driving practice, will Tesla adopt lidar in the future, or continue to go its own way, forgoing radar and relying solely on visual sensors (cameras) to achieve autonomous driving?
With these questions in mind, "Car Industry Talk" interviewed Fang Yuan, a senior expert in the automotive industry and the exclusive automotive industry commentator for the "Car Industry Talk" new media platform.
01
"Car Industry Talk": What do you think about the two recent news about Tesla's attitude towards millimeter-wave radar and lidar?
Fang Yuan: I have seen both reports. For Tesla, these two events can be regarded as milestones. To explain them clearly, I think we should start from the requirements that autonomous driving places on its environment perception system.
Generally speaking, an ideal perception system for autonomous driving environment perception should meet the following four requirements:
1. There are enough perception data points/point cloud/pixels at each moment; in other words, the spatial resolution is high enough for analysis and decision-making, similar to the images the human eye sees.
2. The perception system can provide quantitative distance information about perceived objects (it can measure how far away an object is).
3. The perception system is unaffected by day or night and by bad weather (fog, rain, snow, glare, etc.); it works all day, in all weather.
4. The perception distance is far enough that, in an emergency, there is enough time to judge, decide, and act to avoid an accident. This distance is expected to be at least 150 meters.
However, in the past, no single type of sensor could meet all four of these requirements at once. Therefore, almost all current autonomous driving perception systems combine several types of sensors to achieve so-called perception fusion/data fusion, combining the strengths of the various sensor types (pure visual image perception systems, millimeter-wave radars, lidars, etc.) to achieve the above goals.
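To make the fusion idea concrete, below is a minimal sketch of one common approach, inverse-variance weighting of independent range estimates. The noise figures are illustrative assumptions, not any vendor's published specifications, and real fusion stacks (for example, Kalman filters over tracked objects) are considerably more involved.

```python
# A minimal sketch of one common fusion idea: inverse-variance weighting of
# independent range estimates. Noise figures below are illustrative
# assumptions, not published sensor specifications.

def fuse_range(estimates):
    """Fuse independent (value, std_dev) range estimates into one.

    Each reading is weighted by the inverse of its variance, so the
    more precise sensor dominates the fused result.
    """
    weights = [1.0 / (sigma ** 2) for _, sigma in estimates]
    total = sum(weights)
    fused = sum(w * value for w, (value, _) in zip(weights, estimates)) / total
    fused_sigma = (1.0 / total) ** 0.5
    return fused, fused_sigma

# Hypothetical readings of the same leading vehicle:
camera_est = (41.2, 0.30)   # vision ranging: 41.2 m, ~30 cm std dev (assumed)
radar_est = (40.9, 0.15)    # mm-wave radar: 40.9 m, ~15 cm std dev (assumed)

distance, sigma = fuse_range([camera_est, radar_est])
print(f"fused distance: {distance:.2f} m +/- {sigma:.2f} m")
```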
However, thanks to technological advances, a pure visual image perception system can now meet all four of these requirements on its own, including handling glare and the like.
Comparing the two tables, we can see that a pure visual image perception system can largely replace millimeter-wave radar and lidar, which should be the main reason Tesla's US Model 3/Y dropped the millimeter-wave radar. Compared with a pure visual system, millimeter-wave radar has two inherent shortcomings that are almost impossible to overcome (low data/point cloud/pixel density and low reflectivity from non-metallic surfaces), while its advantages can be replaced by the pure visual system.
As for Tesla's road test of Luminar lidar, I think it is checking on lidar's current performance and progress, for example how much the data/point cloud/pixel density has improved. I expect Tesla to keep road-testing lidar, especially Flash lidar, which deserves separate discussion: Flash lidar is itself a kind of visual image perception system, collecting data through camera-style imaging rather than scanning, so Flash lidar cameras could be incorporated alongside the cameras already used in pure visual perception systems.
At present, the camera in a pure visual image perception system has a very important advantage: each frame contains a huge number of data points/pixels, so its spatial resolution is very high. It can clearly see road signs (speed limits, warnings, etc.), lane markings (solid lines, dotted lines, zebra crossings, etc.), traffic lights, and so on. And because it is an RGB (multi-wavelength, multi-color) camera, it picks up reflections well from all kinds of object surfaces. Its perception range is now also far enough (well over 150 meters, typically up to 300 meters and in some cases up to 1,600 meters); it can measure distance directly (down to centimeter level); and it can "penetrate" fog, rain, snow, glare, and the like. It can therefore replace millimeter-wave radar and lidar.
02
"Car Industry Talk": Regarding the four requirements just mentioned, can you compare millimeter-wave radar, lidar, and the pure visual image perception system in detail: which best meets the requirements of an autonomous driving environment perception system? Let's start with the first requirement.
Fang Yuan: On the first requirement (spatial resolution, i.e. data-point/pixel/point-cloud density), the pure visual image perception system is the best and fully meets the requirement.
This requirement is crucial because it is the closest to human visual performance, which is why Tesla has always insisted that a pure visual image system can accomplish the task of autonomous driving environment perception.
Its logical foundations are, first, that humans drive essentially by visual images; second, that the real-time information flow (bit/s) of the perception system is the main, core, and key indicator; and third, that the capabilities of pure visual systems keep expanding: they can now also measure distance and perform all-weather, all-day perception. On these three points, the pure visual image system has an advantage that conventional lidars and millimeter-wave radars cannot match: its spatial resolution/data-point/pixel/point-cloud density can often reach about 100 times that of lidar, or even higher, and is higher still relative to millimeter-wave radar. The difference is clear in the pixel/point-cloud image comparison of the three below.
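Beyond the image comparison, a back-of-envelope calculation shows the scale of the gap. All figures below are assumed, typical values for each sensor class, not any specific product's datasheet numbers:

```python
# Assumed, typical figures (not datasheet values) for each sensor class.
camera_samples = 1920 * 1080 * 30  # 2 MP camera at 30 fps
lidar_samples = 64 * 900 * 10      # 64-beam scanning lidar, ~900 pts/beam/rev, 10 Hz
radar_samples = 500 * 20           # conventional mm-wave radar point cloud, 20 Hz

print(f"camera: {camera_samples / 1e6:6.2f} M samples/s")
print(f"lidar : {lidar_samples / 1e6:6.2f} M samples/s")
print(f"radar : {radar_samples / 1e6:6.3f} M samples/s")
print(f"camera vs lidar: {camera_samples / lidar_samples:.0f}x")
print(f"camera vs radar: {camera_samples / radar_samples:.0f}x")
```

With these assumptions the camera produces roughly two orders of magnitude more raw spatial samples per second than the lidar, consistent with the "about 100 times" figure above; the exact ratio of course shifts with the hardware assumed.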
03
"Automotive Industry Talk": Regarding the second performance, it is generally believed that millimeter-wave radar or lidar can provide distance information of objects in front, which the previous pure visual image perception system often cannot provide. What do you think?
Fang Yuan: On the second requirement: the distance of objects ahead of and around the vehicle is indeed vital for autonomous driving decision-making and path planning. But today, besides lidar and millimeter-wave radar, pure visual image perception systems can also measure distance, using monocular, binocular (analogous to human stereo vision), trinocular or multi-camera methods, ToF (time-of-flight), and other approaches; the sketch after this paragraph illustrates the binocular principle.
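As a sketch of that binocular principle, depth follows from the pixel disparity between two rectified views. The focal length and baseline here are assumed values chosen for illustration, not a real product's calibration:

```python
def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth in meters from the pixel disparity between two rectified views."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a visible point")
    return focal_px * baseline_m / disparity_px

# Assumed rig: 1200 px focal length, 0.3 m baseline (illustrative values).
for disparity in (36.0, 9.0, 1.2):    # larger disparity -> closer object
    depth = stereo_depth(1200.0, 0.3, disparity)
    print(f"disparity {disparity:5.1f} px -> depth {depth:6.1f} m")
```

Note that with this assumed rig an object 300 meters away produces only about 1.2 pixels of disparity, which is why long-range stereo work such as the Cornell/HKUST paper cited below relies on wider baselines and sub-pixel matching.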
Several papers provide references: the ICCV 2019 best-paper-nominated "Gated2Depth: Real-time Dense Lidar from Gated Images", a high-density, real-time gated-imaging visual ranging system; Baidu's Apollo Lite pure-vision L4 autonomous driving 3D perception solution, publicly disclosed at CVPR 2019 (it does not rely on lidar and is easier to manufacture); "Depth Sensing Beyond Lidar Range", published jointly by Cornell University and the Hong Kong University of Science and Technology at CVPR 2020; and Tsinghua University's "Monocular Real-time Full Body Motion Capture" at CVPR 2021.
Take the 2020 Cornell/HKUST paper as an example: it reports a 300-meter ranging test on a truck using a trinocular pure-vision system. The actual distance was 302 meters and the measured distance 300.8 meters, a relative error of 0.4%. Although pure-vision ranging is not yet as accurate as lidar (single-digit centimeters), it is close to millimeter-wave radar (double-digit centimeters). Applying that relative error to a braking distance of about 40 meters gives a measurement error of about 16 cm.
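The arithmetic can be checked directly; this worked verification uses only the figures quoted above:

```python
actual_m = 302.0       # ground-truth distance quoted in the paper
measured_m = 300.8     # trinocular vision measurement quoted in the paper

relative_error = abs(actual_m - measured_m) / actual_m
print(f"relative error: {relative_error:.1%}")    # ~0.4%

braking_m = 40.0       # braking distance used in the example above
print(f"error at {braking_m:.0f} m: {braking_m * relative_error * 100:.0f} cm")  # ~16 cm
```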
Therefore, I believe that as measurement technology advances, the ranging accuracy of pure visual image systems will approach and eventually match that of lidar.
Baidu Apollo Lite/ANP (Apollo Navigation Pilot) on-road real-vehicle distance measurement demonstration
In addition, 6D pose estimation can be performed from a single monocular RGB image, further improving the utilization of image information; see, for example, "GDR-Net: Geometry-Guided Direct Regression Network for Monocular 6D Object Pose Estimation", published at CVPR 2021.
04
"Automotive Industry Talk": As for the third performance, it is generally believed that the biggest feature of millimeter-wave radar is that in addition to ranging, it also has strong "penetration" and can "penetrate" certain rain, fog, and snow. In the past, pure visual imaging systems lacked this ability. What do you think?
Fang Yuan: Indeed, because millimeter waves are much longer than visible light, the radar has strong "penetrating power" through a certain amount of rain, fog, snow, and so on. However, today's pure visual image perception systems can extend sensitivity from visible light into the near-infrared band, and combined with appropriate algorithms they can already "penetrate" rain, fog, snow, and the like.
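On the algorithm side, one widely cited family for restoring visibility through haze or fog in ordinary RGB images is dark-channel-prior dehazing (He et al., CVPR 2009). The sketch below is a minimal, unoptimized rendering of that idea for illustration only; it is not the method any particular automaker uses, and a production system would pair such restoration with NIR-sensitive sensors and learned models.

```python
# Minimal dark-channel-prior dehazing sketch (after He et al., CVPR 2009).
# Illustrative only: patch size, omega, and t_min are common defaults.
import numpy as np

def dehaze(img: np.ndarray, patch: int = 15, omega: float = 0.95,
           t_min: float = 0.1) -> np.ndarray:
    """Dehaze an HxWx3 float image in [0, 1] with the dark channel prior."""
    h, w, _ = img.shape
    # Dark channel: per-pixel min over color channels, then min over a patch.
    dark = img.min(axis=2)
    pad = patch // 2
    padded = np.pad(dark, pad, mode="edge")
    dark = np.min(
        np.stack([padded[i:i + h, j:j + w]
                  for i in range(patch) for j in range(patch)]), axis=0)
    # Atmospheric light: mean color of the brightest 0.1% dark-channel pixels.
    idx = np.argsort(dark.ravel())[-max(1, h * w // 1000):]
    A = img.reshape(-1, 3)[idx].mean(axis=0)
    # Transmission estimate, then scene radiance recovery: J = (I - A)/t + A.
    t = np.clip(1 - omega * (img / A).min(axis=2), t_min, 1.0)
    return np.clip((img - A) / t[..., None] + A, 0.0, 1.0)
```

With a hazy RGB frame loaded as a float array, `dehaze(frame)` returns a contrast-restored estimate of the unattenuated scene.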