Whether carmakers' slogans promise "L2.9" or "L3+", one thing is certain: more and more sensors are being installed in cars.
Waymo, the world leader in driverless cars, equips its fifth-generation autonomous driving system with 40 sensors: 29 cameras, 6 millimeter-wave radars and 5 lidars. China's new carmakers are not far behind in vehicle intelligence and automation: the NIO ES8 carries 23 sensors, and the Xpeng P7 carries 31, surpassing Tesla.
As the intelligent transformation of the automobile gathers pace, automotive technology companies around the world, large and small, are working to remove the steering wheel entirely or at least shrink the human driver's role. Some are charging straight at L4 autonomy; others are climbing step by step from L1 and L2. Either way, the market for vehicle-mounted sensor solutions keeps getting bigger.
The "2020 Autonomous Driving Sensor Report" released by market research firm Yole Développement predicts that sensors used in autonomous vehicles will grow at a compound annual growth rate of 51% in the next 15 years, and the total revenue of sensing hardware will reach US$17 billion in 2032, equivalent to RMB 111.5 billion.
In this hundred-billion-yuan market for automotive sensors, China has long lacked the initiative in both R&D and market share: the automotive sensor industry, spanning cameras, millimeter-wave radars and lidars, has been dominated by foreign brands for years. Some domestic brands, however, are now gearing up to win back ground for homegrown sensors.
How do cameras, millimeter-wave radars and lidars let a car perceive its environment the way a human does? Can a car achieve fully autonomous driving on image recognition alone? And beyond vehicle-side sensors, how will roadside sensors help self-driving cars reach the road sooner?
The "eyes" and "ears" of the car
Compared with a traditional car, the perception system of a self-driving car exists mainly to replace the human driver's senses: it converts what it senses into electrical signals according to specific rules and transmits them to the vehicle's central control unit, which uses them to drive the car autonomously.
The ultimate goal of autonomous driving is to make cars smarter. As for how artificial intelligence can help humans, a common saying in the industry holds that what today's AI systems excel at is almost exactly the opposite of what humans excel at: humans far surpass AI in logical analysis, but lag far behind it in memory and large-scale data analysis.
The same holds for a car's perception system. A human driver can identify pedestrians, vehicles and traffic lights on the road with nothing but common sense, and decide accordingly to accelerate, decelerate or steer. For a computer the same task is extremely difficult, and perception and recognition are the first hurdle.
Currently, the core sensors for autonomous driving are on-board cameras, millimeter-wave radars and lidars. Cameras and millimeter-wave radars are the main sensors in ADAS systems, while lidar has become a must-have for most vehicles at L3 and above.
The vehicle-mounted camera plays the role of the human driver's "eyes".
As the most indispensable sensor in autonomous driving, the camera can judge the size and distance of obstacles and identify pedestrians, lane lines, traffic signs and more. By analyzing the image stream with algorithms, it enables many warning and recognition functions such as pedestrian warning, lane keeping and traffic-light recognition. By lens count, cameras are divided into monocular, binocular and multi-camera setups.
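As a concrete illustration of that image-analysis step, here is a minimal lane-marking sketch built from an edge detector and a Hough transform in OpenCV; the file name and every threshold are illustrative assumptions, and a real ADAS pipeline is far more elaborate:

```python
import cv2
import numpy as np

# Minimal lane-marking sketch: edge detection followed by a Hough transform.
# "road.jpg" and all thresholds below are illustrative assumptions.
frame = cv2.imread("road.jpg")
assert frame is not None, "road.jpg not found"

gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
blurred = cv2.GaussianBlur(gray, (5, 5), 0)       # suppress pixel noise
edges = cv2.Canny(blurred, 50, 150)               # keep strong gradients only

# Lane markings show up as long, roughly straight edge segments.
lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=50,
                        minLineLength=40, maxLineGap=20)
if lines is not None:
    for x1, y1, x2, y2 in lines[:, 0]:
        cv2.line(frame, (x1, y1), (x2, y2), (0, 255, 0), 2)

cv2.imwrite("road_lanes.jpg", frame)
```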
The camera's main advantages are high resolution and low cost. Like the human eye, it captures a large amount of rich information quickly; and like the human eye, it is limited by field of view and environment. A monocular camera covers a horizontal field of view of only about 50° and a limited observation distance, and its performance drops sharply at night and in bad weather such as rain and snow.
In March 2018, an Uber self-driving car in Arizona struck and killed a woman crossing the road. A major factor was that the lighting was poor at night and the stretch of road lay in shadow, so the car failed to identify the pedestrian in time.
Millimeter-wave radar makes up for the camera's shortcomings. Rather than human eyes, it resembles a bat's echolocation: bats barely rely on their eyes, instead emitting ultrasonic pulses and identifying objects from the echoes their ears pick up, which lets them fly around obstacles regardless of lighting conditions.
Much like a bat's echolocation, millimeter-wave radar uses an antenna to emit electromagnetic waves with wavelengths of 1-10 mm and frequencies of roughly 24-300 GHz. By processing the signal reflected from a target, it obtains environmental information such as the relative distance and relative speed between the car and surrounding objects, then tracks and classifies those targets; the electronic control unit makes decisions based on this dynamic picture around the vehicle.
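Both of those measurements follow from simple physics: range from the echo's round-trip time, and relative speed from the Doppler shift of the carrier. A minimal sketch in Python, assuming an idealized single echo and a 77 GHz automotive carrier (both illustrative assumptions, not from the article):

```python
# Minimal sketch (not production radar DSP) of the two basic radar
# measurements: range from round-trip delay, radial speed from Doppler shift.
C = 3.0e8  # speed of light, m/s

def target_range(round_trip_s: float) -> float:
    """Range from the echo's round-trip time: the wave travels out and back."""
    return C * round_trip_s / 2

def radial_speed(doppler_shift_hz: float, carrier_hz: float = 77e9) -> float:
    """Relative (radial) speed from the Doppler shift of the carrier.
    For a reflecting target the shift is 2*v*f/c, so v = shift*c/(2*f)."""
    return doppler_shift_hz * C / (2 * carrier_hz)

# Example: an echo arriving 0.4 us later comes from a target ~60 m away;
# a 5.13 kHz Doppler shift corresponds to ~10 m/s of closing speed.
print(target_range(0.4e-6))   # ~60.0 m
print(radial_speed(5133))     # ~10.0 m/s
```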
Millimeter-wave radar's strengths are its robustness to interference and its ability, better than laser or infrared, to penetrate rain, fog, dust and smoke. Its drawbacks include heavy signal attenuation, easy blockage by buildings, and short transmission range.
LiDAR works on a principle similar to radar's, but its biggest advantage is that it can build clear three-dimensional images of targets. It determines distance by measuring the time difference or phase difference of the returned laser signal, while collecting the three-dimensional coordinates, reflectivity, texture and other attributes of a dense set of points on the target's surface. From these it can quickly derive a 3D model of the target and related data on lines, surfaces and volumes, achieving environmental perception.
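In code form, each lidar return reduces to a time-of-flight distance plus the beam's pointing angles, and converting many such returns to Cartesian points yields the dense point cloud described above. A minimal sketch, with illustrative timings and angles:

```python
import math

C = 3.0e8  # speed of light, m/s

def tof_distance(round_trip_s: float) -> float:
    """Distance from the laser pulse's round-trip time of flight."""
    return C * round_trip_s / 2

def to_cartesian(dist_m: float, azimuth_deg: float, elevation_deg: float):
    """Convert one lidar return (range + beam angles) to an (x, y, z) point.
    A full scan of such points forms the point cloud the article describes."""
    az, el = math.radians(azimuth_deg), math.radians(elevation_deg)
    x = dist_m * math.cos(el) * math.cos(az)
    y = dist_m * math.cos(el) * math.sin(az)
    z = dist_m * math.sin(el)
    return x, y, z

# Example: a return after ~0.33 us is ~50 m away; with the beam pointed
# 30 degrees left and 2 degrees up it lands near (43.2, 25.0, 1.7) meters.
d = tof_distance(0.333e-6)
print(d, to_cartesian(d, 30, 2))
```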
Zhao Xin, director of the safety and quality engineering department at domestic lidar maker Hesai Technology, told Chuxingyike (ID: carcaijing) that lidar is an indispensable sensor for autonomous driving, especially at L4 and above. Its advantages are clear: high resolution, high accuracy and strong resistance to interference. And the more lines (laser channels) a lidar has, the finer its measurements and the greater the safety margin.
"Whether it is detection accuracy, information richness or actual perception of the outside world, it is essential for unmanned vehicles," said Zhao Xin.
LiDAR is a precision instrument whose workings span multiple engineering disciplines. The leading vendors in particular have spent years accumulating expertise in these fields, and their mature, high-precision products are correspondingly expensive, which keeps lidar prices high. Even so, the growth of the autonomous-driving industry as a whole, along with opening-up and cooperation across the industrial chain, has become an important force driving lidar costs down.
On the other hand, the more lines a lidar has, the more environmental detail it perceives, the richer the point-cloud data it returns, and the more hardware and software capability is required. Having enough computing power to process the environmental information the sensors capture has therefore become an essential part of any autonomous-driving solution.
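A rough calculation shows how quickly the data volume scales with line count. A sketch assuming a spinning lidar with 1,800 horizontal samples per revolution at 10 Hz and about 16 bytes per point; all figures are illustrative assumptions, not vendor specs:

```python
# Back-of-the-envelope sketch of why more lidar lines demand more compute.
def points_per_second(lines: int, horiz_samples: int = 1800,
                      rev_hz: int = 10) -> int:
    """Point-cloud rate for a spinning lidar: lines x samples/rev x revs/s."""
    return lines * horiz_samples * rev_hz

for lines in (16, 64, 128):
    pts = points_per_second(lines)
    # ~16 bytes per point (x, y, z, intensity as 4 floats) is a rough estimate
    print(f"{lines:3d} lines: {pts/1e6:.2f} M points/s, ~{pts*16/1e6:.0f} MB/s")
```

Going from 16 to 128 lines multiplies the raw point rate eightfold, which is why higher-line lidars push up the compute requirements of the whole stack.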
Foreign companies rule, domestic brands challenge
The automotive sensor industry was once a field dominated by overseas manufacturers.
As autonomous-driving technology spreads, market demand for vehicle cameras, millimeter-wave radars and lidars is growing rapidly. Because the technical barriers are high and both hardware and software systems are demanding, foreign Tier 1 suppliers hold clear first-mover advantages in R&D, brand trust and market share.
Among suppliers of camera-based image recognition, Israel's Mobileye holds an absolutely dominant position. Mobileye sells automakers a packaged "on-board camera + algorithm + vision processing chip" solution. In 2019 it ran 45 cooperation projects with 26 global automakers and booked new orders covering more than 16 million vehicles across 22 models, for a market penetration rate above 70%.
In millimeter-wave radar, with key software and hardware technologies locked up by foreign companies, the global market is dominated by international giants such as Bosch, Valeo, Hella, Continental, Delphi and Denso. According to OFweek, a portal covering China's high-tech industries, the top three vendors in the global millimeter-wave radar market in 2018 were Bosch, Continental and Hella, with market shares of 19%, 16% and 12% respectively.
The lidar market is even more concentrated. US firm Velodyne, which long held the core technology, is almost synonymous with lidar. Founded in 1983, Velodyne was for a time the only choice for companies building driverless technology; it has worked with Google, General Motors, Ford, Uber, Baidu and others, and captured most of the automotive lidar market.
In recent years, riding China's wave of automotive intelligence, the vehicle-mounted sensor market has kept growing, and domestic challengers have risen with it.
At the recently held 2020 World Intelligent Connected Vehicle Conference, the "Intelligent Connected Vehicle Technology Roadmap 2.0" was officially released, targeting that by 2025, intelligent connected vehicles with L2 and L3 autonomy will account for 50% of total new-vehicle sales in China. CITIC Securities forecasts exponential growth in sensor demand, with China's automotive sensor market expected to exceed RMB 34 billion in 2023.