The advent of autonomous driving is not just a leap forward in automotive technology, but also a revolution in the way people perceive and interact with automotive mobility. At the heart of this transformation is sensor fusion.
In autonomous vehicles, sensor fusion is critical to safety and efficiency. It combines data from multiple sensors to produce more accurate, reliable, and comprehensive information about the vehicle's environment. Cameras, radar, and lidar, for example, work together and complement one another to provide a 360° panoramic view around the vehicle. At the same time, the spread of ADAS and autonomous driving technologies has created unprecedented demand for sensors.
According to data from TE Connectivity, a vehicle with the latest ADAS/AV technology, even a typical non-electric model, requires 60 to 100 on-board sensors, 15 to 30 of which are dedicated to engine management. Commercial trucks carry as many as 400 sensors, with about 70 used for engine management. The next few generations of electric vehicles, especially those with autonomous or semi-autonomous driving functions, are expected to carry two to three times as many sensors as other models.
Autonomous driving technology is rapidly reshaping the existing transportation landscape, giving vehicles unprecedented safety and reliability. Sensor fusion is the game-changer at its heart. By seamlessly integrating data from a variety of advanced sensors, including cameras, radar, lidar, and ultrasonic sensors, sensor fusion perceives the environment from multiple vantage points, providing more accurate and reliable information about the vehicle and its surroundings than any single source could.
Sensors in autonomous driving
Early automotive sensor applications were mainly basic advanced driver assistance system (ADAS) features such as rearview cameras. As the level of autonomy rises, vehicle intelligence has grown substantially, and so have the types and number of sensors required. The main sensor types currently used in autonomous driving technology are the following:
Cameras
Cameras are the sensors closest to human vision and detect visual information around the vehicle, such as traffic signs, lane markings, pedestrians, cyclists, and other vehicles. A front camera lets the car "see" where it is going, while a reversing camera helps with parking and backing up. Some new models also carry 360° camera systems: miniature cameras placed around the body that together produce a bird's-eye view of the surroundings.
Many such image sensors on the market meet automotive requirements. Take ON Semiconductor's AR0820AT as an example: a 1/2-inch CMOS digital image sensor with a 3848 (H) x 2168 (V) active pixel array. This automotive sensor captures images in linear or high dynamic range mode with rolling-shutter readout, and it is optimized for high-dynamic-range performance in low-light and harsh conditions. Its 2.1 µm DR-Pix BSI pixels and on-chip 140 dB HDR capture capability are very helpful for acquiring image information around the vehicle.
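To make the 140 dB figure concrete, dynamic range in decibels maps to a signal ratio via DR = 20·log10(max/min). Below is a minimal Python sketch of that conversion; the numbers are just the headline spec worked through, not additional sensor data.

```python
import math

def dynamic_range_db(max_signal: float, min_signal: float) -> float:
    """Image-sensor dynamic range in dB: 20 * log10(max / min)."""
    return 20 * math.log10(max_signal / min_signal)

# The quoted 140 dB HDR corresponds to roughly a 10^7 : 1 ratio between
# the brightest and darkest signals the sensor can resolve in one scene.
ratio = 10 ** (140 / 20)
print(f"140 dB -> contrast ratio of about {ratio:.0e} : 1")
```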
Radar
To drive safely, an intelligent vehicle must obtain high-resolution, and even 4D imaging, data about its surroundings in a timely manner, and doing so is a major challenge. This is where millimeter-wave radar, already key to many ADAS functions, comes to the fore: it can "see" objects in conditions where cameras struggle, offers good resolution, performance, and directivity, and is not easily disturbed by environmental interference or weather. Note, however, that millimeter-wave radar has difficulty identifying non-metallic objects, which reflect radar poorly.
TI's TIDA-020047 dual-device millimeter-wave cascade radar reference design for automotive 4D imaging radar addresses the "seeing clearly" problem in ADAS by combining two 76 GHz to 81 GHz radar transceivers, a radar processor, two CAN-FD PHYs, an Ethernet PHY, and a low-noise power supply.
The AWR2243 device in the reference design is an integrated single-chip FMCW transceiver operating in the 76 GHz to 81 GHz band. Highly integrated in a very small package, it implements a monolithic 3TX/4RX system with a built-in PLL and ADCs. Simple changes to the programming model support a variety of sensor deployments, short-range, medium-range, and long-range, forming a multi-mode sensor solution. The AM273x, which serves as the radar processor, is a highly integrated, high-performance microcontroller based on an Arm Cortex-R5F core and a C66x floating-point DSP core; its integrated hardware security module (HSM) supports the vehicle's functional safety.
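The relations behind FMCW ranging are simple enough to sketch. Range resolution depends only on the swept bandwidth, ΔR = c/(2B), and a target's range follows from the measured beat frequency and chirp slope, R = c·f_b/(2S). The Python below is a minimal illustration; the chirp slope and beat frequency are assumed example values, not parameters of the TIDA-020047 design.

```python
C = 3.0e8  # speed of light, m/s

def range_resolution(bandwidth_hz: float) -> float:
    """FMCW range resolution: dR = c / (2 * B)."""
    return C / (2 * bandwidth_hz)

def target_range(beat_freq_hz: float, slope_hz_per_s: float) -> float:
    """Target range from the beat frequency: R = c * f_b / (2 * S)."""
    return C * beat_freq_hz / (2 * slope_hz_per_s)

# Sweeping the full 76-81 GHz band (B = 5 GHz) gives ~3 cm resolution.
print(f"Resolution at 5 GHz bandwidth: {range_resolution(5e9) * 100:.1f} cm")

# Assumed example: a 60 MHz/us chirp slope and a 2 MHz beat tone -> 5 m.
print(f"Range: {target_range(2e6, 60e12):.1f} m")
```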
Figure 1: TIDA-020047, a dual-device millimeter-wave cascade reference design for automotive 4D imaging radar (Source: TI)
LiDAR
Light Detection and Ranging (LiDAR) uses light pulses to measure the distance between the vehicle and other objects, and from this information the vehicle can build a detailed 3D map of its environment. LiDAR offers autonomous vehicles several advantages: first, excellent range, angle, and speed resolution together with strong immunity to interference; second, the ability to gather large amounts of information useful to autonomous driving, including the distance, angle, speed, and reflection intensity of objects, from which multi-dimensional images of objects can be generated. For now, high prices have limited large-scale adoption of LiDAR in the automotive industry.
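The core of direct time-of-flight ranging is one line of arithmetic: the pulse travels to the object and back, so d = c·t/2. A minimal Python sketch, with an assumed echo time purely for illustration:

```python
C = 3.0e8  # speed of light, m/s

def lidar_distance(round_trip_s: float) -> float:
    """Direct time-of-flight ranging: d = c * t / 2 (out and back)."""
    return C * round_trip_s / 2

# An echo arriving ~667 ns after the pulse puts the object about 100 m away.
print(f"{lidar_distance(667e-9):.1f} m")
```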
OSRAM's SPL SxL90A is a cost-effective LiDAR laser family that meets the AEC-Q102 standard, letting autonomous vehicles "see" farther and drive more safely and efficiently. It comes in single-channel and four-channel versions, both driving 40 A for 125 W per channel at up to 33% efficiency, and its very low thermal resistance keeps heating modest even at high drive currents. The single-channel SPL S1L90A_3 A01 is very compact at just 2.0 mm x 2.3 mm x 0.65 mm. The four-channel SPL S4L90A_3 A01 has four emission zones delivering an excellent 480 W of optical power in a slightly larger package, allowing a greater detection range.
Figure 2: Four-channel LiDAR SPL S4L90A_3 A01 (Source: AMS OSRAM)
3D ToF LiDAR
Time of Flight (ToF) is a type of LiDAR suited to short-range automotive use cases that captures fine detail without scanning. This increasingly popular variety of LiDAR is already widespread in smartphones. In the automotive environment, high-resolution ToF cameras use 3D sensing to scan the area around the car and the ground, detecting curbs, walls, and other obstacles regardless of lighting conditions, supporting gesture recognition, and building a 360° exterior view to assist self-parking.
The IRS2877A is a member of Infineon's REAL3 ToF family aimed mainly at automotive applications. Housed in a 9 x 9 mm² plastic BGA package, it uses a 4 mm photosensitive area to deliver a VGA system resolution of 640 x 480 pixels. A single ToF camera is enough to build a driver status monitoring system with 3D facial recognition. From the 3D body model the IRS2877A generates, occupants' body shape and weight can be estimated accurately, and highly accurate occupant and seat-position data can be obtained, key inputs for smart-airbag deployment and restraint systems. Beyond safety applications, 3D ToF sensors can also implement functions such as in-vehicle gesture control.
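Indirect ToF cameras of this kind generally recover distance from the phase shift of amplitude-modulated light, d = c·Δφ/(4π·f_mod), with an unambiguous range of c/(2·f_mod) before the phase wraps. The sketch below assumes an illustrative 60 MHz modulation frequency; it is not a figure from the IRS2877A datasheet.

```python
import math

C = 3.0e8  # speed of light, m/s

def itof_distance(phase_shift_rad: float, mod_freq_hz: float) -> float:
    """Indirect ToF ranging: d = c * dphi / (4 * pi * f_mod)."""
    return C * phase_shift_rad / (4 * math.pi * mod_freq_hz)

def unambiguous_range(mod_freq_hz: float) -> float:
    """Maximum distance before the phase wraps: c / (2 * f_mod)."""
    return C / (2 * mod_freq_hz)

f_mod = 60e6  # assumed modulation frequency, Hz
print(f"Unambiguous range: {unambiguous_range(f_mod):.2f} m")  # 2.50 m
print(f"Quarter-cycle (pi/2) shift: {itof_distance(math.pi / 2, f_mod):.3f} m")  # 0.625 m
```

A short unambiguous range like this is one reason indirect ToF suits in-cabin sensing, such as the driver monitoring described above.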
Figure 3: IRS2877A 3D ToF LiDAR application block diagram (Source: Infineon)
The Power of Sensor Fusion
Sensor fusion is the process of combining data from multiple sensors to create information more accurate and reliable than any single source alone can provide. No single sensor working on its own can supply all the information an autonomous driving system needs to make a decision or take action. The first step for an autonomous vehicle in perceiving the world is to capture a wide range of data about its surroundings through its array of sensors, including cameras, LiDAR, and radar, from which accurate action commands can ultimately be derived. Figure 4 illustrates how an autonomous vehicle uses sensor fusion to perceive its environment.
Figure 4: The sensor fusion solution consisting of cameras, LiDAR and radar can perceive the vehicle's surrounding environment in 360 degrees (Source: TE)
Each of these sensors has unique strengths. LiDAR provides precise distance measurement; range is a key selection factor that determines whether a short-range, medium-range, or long-range LiDAR architecture fits best, not just for autonomous driving functions outside the car but also for many functions inside it. Radar, or millimeter-wave radar, is better at detecting the speed and position of objects in all weather conditions. Cameras capture rich visual imagery. Fusing these inputs creates a comprehensive, high-resolution data representation of both the inside and the outside of the car, giving autonomous vehicles an unparalleled level of situational awareness.
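A minimal way to see why fused estimates beat any single sensor is inverse-variance weighting, the statistical core of the Kalman-style updates that fusion stacks build on. The Python sketch below fuses two hypothetical range estimates of the same object; all numbers are invented for illustration.

```python
def fuse(est_a: float, var_a: float, est_b: float, var_b: float):
    """Inverse-variance weighted fusion of two independent estimates.
    The fused variance is smaller than either input's, which is why
    combining sensors yields more reliable information than one alone."""
    w_a = var_b / (var_a + var_b)
    fused = w_a * est_a + (1 - w_a) * est_b
    fused_var = (var_a * var_b) / (var_a + var_b)
    return fused, fused_var

# Hypothetical readings: lidar 25.2 m (variance 0.04 m^2),
# radar 24.8 m (variance 0.25 m^2) for the same object.
dist, var = fuse(25.2, 0.04, 24.8, 0.25)
print(f"Fused distance: {dist:.2f} m, variance: {var:.3f} m^2")
```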
Figure 5: Sensor fusion solutions required in autonomous vehicles (Source: Aptiv)