Why do radar systems matter? Every year, roughly 1.3 million people die in traffic accidents worldwide, and millions more are seriously injured. Radar technology plays a crucial role in advanced driver assistance systems (ADAS), helping to prevent accidents and save lives.
New regulations around the world and the development of regional New Car Assessment Program (NCAP) rating standards have accelerated radar adoption. For example, many regions now mandate, or award five-star safety ratings for, features such as automatic emergency braking, blind spot detection, and vulnerable road user detection.
ADAS and levels of autonomous driving
The Society of Automotive Engineers (SAE) defines six levels of driving automation, from L0 (no automation) through driver assistance, partial automation, and conditional automation, up to fully autonomous L5 vehicles. ADAS features are driving adoption along this path and gradually raising the degree of automation.
The Leap from L2 to L3
With L3 autonomous vehicles, liability for accidents shifts primarily from the driver to the automaker. While automakers work through the design complexity of meeting L3 requirements, attention has turned to transitional levels and advancing their development. In sensor terms, the gap between L2 and L3 is significant: L2+ offers functionality similar to L3 but keeps the driver as the fallback, reducing the need for additional redundancy.
Market forecast for the transition from L2+ to autonomous driving (2021-2030)
The latest report from Yole Développement, a well-known market research and strategy consulting firm, shows that as L0-L2 car sales begin to decline, L2+ sales are expected to grow steadily, approaching a 50% market share by 2030. L2+ also allows OEMs to introduce advanced safety and comfort features gradually, leaving more time for sensor technology to mature. During this period the driver continues to play a supervisory role, while OEMs can optimize the balance between function and cost and gradually launch "light" L3 cars.
Sensor technology – no single solution
There are three main sensor technologies for ADAS and autonomous driving: radar, camera, and LiDAR (Light Detection and Ranging). Each has its own advantages and disadvantages; in short, no single sensor technology dominates.
Radar and cameras are largely complementary. Thanks to their maturity and cost-effectiveness, both are now widely deployed in L1 and L2 cars. Radar performs very well in speed and distance measurement but cannot capture color information, and the angular resolution of traditional radar is much lower than that of cameras and LiDAR. Cameras, in contrast, excel at pattern and color detection but are strongly affected by the environment: they degrade in harsh conditions such as glare, darkness, fog, haze, rain, and snow. Radar, on the other hand, is almost unaffected by adverse weather and works reliably in both bright and dark conditions.
The main advantage of LiDAR is its very fine horizontal and vertical angular resolution, along with fine range resolution. It is therefore well suited to high-resolution 3D environmental mapping and can accurately detect free space, boundaries, and position. However, like cameras, it is susceptible to bad weather and road conditions, and for mainstream L2+ and L3 passenger cars its high cost remains an obstacle. Here 4D imaging radar, with far higher resolution than traditional radar, has attracted strong interest since its introduction. Where cost permits, 4D imaging radar, LiDAR, and cameras together form a redundant, complementary sensing system.
Read the white paper: 4D Imaging Radar: Sensor Advantages That Continue to Support L2+ Vehicles
Development of imaging radar
In the early days, radar technology was primarily used to detect other vehicles; these were essentially 2D sensors that measured only speed and distance. Today's advanced radar sensors are essentially 4D: in addition to speed and distance, they also measure horizontal (azimuth) and vertical (elevation) angles. This capability allows the vehicle to see cars and, more importantly, pedestrians, bicycles, and smaller objects.
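As a rough illustration of the arithmetic behind the speed and distance dimensions, automotive radar typically uses FMCW (frequency-modulated continuous wave) chirps: range follows from the beat frequency, and radial velocity from the Doppler shift at the 77 GHz carrier. The sketch below uses illustrative parameters, not those of any specific sensor:

```python
# Sketch: how an FMCW radar derives range and radial velocity.
# All waveform parameters here are illustrative examples.
C = 3e8  # speed of light, m/s

def beat_to_range(f_beat_hz, chirp_bandwidth_hz, chirp_duration_s):
    """Range from the beat frequency of a linear FMCW chirp.
    R = c * f_beat / (2 * slope), slope = bandwidth / duration."""
    slope = chirp_bandwidth_hz / chirp_duration_s  # Hz per second
    return C * f_beat_hz / (2 * slope)

def doppler_to_velocity(f_doppler_hz, carrier_hz=77e9):
    """Radial velocity from the Doppler shift at a 77 GHz carrier.
    v = f_doppler * wavelength / 2."""
    wavelength = C / carrier_hz
    return f_doppler_hz * wavelength / 2

# Example: a 1 GHz chirp over 50 us with a 1 MHz beat tone
# corresponds to a target at 7.5 m.
print(beat_to_range(1e6, 1e9, 50e-6))   # 7.5
```

The azimuth and elevation angles that make the sensor "4D" come from comparing phase across an antenna array, which is where the MIMO configurations discussed later come in.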
Imaging radar can distinguish between cars, pedestrians and other objects
For mainstream cars (L2+), the focus is on building a 360-degree safety barrier around the vehicle (the industry buzzword is "corner radar"). As the name suggests, this uses at least four, but usually six or seven, high-resolution radar sensor nodes, since additional "gap-filling" radars may be installed on the sides. In low-light conditions, a city autopilot can then see a child standing between two parked cars. For high-end cars (L4 and above), higher resolution lets the vehicle detect even smaller objects. Imaging radar provides comprehensive environmental perception around the vehicle and can detect hazards at long range, both ahead and behind, in time to take evasive action; detection range can reach 300 meters, and further in the future. A highway autopilot can, for example, detect and respond to a motorcycle approaching at high speed from behind a truck.
The future of imaging radar
The key technology elements driving imaging radar are the migration from 24 GHz to 77 GHz, and from gallium arsenide (GaAs) or silicon germanium (SiGe) technology to standard RF-CMOS processes. Other developments include advanced MIMO configurations moving from low to high channel counts, the shift from basic processing to high-performance processing with dedicated accelerators and DSP cores, and advanced radar signal processing techniques.
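To see why high channel counts matter: a MIMO radar with N transmit and M receive antennas synthesizes N x M virtual channels, and angular resolution improves roughly in proportion to the virtual aperture. A minimal sketch, using illustrative channel counts rather than the configuration of any particular product:

```python
import math

def virtual_channels(n_tx, n_rx):
    """A MIMO radar synthesizes one virtual channel per Tx/Rx pair."""
    return n_tx * n_rx

def angular_resolution_deg(n_virtual, wavelength_m=3e8 / 77e9, spacing_m=None):
    """Approximate azimuth resolution of a uniform virtual array.
    Rule of thumb: resolution ~ wavelength / aperture (radians)."""
    if spacing_m is None:
        spacing_m = wavelength_m / 2  # half-wavelength element spacing
    aperture_m = (n_virtual - 1) * spacing_m
    return math.degrees(wavelength_m / aperture_m)

# An illustrative 3-Tx x 4-Rx corner radar yields 12 virtual channels,
# while a 12-Tx x 16-Rx imaging radar yields 192 - a much finer
# virtual aperture and correspondingly sharper angular resolution.
print(virtual_channels(3, 4), virtual_channels(12, 16))  # 12 192
```

This scaling is why the move from low to high channel counts, combined with the accelerator-based processing needed to handle the extra data, is what turns a basic corner radar into a point-cloud imaging sensor.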
NXP has applied these technologies in the S32R45 radar processor, which works with the TEF82xx RF-CMOS radar transceiver and supports the antenna configurations needed to produce point-cloud imaging. This gives OEMs a cost-effective path to developing 4D imaging radar functions, well matched to the L2+ segment. Essential peripherals such as safe power management and in-vehicle networking components round out the radar node; NXP's positioning is to cover the entire system.
For more in-depth information on imaging radar, check out NXP’s Imaging Radar Technology Event on February 23-24.
Latest update time: 2024-11-16 13:29