Will 4D imaging radar become a substitute for lidar?
Why discuss radar systems at all? Every year, about 1.3 million people die in traffic accidents around the world, and millions more are seriously injured. Radar technology applied in advanced driver assistance systems (ADAS) is therefore crucial: it can help prevent traffic accidents and save lives.
New regulations around the world and the development of regional New Car Assessment Program (NCAP) rating standards have accelerated the adoption of radar. For example, many regions have enacted regulations or five-star safety ratings that require vehicles to have features such as automatic emergency braking, blind spot detection, and vulnerable road user detection.
Figure 1: Levels of ADAS and autonomous driving
The Society of Automotive Engineers (SAE) defines six levels of driving automation, from L0 (no automation) through driver assistance, partial automation, conditional automation, and high automation, up to fully autonomous L5 vehicles. These requirements are driving the adoption of ADAS and a gradual increase in the degree of automation.
The Leap from L2 to L3
For L3 autonomous vehicles, liability for accidents shifts primarily to the automaker rather than the driver. While automakers work through the design complexity of reaching L3, attention has turned to transitional levels and to advancing their development. Sensor requirements also differ significantly between L2 and L3. L2+ offers functionality similar to L3 but keeps the driver as the backup, reducing the need for additional redundancy.
Figure 2: Market forecast for the transition from L2+ to autonomous driving (2021-2030)
The latest report from Yole Développement, a well-known market research and strategic consulting company, shows that as sales of L0-L2 cars begin to decline, sales of L2+ cars may steadily increase, reaching a market share of nearly 50% by 2030. L2+ also allows OEMs to gradually introduce advanced safety and comfort features, leaving more time for sensor technology to mature. During this period, the driver continues to play a supervisory role, while OEMs can optimize the balance between functionality and cost and gradually launch L3 "light" cars.
Sensor technology: no single solution
There are three main sensor technologies for ADAS and autonomous driving: radar, camera, and LiDAR (Light Detection and Ranging). Each has its own unique advantages and disadvantages; in short, no single sensor technology dominates.
Radar and cameras complement each other to a large extent. Due to their maturity and high cost-effectiveness, they are now widely deployed in L1 and L2 cars. Radar performs very well in speed and distance measurement, but cannot capture color information, and the angular resolution of traditional radar is much lower than that of cameras and lidar. In contrast, cameras are well suited to pattern and color detection, but are strongly affected by the environment: they are less effective in harsh conditions such as strong light, night, fog and haze, and rain and snow. Radar, on the other hand, is almost unaffected by adverse weather and works reliably in both bright and dark conditions.
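As a back-of-the-envelope illustration of why radar excels at range and speed measurement: in an FMCW radar, range resolution follows directly from the sweep bandwidth, and radial velocity follows from the Doppler shift. The sketch below uses generic illustrative numbers, not the specifications of any particular sensor.

```python
# Back-of-the-envelope FMCW radar figures (illustrative values only).
C = 3.0e8  # speed of light, m/s

def range_resolution(bandwidth_hz):
    """Minimum separable range between two targets: c / (2 * B)."""
    return C / (2 * bandwidth_hz)

def doppler_velocity(doppler_hz, carrier_hz):
    """Radial velocity from a measured Doppler shift: v = f_d * lambda / 2."""
    wavelength = C / carrier_hz
    return doppler_hz * wavelength / 2

# Example: a 77 GHz radar sweeping 1 GHz of bandwidth
print(range_resolution(1e9))         # 0.15 m
print(doppler_velocity(5130, 77e9))  # ~10 m/s (~36 km/h)
```

The jump from 24 GHz (with a few hundred MHz of bandwidth) to 77 GHz (with multi-GHz bandwidth available) is what pushed range resolution from meters down to centimeters.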
The main advantages of LiDAR are its ultra-precise horizontal and vertical angular resolution, as well as its fine range resolution. It is therefore suitable for high-resolution 3D environment mapping, and can accurately detect free space, boundaries, and positioning. However, both it and cameras are susceptible to poor weather or road conditions.
For mainstream passenger cars at the L2+ and L3 levels, high cost is the real obstacle. Here, 4D imaging radar offers far higher resolution than traditional radar; since its introduction it has attracted wide attention and is emerging as a substitute for lidar.
To read the white paper "4D Imaging Radar: Sensor Advantages that Continue to Support L2+ Vehicles", please click here >>
Development of imaging radar
In the early days, radar technology was primarily used to detect other vehicles. Essentially, these were 2D sensors that measured speed and distance. However, today’s advanced radar technology is essentially 4D sensors. In addition to measuring speed and distance, 4D sensors are also able to measure horizontal and vertical angles. This capability allows the vehicle to see cars, and more importantly, pedestrians, bicycles, and smaller objects.
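A "4D" detection is typically a tuple of range, Doppler velocity, azimuth, and elevation. The sketch below shows how the three spatial dimensions map to Cartesian coordinates in a vehicle frame; the axis convention (x forward, y left, z up) is an assumption chosen for illustration.

```python
import math

def detection_to_cartesian(rng_m, azimuth_deg, elevation_deg):
    """Convert a 4D radar detection's range/azimuth/elevation to x/y/z in a
    vehicle frame (x forward, y left, z up). The Doppler velocity is the
    fourth dimension and rides along unchanged."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = rng_m * math.cos(el) * math.cos(az)
    y = rng_m * math.cos(el) * math.sin(az)
    z = rng_m * math.sin(el)
    return x, y, z

# A pedestrian 40 m ahead, 5 degrees to the left, slightly below sensor height
print(detection_to_cartesian(40.0, 5.0, -1.0))
```

The elevation measurement is what separates 4D radar from its 2D predecessors: without it, a manhole cover and an overhead gantry look the same to the sensor.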
Figure 3: Imaging radar can distinguish between cars, pedestrians, and other objects.
At the lower end (L2+), the focus is on a 360-degree safety cocoon around the vehicle (the industry buzzword is "corner radar"). As the name implies, there are at least four, but typically six or seven, high-resolution radar sensor nodes, as additional "gap-filler" radars may be mounted on the sides. In low-light conditions, a city autopilot function can see a child standing between two parked cars.
For high-end vehicles (L4 and above), the vehicle can see smaller objects at higher resolution. Imaging radar can fully perceive the environment around the vehicle, detect dangerous objects at long range ahead and behind, and take measures to avoid danger. The detection range can reach 300 meters, or even farther in the future. A highway autopilot function can detect a motorcycle approaching at high speed behind a truck and respond to it.
The future of imaging radar
The key technology elements driving the development of imaging radar are the migration from 24 GHz to 77 GHz and the move from gallium arsenide (GaAs) or silicon germanium (SiGe) technology to a standard RF-CMOS process. Other developments include advanced MIMO configurations moving from low to high channel counts, the move from basic processing to high-performance processing using dedicated accelerators and DSP cores, and advanced radar signal processing techniques.
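To see why higher MIMO channel counts matter: a radar with n_tx transmitters and n_rx receivers synthesizes n_tx × n_rx virtual receive channels, and the angular resolution of a uniform half-wavelength array improves roughly in proportion to that channel count. The figures below use a common first-order rule of thumb, not any vendor's specification.

```python
import math

def virtual_channels(n_tx, n_rx):
    """A MIMO radar synthesizes n_tx * n_rx virtual receive channels."""
    return n_tx * n_rx

def approx_angular_resolution_deg(n_virtual):
    """Rule-of-thumb beamwidth for a uniform half-wavelength-spaced array:
    roughly 2/N radians. A first-order approximation only."""
    return math.degrees(2.0 / n_virtual)

# A basic 3Tx x 4Rx corner radar vs. a high-channel-count imaging radar
basic = virtual_channels(3, 4)     # 12 virtual channels
imaging = virtual_channels(12, 16) # 192 virtual channels
print(approx_angular_resolution_deg(basic))    # ~9.5 degrees
print(approx_angular_resolution_deg(imaging))  # ~0.6 degrees
```

That order-of-magnitude gain in angular resolution is what lets an imaging radar separate a motorcycle from the truck beside it, rather than merging them into one blob.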
NXP has used these technologies to develop the S32R45 radar processor, which works with the TEF82xx RF-CMOS radar transceiver and a suitable antenna array to provide point-cloud imaging. This gives OEMs a cost-effective way to develop 4D imaging radar functions and optimizes the L2+ automotive industry structure. In addition, essential peripherals such as secure power management and in-vehicle network components round out the radar node, and NXP is positioned to cover the entire system.
To learn more about imaging radar, you can also watch the live demonstration of NXP's 4D imaging radar in the video below to see the technology in action.
Author
Huanyu Gu, Product Marketing and Business Development, NXP Semiconductors. Based in Hamburg, Germany, Huanyu is responsible for product marketing and business development for Advanced Driver Assistance Systems at NXP Semiconductors. Before taking charge of the ADAS business, he gained many years of experience in semiconductor marketing and business development across automotive applications such as secure car access and automotive infotainment systems.