From September 3 to 5, 2021, the 17th China Automotive Industry Development (TEDA) International Forum (hereinafter the TEDA Automotive Forum) was held in Tianjin Binhai New Area. It was jointly sponsored by China Automotive Technology and Research Center Co., Ltd., the China Society of Automotive Engineers, the China Association of Automobile Manufacturers, and China Automobile News, with special support from the Tianjin Economic-Technological Development Zone Management Committee, and co-organized by the Japan Automobile Manufacturers Association and the German Association of the Automotive Industry. The forum centered on the annual theme of "Integration, Innovation, and Green" and focused on hot topics in the industry.
At the "Frontier Outlook: Creating New Smart Car Experiences" session on September 5, Wang Yu, Radar Perception Manager at FAW (Nanjing) Technology Development Co., Ltd., delivered a speech titled "Applications and Challenges of Radar Perception in Intelligent Driving."
Wang Yu, Radar Perception Manager, FAW (Nanjing) Technology Development Co., Ltd.
The following is the transcript of the speech:
Good morning, friends from the industry! I am Wang Yu from FAW (Nanjing). I am very glad the TEDA Forum gives us this opportunity to share some of the work we have done recently and to discuss our understanding of radar perception applications and their challenges.
Before I start, please allow me to introduce FAW Nanjing. FAW Nanjing is a relatively young company: everyone knows FAW Group, but not many people have heard of FAW Nanjing. We are indeed a young company with a young team. Our predecessor was the Artificial Intelligence Research Institute of the FAW Intelligent Network Development Institute. Last year, as the group stepped up its strategic deployment of combining artificial intelligence with automobiles, the institute was relocated to Nanjing and the Nanjing Artificial Intelligence R&D Center was established; this is FAW Nanjing. FAW Nanjing is an important part of FAW Group's strategic layout: we want to rely on artificial intelligence technology to create a better human travel experience.
Within FAW Group, we have undertaken many self-developed mass-production vehicle development tasks. The company currently has more than 100 people and is still expanding rapidly. Although the team is only a little over 100 strong, we rely on the strong R&D system of the entire FAW Group: modeling, design, systems, testing, verification, road testing, and data collection all draw on the technical support of headquarters. It is because of them that we can devote ourselves to software development in Nanjing.
Our company has planned six main business lines: intelligent perception, intelligent big data, intelligent interaction, intelligent travel, intelligent manufacturing, and smart logistics. In the early stage, we will focus on the first two: intelligent perception and intelligent big data.
Next, I would like to talk about the application of radar perception in intelligent driving. There are many important areas in intelligent driving, but we believe the most challenging and complex is still perception. The mainstream sensor solutions today are dominated by multi-sensor fusion, and we strongly endorse this approach, because each sensor has its own strengths and shortcomings across different scenarios and traffic conditions. For example, cameras are easily disturbed under poor lighting conditions; LiDAR can reliably capture the position of traffic targets but is easily affected by weather; millimeter-wave radar measures vehicle speed with very high accuracy, but its resolution is very limited, so what we can do with it is still target-level fusion. We therefore judge that for a long time to come, the mainstream perception solution for intelligent driving will remain multi-sensor fusion.
Based on the concept of multi-sensor fusion, FAW Nanjing has created an AI-based autonomous driving solution. The intelligent driving vehicle adopts a multi-sensor fusion strategy: 8 solid-state LiDARs on the roof, 12 high-definition cameras around the car, plus millimeter-wave radars, combined into a fused perception solution that supports a full-stack autonomous driving system. This solution has already been upgraded internally, and the perception solution for our second-generation autonomous driving system will be launched soon; passengers and drivers will have a better experience in terms of both sensors and vehicle performance.
Next, let's talk about a very important component of multi-sensor fusion: the LiDAR sensor. We know that solid-state sensors are the development trend for future LiDARs. In the early days, we did a lot of work based on mechanical LiDAR and spent a lot of energy on mechanical-LiDAR-based solutions, building smart minibuses and smart passenger cars at our headquarters in Changchun. Last year, we were bold in innovating: before some domestic peer companies launched solid-state LiDAR solutions, we had already developed an environmental perception solution based on solid-state LiDAR.
This green car is a solid-state LiDAR test car that we are currently testing on open roads. When we switched from mechanical to solid-state LiDAR, there were many difficulties and challenges: time synchronization, calibration, blind spots, and various other problems troubled us. What may be a small problem for a mechanical LiDAR is a brand-new challenge for a solid-state one. We spent a lot of effort solving these problems. For example, when multiple LiDARs collect data at the same time, high-precision time synchronization is required, so we built a dedicated time-synchronization controller to solve this.
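To illustrate one small part of the synchronization problem described above (this is a hedged sketch, not FAW's actual implementation), frames from independently clocked sensors can be paired by nearest timestamp, dropping pairs whose gap exceeds a tolerance:

```python
from bisect import bisect_left

def align_frames(reference_ts, sensor_ts, tolerance_s=0.005):
    """Pair each reference timestamp with the nearest timestamp from
    another sensor; drop pairs whose gap exceeds the tolerance."""
    pairs = []
    for t in reference_ts:
        i = bisect_left(sensor_ts, t)
        # Candidates: the neighbor on each side of the insertion point.
        candidates = [c for c in (i - 1, i) if 0 <= c < len(sensor_ts)]
        best = min(candidates, key=lambda c: abs(sensor_ts[c] - t))
        if abs(sensor_ts[best] - t) <= tolerance_s:
            pairs.append((t, sensor_ts[best]))
    return pairs

# Two sensors nominally at 10 Hz with slightly drifting clocks;
# the last frame of sensor B arrives too late and is dropped.
lidar_a = [0.000, 0.100, 0.200, 0.300]
lidar_b = [0.002, 0.104, 0.199, 0.350]
print(align_frames(lidar_a, lidar_b))
# → [(0.0, 0.002), (0.1, 0.104), (0.2, 0.199)]
```

In practice a hardware solution (e.g. a controller distributing a common clock, as the speaker describes) is needed so such gaps stay within tolerance in the first place; software matching only associates frames after the fact.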
In terms of intelligent perception software, we have built a complete set of models and a technical framework for this perception system, including heavyweight offline reference models and lightweight models. The heavyweight models are used for ground truth and data annotation, while the lightweight models run in the domain controller, balancing its limited computing resources against the need for high-precision perception.
Next, we will talk about some data-related work. As a car manufacturer, we have a natural advantage in data accumulation. Relying on the road-test vehicle resources of headquarters, we have built a fleet of data collection vehicles; every day there is large-scale data collection and whole-vehicle testing on roads across the country, so this advantage in data collection should be fully exploited. Hundreds of vehicles send back large amounts of data every day, which is parsed, integrated, cleaned, selected, converted, and stored in the cloud. With preset filtering conditions plus the heavyweight models, the data is labeled automatically; the preliminary labels are then manually checked and corrected in a second labeling pass, yielding a labeled dataset used to retrain the heavyweight models. The retrained models are used, on the one hand, for semi-automatic labeling in the early stage; on the other hand, they are compressed into lightweight versions and deployed on the vehicle side, and the vehicle-side model is then redeployed to the vehicles. This forms a closed loop of data collection, data labeling, and data storage. The system is a very valuable tool: it is effective productivity, it plays to the advantages of a car company, and it is our moat.
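The loop described above can be sketched minimally as follows. This is an illustrative toy, assuming stand-in functions for the heavyweight model and the human review step; the real pipeline involves cloud storage, filtering, and model training that are not shown:

```python
def closed_loop_iteration(raw_frames, heavy_model, correct_fn):
    """One turn of the data loop: auto-label raw frames with the
    heavyweight offline model, apply human corrections, and return
    a labeled training set for retraining."""
    auto_labels = [heavy_model(f) for f in raw_frames]
    corrected = [correct_fn(f, lbl) for f, lbl in zip(raw_frames, auto_labels)]
    return list(zip(raw_frames, corrected))

# Toy stand-ins: the "model" tags frames it recognizes, and the
# reviewer fixes the one it got wrong.
heavy_model = lambda frame: "car" if "car" in frame else "unknown"
correct_fn = lambda frame, lbl: "truck" if "truck" in frame else lbl

dataset = closed_loop_iteration(["car_01", "truck_02"], heavy_model, correct_fn)
print(dataset)  # → [('car_01', 'car'), ('truck_02', 'truck')]
```

Each iteration's output retrains the heavyweight model, which improves the next round of auto-labeling and reduces the manual correction workload; that compounding effect is what makes the loop a moat.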
Next, let's talk about imaging radar. Conventional millimeter-wave radar has its pain points: it outputs only limited target data, so we can only do target-level fusion on it; its resolution is also limited, and it localizes targets only in two dimensions, resolving height information poorly. When imaging radar sensors became available on the market, everything changed, and we found imaging millimeter-wave radar well worth exploring. We have done experimental work based on it; for example, at the point-cloud fusion level, we splice the imaging radar point clouds together and attempt to fuse them. This is our early attempt with imaging millimeter-wave radar, and we look forward to the next generation bringing us more surprises and becoming a main sensor in intelligent driving.
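The point-cloud "splicing" mentioned above amounts to transforming each radar's points into a common vehicle frame using the sensor's mounting pose, then concatenating them. A minimal 2D sketch, with hypothetical mounting poses for two corner radars (not FAW's actual configuration):

```python
import math

def to_vehicle_frame(points, yaw_rad, tx, ty):
    """Rotate and translate 2D radar points (in the sensor frame)
    into the vehicle frame using the sensor's mounting pose."""
    c, s = math.cos(yaw_rad), math.sin(yaw_rad)
    return [(c * x - s * y + tx, s * x + c * y + ty) for x, y in points]

# Hypothetical poses: two corner radars yawed ±45°, offset from
# the vehicle origin; each detects one point 10 m straight ahead.
front_left  = to_vehicle_frame([(10.0, 0.0)], math.radians(45),  3.8,  0.9)
front_right = to_vehicle_frame([(10.0, 0.0)], math.radians(-45), 3.8, -0.9)

# Point-level splice: simple concatenation once frames agree.
fused_cloud = front_left + front_right
```

Getting the yaw and translation values right is exactly the calibration problem mentioned earlier; errors in the mounting pose show up directly as misaligned point clouds after splicing.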
The next part covers some challenges in the current radar perception field; we have summarized four points. The first is cost. Industry data show that the price of LiDAR sensors has dropped from hundreds of thousands of yuan, a very expensive instrument, to 3,000-5,000 yuan on mass-produced vehicles. Even so, on mass-produced vehicles it remains an iconic sensor for high-end models and cannot yet become standard equipment on every vehicle, because whole-vehicle cost is counted penny by penny, and we are still cautious about sensors that cost thousands of yuan. We estimate that in the next two to three years the price may drop to 1,000-2,000 yuan, at which point LiDAR can become a standard sensor for mid-range vehicles. This is the cost challenge.
Second, there are challenges in the number and layout of sensors. The price of sensors constrains their selection and count, and once we can deploy LiDAR sensors on mass-produced vehicles, where to place them becomes the second challenge. We have gone through many rounds of discussion with the styling and vehicle departments, and there have been ongoing disputes: for example, should the LiDAR go on the roof or in the front of the car? Each option has advantages and disadvantages. Placing it on the roof gains field of view, but cleaning, styling, and heat dissipation pose considerable challenges; placing it in the front bumper makes cleaning and heat dissipation easier, but the sensor is easily blocked and soiled, which is also a considerable challenge. Ultimately, we believe layout and selection must be derived from function realization: define what perception capabilities the target autonomous driving scenarios require, derive how far and how clearly we need to see, and from that derive the mounting locations. These are the deployment principles we adhere to.