Autonomous driving chips: the right one is the best one

Publisher: 喜茶我要七分糖 | Last updated: 2022-11-24 | Source: 车东西

The smart car industry has developed rapidly in the past year or two, with various smart driving and smart cockpit systems arriving in production cars. These intelligent systems need chips to provide computing power, so chip giants such as Qualcomm, AMD, and NVIDIA have become regulars at new car launches.


The trend in the smart driving chip market is very clear: new smart cars such as the Li Auto L9, NIO ET7, WM Motor M7, Zhiji L7, Feifan R7, and Xpeng G9 all use NVIDIA's Orin chips.


This near-monopoly has made Orin — an SoC whose main AI accelerator is a GPU — the star of the automotive chip market.


But when it comes to smart driving, especially urban L2, is NVIDIA really the only choice? How do Qualcomm's chips, which are hugely popular in smart cockpits, perform in smart driving?


Today, Che Dongxi will answer this question based on the current development trends of intelligent driving systems.


1. Urban L2 requires a perception-heavy solution


This year and next, intelligent driving development will focus on end-to-end high-end intelligent assisted driving in urban scenarios — what the industry generally calls urban L2 or urban NOA.


Compared with relatively closed highway and urban expressway scenarios, the complexity of urban scenarios has increased exponentially. There are not only more intersections and traffic lights, but also a large number of pedestrians, electric vehicles, bicycles and other traffic participants. There are also problems with trees and buildings blocking the view.


Urban L2 faces multiple challenges


At the same time, the high-precision maps that high-end intelligent assisted-driving systems rely on are far from covering most cities, and urban roads constantly change due to construction, diversions, and the like — so keeping high-precision maps up to date is itself a major challenge.


To solve these two difficulties, the most feasible approach in the intelligent driving industry is a "perception-heavy, map-light" solution. The typical example is the urban NOH system jointly created by Qualcomm, Weipai, and Haomo Zhixing.


The system was first installed on the lidar version of the Weipai Mocha DHT-PHEV. Note that users can use the urban NOH system as soon as they buy the car — it is not a promised future feature.


Mocha DHT-PHEV lidar version


Let's talk about the technology itself. To cope with complex urban scenes, the car is equipped with 2 125-line lidars, 4 8-megapixel high-definition cameras, 8 standard cameras, and 5 millimeter-wave radars — sensing hardware that far exceeds that of ordinary highway L2 models.


Mocha DHT-PHEV lidar version sensor layout


This urban NOH system, based on Qualcomm's Snapdragon Ride platform, abandons high-precision (HD) maps in favor of an enhanced version of standard-definition (SD) maps. Although the map accuracy is only at the decimeter level, the vehicle's self-positioning can be refined by matching perceived stationary objects against the map, achieving an effect close to that of an HD map.
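To make the idea concrete, here is a minimal, hypothetical sketch (not Haomo's actual algorithm) of how matching perceived stationary landmarks against their decimeter-level map positions can refine a coarse GNSS fix. All names, numbers, and coordinates are illustrative assumptions.

```python
import numpy as np

def refine_position(map_landmarks, observed_offsets):
    """Least-squares vehicle position from landmark matches.

    map_landmarks    -- (N, 2) landmark positions taken from the SD map, meters
    observed_offsets -- (N, 2) the same landmarks as perceived by the car,
                        expressed as offsets relative to the vehicle
    """
    # Each matched landmark "votes" for a vehicle position: the map position
    # minus the perceived offset. Averaging the votes is the least-squares
    # solution when only translation is being corrected.
    votes = np.asarray(map_landmarks) - np.asarray(observed_offsets)
    return votes.mean(axis=0)

# Toy demo: GNSS alone is ~1.7 m off; landmark matching recovers the pose.
true_pos = np.array([100.0, 50.0])
gnss = true_pos + np.array([1.5, -0.8])                 # coarse, biased fix
landmarks = np.array([[110.0, 55.0], [95.0, 60.0], [105.0, 45.0]])
offsets = landmarks - true_pos \
    + np.random.default_rng(1).normal(0.0, 0.05, (3, 2))  # noisy perception
refined = refine_position(landmarks, offsets)
print(np.linalg.norm(gnss - true_pos), np.linalg.norm(refined - true_pos))
```

A production system would also estimate heading and fuse the result over time with a filter, but the core trick is the same: static objects whose map positions are known act as anchors for the ego pose.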


As the introduction above makes clear, powerful sensing hardware and the SD-map approach only work with powerful perception algorithms — and powerful perception algorithms naturally require powerful smart-driving chips.


2. BEV is a key technology and the demand for computing power is surging


Whether it is the urban NOH solution jointly created by Qualcomm, Weipai, and Haomo Zhixing, or urban smart-driving systems such as Tesla's FSD and Li Auto's AD Max, all have chosen the BEV+Transformer technology path.


Simply put, the technique has three steps. First, the raw data from each camera is preprocessed to extract feature maps. Then the features from all cameras are fed into a Transformer-based multi-task model, which outputs a complete, unified perception result in the BEV (bird's-eye-view) frame, including lane lines, surrounding vehicles, traffic lights, and other targets.


Finally, based on the positions and trajectories of these targets, the vehicle's driving strategy is computed.
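The three steps above can be sketched in miniature. This is a toy illustration of the BEV+Transformer idea — random weights, numpy instead of a deep-learning framework, and tiny dimensions — not any vendor's actual model; all function names and sizes are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Step 1: per-camera feature extraction (a stub standing in for a CNN backbone).
def extract_features(image_tokens, w):
    # (tokens, C_in) flattened image patches -> (tokens, D) features
    return np.tanh(image_tokens @ w)

# Step 2: Transformer-style cross-attention — learnable BEV grid queries attend
# over the features of ALL cameras at once, producing one unified BEV map.
def bev_cross_attention(bev_queries, cam_features, wq, wk, wv):
    q = bev_queries @ wq                     # (num_cells, D)
    k = cam_features @ wk                    # (num_tokens, D)
    v = cam_features @ wv                    # (num_tokens, D)
    attn = softmax(q @ k.T / np.sqrt(q.shape[-1]), axis=-1)
    return attn @ v                          # (num_cells, D)

D, num_cells, num_cams, tokens_per_cam = 16, 32, 6, 20
imgs = rng.normal(size=(num_cams, tokens_per_cam, 8))
w_feat = rng.normal(size=(8, D))
cam_feats = np.concatenate([extract_features(im, w_feat) for im in imgs])

bev_q = rng.normal(size=(num_cells, D))
wq, wk, wv = (rng.normal(size=(D, D)) for _ in range(3))
bev = bev_cross_attention(bev_q, cam_feats, wq, wk, wv)

# Step 3: multi-task heads decode lanes / vehicles / traffic lights
# from the same shared BEV representation.
heads = {name: rng.normal(size=(D, 1)) for name in ("lane", "vehicle", "light")}
outputs = {name: (bev @ w).squeeze(-1) for name, w in heads.items()}
print({k: v.shape for k, v in outputs.items()})
```

The point of the structure is visible even at toy scale: every BEV cell sees every camera's features, so an object straddling two cameras is fused before detection, not after.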


Because the perception results are presented uniformly under the BEV's bird's-eye view, the relative positions of stationary and dynamic objects are very clear, making it easier to make better driving decisions.


With the traditional approach — each camera detecting targets independently, with the results fused afterwards — it is easy to miss or misdetect a vehicle that spans multiple cameras. This shows up as poor performance in close cut-in scenes: either the system fails to perceive the cutting-in car and hits it, or it perceives it only at the last moment and brakes hard. Either way, a terrible experience.


Judging from a road test video recently released by Qualcomm and the actual driving experience of Che Dongxi, the NOH system running on the Qualcomm Snapdragon Ride platform performs quite well.


On the one hand, it breaks through the scene restrictions of highways and urban expressways and can also be used on open urban roads. On the other hand, the system can handle complex urban scenarios such as waiting at red lights, passing intersections, yielding to pedestrians, navigating roundabouts, and unprotected left turns — truly delivering end-to-end high-end intelligent assisted driving in the city.


Urban NOH road test


With the lidar version of the Weipai Mocha DHT-PHEV, the urban NOH system jointly created by Qualcomm, Weipai, and Haomo Zhixing becomes one of the first mass-produced urban high-end intelligent assisted-driving systems in the world.


Back to the intelligent driving system itself. Although the technical architecture of BEV+Transformer is better than traditional smart driving solutions, it will obviously bring new challenges.


Compared with the previous single-camera + single-millimeter-wave-radar L2 solutions, the new architecture must not only process the raw data of every camera separately, but also run a large multi-parameter model such as a Transformer to produce multiple perception outputs.


The computing power requirement increases by hundreds of times. This is why the new smart cars released in the past year or two, without exception, emphasize that they carry on-board chips with hundreds or even thousands of TOPS of AI computing power.
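A back-of-envelope calculation shows where the jump comes from. Every number below is a made-up but plausible assumption for illustration — not a measured figure for any real system.

```python
# Hypothetical per-frame compute comparison: legacy single-camera L2
# vs. a multi-camera BEV+Transformer stack.

MAC_PER_PIXEL = 50                       # assumed backbone cost per input pixel

# Legacy: one 1.3-megapixel front camera through a detection backbone.
legacy_macs = 1.3e6 * MAC_PER_PIXEL

# New: assumed 4 x 8 MP + 8 x 2 MP cameras, each through a backbone,
# plus an assumed ~2 GMAC Transformer fusion model per frame.
pixels = 4 * 8e6 + 8 * 2e6
new_macs = pixels * MAC_PER_PIXEL + 2e9

ratio = new_macs / legacy_macs
print(f"~{ratio:.0f}x more compute per frame under these assumptions")
```

Under these assumptions the per-frame cost already grows by roughly two orders of magnitude, and higher frame rates, redundant models, and safety margins push the required TOPS further still.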


Some say that car companies are using computing power as a marketing gimmick, but that criticism misses the point. As explained above, once you understand the fundamental technical change underway in intelligent driving systems, you understand why, here, more computing power really is better.


3. The right smart driving chip is the best


This raises the question again: for a high-compute smart-driving chip, should we choose the popular NVIDIA Orin, or another platform such as Qualcomm Snapdragon Ride?


This has to be looked at from a technical point of view. Autonomous driving is a highly customized embedded system that integrates software and hardware. Only when software and hardware achieve the best fit can the best results be achieved at the system level.


The reality is that while every company's broad technical direction is the same, the implementation details differ greatly. For example, the neural networks that preprocess raw camera data and the Transformer models vary from company to company.


Having chosen the Qualcomm Snapdragon Ride computing platform, Weipai and Haomo Zhixing still built an excellent urban NOH system. It shows that choosing a smart-driving chip is like buying clothes: there is no need to chase the best-seller — the best one is the one that fits you.


The Qualcomm Snapdragon Ride computing platform consists of SoC chips, AI accelerators, a supporting tool chain, and a one-stop vision software stack.


The computing platform in the lidar version of the Weipai Mocha DHT-PHEV pairs a 5 nm Snapdragon Ride SoC with a 7 nm AI accelerator chip to power the intelligent driving system. The controller's total AI inference capability reaches 360 TOPS, surpassing both Tesla's FSD computer (144 TOPS) and a controller built around a single NVIDIA Orin SoC (254 TOPS).


These SoCs integrate multiple kinds of cores: high-performance CPUs for planning and decision-making, cutting-edge GPUs for high-end visualization and immersive user experiences, ISPs for the camera sensors, enhanced DSPs for sensor signal processing, and security processors. They accelerate a wide variety of AI neural networks and operators — including Transformer, scalar, vector, and matrix operations — which suits the perception-heavy technology path well.


Qualcomm Snapdragon Ride Flex SoC structure diagram


Qualcomm's Snapdragon Ride platform also has a trick up its sleeve.


If 360 TOPS is not the right fit, you can also choose freely from the Qualcomm Ride Platform's matrix of SoC and accelerator products and combine them into computing platforms of different capabilities, meeting development needs from ACC to HWA to urban NOH — or from L1 to L4 autonomous driving.
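The "mix and match to hit a compute target" idea can be sketched as a simple selection problem. The catalog below is hypothetical — the TOPS figures are invented for illustration (only the 360 TOPS total for the Mocha controller comes from the article), and the SKU names are made up.

```python
from itertools import product

# Hypothetical catalog of SoCs and AI accelerators with assumed TOPS ratings.
socs = {"ride_soc_small": 30, "ride_soc_large": 60}
accelerators = {"none": 0, "ai_accel_a": 100, "ai_accel_b": 300}

def cheapest_combo(target_tops):
    """Return the lowest-TOPS SoC+accelerator pairing that still meets the
    target — a stand-in for right-sizing the compute platform per use case."""
    candidates = [
        (s_tops + a_tops, soc, acc)
        for (soc, s_tops), (acc, a_tops) in product(socs.items(),
                                                    accelerators.items())
        if s_tops + a_tops >= target_tops
    ]
    return min(candidates) if candidates else None

print(cheapest_combo(10))    # ACC-class need: a small SoC alone suffices
print(cheapest_combo(360))   # urban-NOH class: big SoC + big accelerator
```

The design point is that one software stack spans the whole matrix, so an ACC-only model and an urban NOH model can share development work while paying only for the silicon they need.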


Qualcomm Snapdragon Ride Flex SoC can be flexibly combined with AI accelerators


This flexibility in choosing different SoCs and accelerators is currently hard to find anywhere else. It can effectively reduce R&D costs for car companies and Tier 1s, cut duplicated development work, and bring mass-production models to market faster.


Qualcomm, one of the world's largest technology and chip giants, has gone out of its way to help car companies and Tier 1s make good use of its Ride platform, providing developers with a rich set of supporting software, including safety middleware, operating systems, and driver software stacks.

