Can domestically produced chips for smart cars break through technological barriers?

Publisher: Blissful444 | Last updated: 2023-10-24 | Source: elecfans

03

Pre-embedding autonomous driving sensors and compute hardware has become the mainstream strategy among car companies.
Before 2015, driver-assistance functions were mainly L0/L1. At L1 the system automates either acceleration/deceleration or steering, while the driver retains overall lateral and longitudinal control of the vehicle. Representative functions include LKA and AEB.
In 2016 the industry entered the L2 era, in which vehicle speed and steering are automated simultaneously: in specific scenarios the system performs both lateral and longitudinal operations, but the driver must maintain control of the driving task at all times. Representative functions include ACC, LKA, and APA. Some ECUs began to be developed in an integrated fashion, though still without domain division. At present, the total per-vehicle value of L2-and-below systems is about 15,000 yuan.
Around 2020, the introduction period for L3 officially began. L3 is conditional autonomous driving that frees the hands: the driver need not monitor the system continuously, but must remain alert and intervene when necessary. The vehicle is divided into roughly 5-6 domains, controller computing power increases exponentially, and automotive Ethernet begins to appear. The total per-vehicle value of L3-and-below systems is about 25,000 yuan. However, because of immature L3 technology, regulatory issues, liability assignment, and high cost, L2-level ADAS will remain the main category capable of large-scale mass production in the short term.
In the future, L2+ assisted-driving systems will be popularized rapidly and persist for a long time. L4 autonomous driving has begun to be deployed in low-speed, closed scenarios such as mines and ports, but given laws, regulations, and technical maturity, robotaxis and robotrucks in high-speed, open scenarios are still a long way off. Dan Jizhang of Black Sesame Technologies has pointed out that truly breaking through from L2 to L3 is a long process: for a long time to come, intelligent connected vehicles will remain in a state of human-machine co-driving, which requires close coordination and upgrading of software, hardware, data, and other technologies. In particular, high-computing-power automotive-grade chips will be the key to breakthroughs in high-level autonomous driving.
At present, there are three mainstream autonomous driving SoC architecture solutions on the market: CPU+GPU+ASIC, CPU+ASIC, and CPU+FPGA. The trend is that customized, mass-produced, low-power, low-cost dedicated autonomous driving AI chips (ASICs) will gradually replace power-hungry GPUs.
NVIDIA's Xavier chip has four main modules, of which the GPU occupies the largest area, followed by the CPU, supplemented by two ASICs. Tesla's FSD chip architecture has three main modules, namely GPU, CPU, and NPU, of which the NPU is the focus of the architecture.
The CVP in Mobileye's EyeQ5 is an ASIC designed for Mobileye's own vision algorithms, which effectively reduces power consumption. Horizon Robotics has independently developed an ASIC based on its flexible BPU architecture.
Google's Waymo adopts a "CPU+FPGA" solution: its computing platform uses a 12-core or higher Intel Xeon CPU paired with Altera Arria-series FPGAs, and its I/O board uses Infineon AURIX-series MCUs as the CAN/FlexRay network communication interface. Once the autonomous driving algorithms are frozen, the FPGAs may be replaced by ASICs.
An ISP (Image Signal Processor) mainly processes the signal output by the front-end image sensor. Put simply, the ISP is the camera's "Photoshop", and its purpose is to improve image quality. In traditional autonomous driving solutions there is a one-to-one correspondence between ISP and camera: wherever there is a camera, there must be an ISP.
On the vehicle side, integrating the ISP into the SoC means an ISP no longer needs to be provided for each camera sensor, which greatly reduces the cost of perception hardware. On the camera side, removing the ISP not only solves the serious heat-dissipation problem of high-pixel cameras, but also helps vehicle-mounted cameras further shrink their circuit boards and reduce power consumption.
ISPs are integrated in both NVIDIA's Xavier and Black Sesame's A1000 chips. According to NVIDIA's official website, Xavier's built-in ISP can process 1.5 billion pixels per second, while the ISP integrated into the A1000 can process 1.2 billion pixels per second.
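These throughput figures set a hard budget on how many camera streams one SoC-integrated ISP can serve. A rough illustration of that budget (a sketch; the 8 MP / 30 fps camera configuration is an assumption for illustration, not a vendor specification):

```python
# Back-of-the-envelope ISP budget: how many high-resolution camera
# streams fit within a given on-chip ISP throughput.
def max_camera_streams(isp_pixels_per_sec, megapixels, fps):
    per_camera = megapixels * 1_000_000 * fps  # pixels/s per stream
    return isp_pixels_per_sec // per_camera

# Throughput figures quoted above (pixels per second)
XAVIER_ISP = 1_500_000_000   # NVIDIA Xavier built-in ISP
A1000_ISP = 1_200_000_000    # Black Sesame A1000 built-in ISP

# Assuming 8 MP cameras at 30 fps (illustrative assumption)
print(max_camera_streams(XAVIER_ISP, 8, 30))  # 6 streams
print(max_camera_streams(A1000_ISP, 8, 30))   # 5 streams
```

Under these assumptions, either chip's ISP could serve a typical multi-camera perception suite without per-camera ISP hardware.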
The computing-power ceiling of the on-board computing platform determines the upper limit of the software and service upgrades a vehicle can carry over its life cycle; the vehicle manufacturer closes the loop on its business model by charging C-end customers software licensing and OTA update fees. At present, mass-produced passenger cars are at L3 level or below overall, but intelligent-driving technology is still iterating and upgrading. To guarantee continuous software upgradability over the vehicle's whole life cycle, OEMs adopt a "hardware pre-installation, software upgrade" strategy, pre-installing high-computing-power chips to leave sufficient headroom for later software and algorithm optimization.
High-level autonomous driving places higher demands on camera resolution, and 8-megapixel cameras are expected to replace today's mainstream 1-2 megapixel solutions. Assuming a smart car is equipped with twelve 8-megapixel cameras running at 60 frames per second (FPS), the implied data input rate reaches 5.76 billion pixels per second.
With the addition of LiDAR point-cloud algorithms, the computing power of smart cars is expected to grow from today's L2+/L3 levels of more than 100 TOPS (tera operations per second, a measure of computing power) for the neural network processor (NPU) and 80K DMIPS (Dhrystone million instructions per second, a measure of CPU performance) to more than 1,000 TOPS and 500K DMIPS respectively by 2030.
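The sensor and compute figures above can be sanity-checked with quick arithmetic (a sketch using only the numbers quoted in this section):

```python
# Sensor data rate: 12 cameras x 8 MP x 60 fps
cameras, megapixels, fps = 12, 8, 60
pixel_rate = cameras * megapixels * 1_000_000 * fps
print(f"{pixel_rate / 1e9:.2f} Gpixels/s")  # 5.76 Gpixels/s

# Compute growth from today's L2+/L3 platforms to the 2030 projection
npu_growth = 1000 / 100   # TOPS: >100 today -> >1,000 in 2030
cpu_growth = 500 / 80     # K DMIPS: 80 today -> 500 in 2030
print(npu_growth, cpu_growth)  # 10.0 6.25
```

In other words, the projection implies roughly a 10x jump in NPU throughput and a 6x jump in CPU throughput within the decade.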
As the penetration rate of ADAS assisted-driving functions in the new-car market keeps rising, competition in intelligent driving between the new EV makers and leading independent-brand car companies is becoming increasingly fierce, and intelligent-driving sensor configurations are trending toward "involution". Models from NIO (Weilai), Xpeng (Xiaopeng), and Arcfox (Jihu) were the first to announce mass production of cars carrying LiDAR. In addition, camera-resolution requirements for high-level autonomous driving keep increasing, and the computing power of the corresponding autonomous driving chips continues to rise.
Autonomous driving datasets are crucial for training deep-learning models and improving algorithm reliability. SoC manufacturers have launched not only self-developed AI training chips but also cloud supercomputing platforms: Tesla has launched the D1 AI training chip and the "Dojo" supercomputing platform, which are used to train Tesla's autonomous driving neural networks. Training and algorithm-model products are also becoming increasingly important, covering 2D annotation, 3D point-cloud annotation, 2D/3D fusion annotation, semantic segmentation, object tracking, and so on; examples include NVIDIA's DRIVE Sim autonomous driving simulation platform and Horizon's "Eddie" data closed-loop training platform.
Tesla's Dojo supercomputing training platform uses Tesla's self-developed 7nm D1 AI training chip and relies on a large customer base to collect autonomous driving data for training its deep-learning models. According to official public information, the Dojo AI system adopts a distributed architecture: each Dojo node has its own CPU, memory, and communication interface, holds 1.25MB of SRAM (static random-access memory), and is connected into a 2D grid.
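To illustrate what connecting each node into a 2D grid means in practice, here is a minimal, hypothetical sketch of 2D-mesh neighbor addressing; it is illustrative only and does not reflect Tesla's actual interconnect logic:

```python
# Illustrative sketch (not Tesla's implementation): nodes in a 2D mesh
# are identified by (row, col) and exchange data with up to four
# immediate neighbours; edge and corner nodes have fewer links.
def mesh_neighbours(row, col, rows, cols):
    candidates = [(row - 1, col), (row + 1, col),
                  (row, col - 1), (row, col + 1)]
    return [(r, c) for r, c in candidates if 0 <= r < rows and 0 <= c < cols]

print(mesh_neighbours(0, 0, 4, 4))  # corner node: 2 neighbours
print(mesh_neighbours(1, 1, 4, 4))  # interior node: 4 neighbours
```

The appeal of such a topology for training workloads is that communication stays local: each node talks only to adjacent nodes rather than over a shared bus.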
Currently, Tesla Autopilot mainly uses 2D images plus annotations for training and algorithm iteration. With the Dojo supercomputing platform, Autopilot can be trained on 3D images plus timestamps (the "4D" Autopilot system), which adds predictive capability and labels the 3D motion trajectories of road objects, enhancing the reliability of autonomous driving functions.
NVIDIA has launched an autonomous driving simulation platform: DRIVE Sim is a simulation tool built on Omniverse that takes advantage of many of that platform's capabilities. Data generated by DRIVE Sim is used to train the deep neural networks that make up an autonomous vehicle's perception system. DRIVE Sim's sensor suite includes path-traced camera, radar, and lidar models that can capture real-world effects such as motion blur, LED flicker, rolling shutter, and the Doppler effect.
Horizon's Eddie platform: the Eddie AI development tool platform is an efficient "Software 2.0" training, testing, and management platform, including semi-/fully-automatic annotation tools, automated model training, long-tail scenario management, automatic software integration, and automated regression testing. Finally, the resulting models are deployed to the chip via OTA upgrades.
Huawei Octopus Autonomous Driving Open Platform: Octopus is an on-demand full-stack cloud platform that covers the entire life cycle of autonomous driving data, models, training, simulation, and annotation. It provides three major services to automakers and developers, including data services, training services, and simulation services.
(1) Data services: Process sensor data output from the vehicle hardware platform, and replay data in different formats such as radar and cameras; support PB-level massive storage, interactive big data query, and massive data governance.
(2) Training services: Manage and train autonomous driving models, continuously improve the accuracy of models on new data sets and test sets, and continuously improve the safety factor of autonomous driving. The platform provides software and hardware acceleration, which can significantly shorten training time and improve training efficiency.
(3) Simulation services: Provide application tools such as simulation, scenario library management, scenario clips, and evaluation systems to ensure that autonomous driving models are compliant, safe, measurable, and meet quality standards, and are quickly integrated into versions. The world's leading autonomous driving AI training chips include: Intel Ponte Vecchio, NVIDIA A100, Tesla D1, etc.
OTA technology was first used on PCs and later widely adopted in the mobile-phone industry; only in recent years has it spread to automobiles. OTA (over-the-air) technology downloads new software update packages from a remote server over the network to upgrade a system, covering both firmware and application upgrades, so as to meet terminal manufacturers' application-management needs and operators' requirements for managing networked terminals.
Through OTA, car companies can perform remote vehicle diagnostics and big-data applications, quickly repair system faults, and add new functions, so that even after a car has left the factory and entered service it can be upgraded remotely over the Internet for "function updates or vulnerability remediation".
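A minimal sketch of the client-side decision logic behind such an upgrade might look as follows; the function names and version scheme are illustrative assumptions, not any OEM's actual implementation:

```python
# Hypothetical OTA client sketch: decide whether an update is needed,
# verify the downloaded package's integrity, then hand off to install.
import hashlib

def needs_update(installed, latest):
    # Compare dotted version strings numerically, e.g. "1.9.2" < "1.10.0"
    parse = lambda v: tuple(int(x) for x in v.split("."))
    return parse(latest) > parse(installed)

def verify_package(package_bytes, expected_sha256):
    # Reject any package whose checksum does not match the server manifest
    return hashlib.sha256(package_bytes).hexdigest() == expected_sha256

if needs_update("2.3.1", "2.4.0"):
    pkg = b"...firmware image..."  # stand-in for the downloaded package
    expected = hashlib.sha256(pkg).hexdigest()  # from the server manifest
    print("install" if verify_package(pkg, expected) else "abort")
```

Real automotive OTA stacks add cryptographic signing, staged rollouts, and rollback on failure on top of this basic check-verify-install loop.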
As of June 2021, among ADAS algorithm upgrades, cruise-related items were the most numerous at 42, mainly covering ACC/ATC, active cruise, and speed assistance. Next came additions or optimizations of warning functions (collision warning, door-opening warning, lane-departure warning, etc.) with 23 items, and another 23 optimizations or additions to parking systems. There were 17 items related to object detection and recognition, mainly optimizations of road-object and animal recognition, traffic-sign recognition, and the like. In addition, the surround-view system and the lane-keeping system received 14 and 12 upgrades respectively.
OTA upgrades change the business model of the entire automotive industry. OEMs can push "algorithm update packages" to C-end customers and charge across a car's whole life cycle, rather than the "one-time deal" of the traditional car era. The traditional automotive industry long relied on manufacturing and selling new cars for profit; in the smart-car era, OTA revenue scales as "software upgrade price × car ownership".
▲Some OEM upgrades and charging situations
With the advance of automotive electrification and intelligence and the rising penetration of autonomous driving, the autonomous driving chip industry will maintain a relatively high growth rate. China's autonomous driving chip market is estimated to reach 13.8 billion yuan in 2025 and 28.9 billion yuan in 2030, a ten-year compound annual growth rate of 25.1%.
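The growth rates implied by this forecast can be checked directly (a sketch; the 2020 base is inferred from the quoted 25.1% ten-year CAGR, which is not stated explicitly in the text):

```python
# Check the implied growth rates behind the market-size forecast above.
def cagr(start, end, years):
    return (end / start) ** (1 / years) - 1

# 2025 -> 2030: 13.8bn -> 28.9bn yuan
print(f"{cagr(13.8, 28.9, 5):.1%}")  # 15.9% per year over 2025-2030

# The quoted 25.1% ten-year CAGR implies a ~3bn yuan base in 2020
base_2020 = 28.9 / (1 + 0.251) ** 10
print(f"{base_2020:.1f} bn yuan")  # 3.1 bn yuan
```

So the forecast embeds a faster-growing early period (2020-2025) tapering to roughly 16% annual growth in the back half of the decade.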
