Why is DPU called the "third main chip" in the data center?

Publisher: EE小广播 | Last updated: 2022-10-17 | Source: EEWORLD



From October 2021 to the present, the Institute of Computing Technology of the Chinese Academy of Sciences has successively released the "Dedicated Data Processor (DPU) Technology White Paper" and the "Dedicated Data Processor (DPU) Performance Benchmark Evaluation Method and Implementation" for the whole industry. Together, the two white papers give a fairly comprehensive account of why the DPU is regarded as a disruptive technology for data center development.


The DPU performance benchmark white paper explains how DPUs are evaluated, and how that evaluation is implemented, across the four fields of networking, storage, computing and information security. These are the four areas where the DPU releases its potential, and they are also the key links in a data center. For this reason the DPU, CPU and GPU are called the three major drivers of the computing power economy era, and the DPU has become the "third main chip" of the data center field. Beyond the data center, intelligent driving, data communications and network security are also target application areas for the DPU. According to estimates by CCIDnet, the global DPU market will exceed US$10 billion starting in 2023, with annual growth rates above 50% thereafter.


[Figure omitted. Image source: CCID.com]


This is why NVIDIA CEO Jensen Huang, when launching the company's first DPU product, emphasized that "this is just the beginning." Today it is not only industry giants such as NVIDIA that are watching the DPU track closely; startups such as Inspur Microelectronics have also stood out, demonstrating notable technical strength, and their products have won recognition in the industry.


What is DPU?


If you want to know why DPU is hot, you must first understand what DPU is.


DPU stands for Data Processing Unit; some practitioners also call it the data center processor. The DPU is data-centric, and its core function is to accelerate data processing, taking over the workloads that the CPU handles inefficiently and that the GPU cannot handle.


According to the DPU technology white paper, the data the DPU processes differs both from the "signals" handled by traditional DSPs, baseband processors and similar chips, and from the graphics and image data handled by GPUs. The DPU processes digital information that sits closer to the underlying infrastructure, especially time-series and structured data such as large structured tables, packets in network flows, and massive volumes of text.


[Figure: DPU architecture reference design. Image source: DPU Technology White Paper]


DPU has great potential


It is easy to see that the DPU was proposed to break the bottlenecks of data center development, and that it is the biggest beneficiary of heterogeneous computing in data center servers. With the data center market booming, the DPU has great potential in this field.


Take the Chinese market as an example. The first wave of data center construction kicked off in 2014-2015, mainly serving the Internet, which was growing rapidly at the time, and many data exchanges emerged in China during this period. This first wave was centered on computing, but the CPU, as the main computing platform, soon hit a bottleneck: it typically lost 20%-30% of its computing resources to resource management, which left little room for improving effective computing power.


The second wave of data centers began in 2020, the year people now call the "first year of the DPU." The most obvious change is that data centers no longer emphasize the concepts of computing centers and computing power centers, but return to being data-centric. As the types of clouds and application algorithms become more and more diverse, improving data processing efficiency remains the main goal, but offloading the CPU to strengthen system scheduling is also a clear direction.


The shift from "computing-centric" to "data-centric" is a huge conceptual shift, and DPU stands out from many coprocessors. As a dedicated engine for data processing, the ever-increasing types of DPU products in the future will allow the CPU to return to its own areas of expertise - coordinating the overall situation and task scheduling, releasing the CPU's computing power to upper-level applications, and the DPU will become a new system gateway, storage entry, algorithm acceleration sandbox, and security engine.


As can be seen from the figure below, DPU is ubiquitous in the data-centric data center framework.


[Figure: traditional cloud computing infrastructure versus a data-centric data center architecture. Image source: DPU Technology White Paper]


Of course, the DPU must overcome some challenges before it can realize its potential in the data center. At the hardware level, the range of DPU products needs to be rich enough: as noted above, the CPU must be offloaded across the four dimensions of networking, storage, computing and information security. At the same time, DPU products from different vendors need to follow common standards and show a total-cost-of-ownership advantage in the overall solution. At the software level, today's CPU and GPU applications in data centers already rely on standard software frameworks that decouple software from hardware; building the same kind of framework around the DPU will be a long-term and challenging task.


As mentioned above, the DPU's end applications are not limited to data centers. The DPU also has broad prospects in intelligent driving, data communications, network security and other fields.


For example, in communications, DPU-based network function virtualization can be applied to the 5G edge-computing UPF, strengthening the high-speed, high-bandwidth capabilities of 5G in vertical industries. In network security, the DPU can help build a hardware root of trust and implement network virtualization in a fully offloaded, isolated way, separated from the host operating system so that users are protected from hacker attacks. In autonomous driving, the virtualization functions the DPU provides can help developers realize the "software-defined car."


Domestic DPU helps China's "new infrastructure"


The DPU has opened up a major industrialization trend. All the fields covered by China's "new infrastructure" strategy will be the first to benefit and will accelerate their digital transformation. At the same time, "new infrastructure" and the "Eastern Data, Western Computing" project will bring strong demand for data and computing power. Combined with the country's broad layout in industries such as 5G and autonomous driving, the prospects for domestic DPUs are bright.


Seizing this opportunity for Chinese chips in the DPU industry, Jingxin Microelectronics Technology (Tianjin) Co., Ltd. (hereinafter "Jingxin Micro"), though still a young startup, has reached a domestically leading level in DPU technology R&D and in the pace of product commercialization.




Jingxin Micro was founded in the Tianjin Economic and Technological Development Zone in 2020. It focuses on innovative tracks such as new computing, new networking and new security, bringing leading DPU products and technical services to target customers in key industries including party and government, the military, finance, energy, transportation, electric power and telecommunications.




According to the DPU technology white paper, the DPU architecture has three core components: the control plane, the I/O subsystem and the data plane. For the I/O subsystem, Jingxin Micro provides original technologies such as RapidIO embedded system interconnect and software-defined interconnect switching, and has developed system I/O products including the RapidIO switch chip NRS1800, the software-defined interconnect switch chip SDI3210, the intrinsically secure switch chip ESW5610 and the bridge chip PRB0400.


On the data plane, Jingxin Micro has released Jingxin 2820, a network processing smart chip with an integrated AI engine. It mainly targets cloud computing and data center networking and the associated task offloading, and provides intelligent network monitoring and management. Network packet processing was the first scenario in which the DPU was used, offloading the CPU through a "network + computing power" model, which is why the DPU is often regarded as an upgraded SmartNIC (smart network card); its potential, however, goes far beyond that.


Jingxin 2820 implements packet parsing from Layer 2 through Layer 7 and provides acceleration for key applications such as cloud computing and the industrial Internet. The chip can also be upgraded and extended, allowing it to keep up with rapidly changing big-data processing and latency-sensitive workloads, and it supports rapid deployment.
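
To make "Layer 2 through Layer 7 parsing" concrete, the sketch below walks a raw Ethernet frame up the protocol stack in plain Python. It is purely illustrative of the logical steps that a DPU performs in hardware pipelines; it is not Jingxin 2820 code, and limiting the example to IPv4 and TCP is an assumption made to keep it short.

```python
# Illustrative sketch of L2-L7 packet parsing; not vendor firmware.
import struct

def parse_frame(frame: bytes) -> dict:
    info = {}
    # Layer 2: Ethernet header (destination MAC, source MAC, EtherType)
    dst_mac, src_mac, ethertype = struct.unpack("!6s6sH", frame[:14])
    info["src_mac"] = src_mac.hex(":")
    info["ethertype"] = hex(ethertype)
    if ethertype != 0x0800:              # this sketch only follows IPv4
        return info
    # Layer 3: IPv4 header (header length, protocol, addresses)
    ip = frame[14:]
    ihl = (ip[0] & 0x0F) * 4
    info["proto"] = ip[9]
    info["src_ip"] = ".".join(str(b) for b in ip[12:16])
    info["dst_ip"] = ".".join(str(b) for b in ip[16:20])
    if info["proto"] != 6:               # this sketch only follows TCP
        return info
    # Layer 4: TCP header (ports, data offset)
    tcp = ip[ihl:]
    info["src_port"], info["dst_port"] = struct.unpack("!HH", tcp[:4])
    data_offset = (tcp[12] >> 4) * 4
    # Layers 5-7: whatever remains is the application payload (e.g. HTTP),
    # which a DPU would hand off to protocol-specific accelerators.
    info["payload_len"] = len(tcp) - data_offset
    return info
```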


Jingxin 2820 supports three major application scenarios:


Standard NIC mode: in this mode the chip works as a standard NIC and can be configured as a 2x100GE, 4x10GE, 2x25GE or 2x10GE NIC, with support for up to a PCIe 4.0 x16 host interface and SR-IOV data transfer for multiple virtual machines. Software accesses the network interface card through the standard Linux NIC driver.
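
As a rough illustration of what this mode means from the host side, the snippet below uses the stock Linux SR-IOV sysfs interface to create virtual functions on a PCIe NIC so they can be passed to virtual machines. The interface name "eth0" and the VF count are placeholders, and nothing here is specific to the Jingxin 2820; it only shows that a card in standard NIC mode is driven like any ordinary SR-IOV-capable NIC.

```python
# Minimal sketch, assuming a Linux host where the DPU enumerates as an
# ordinary PCIe NIC. "eth0" is a placeholder interface name.
from pathlib import Path

def enable_sriov_vfs(iface: str, num_vfs: int) -> None:
    dev = Path(f"/sys/class/net/{iface}/device")
    total = int((dev / "sriov_totalvfs").read_text())   # VFs the card advertises
    if num_vfs > total:
        raise ValueError(f"{iface} supports at most {total} VFs")
    # Writing to sriov_numvfs asks the PF driver to spawn that many VFs,
    # each of which can then be passed through to a virtual machine.
    (dev / "sriov_numvfs").write_text(str(num_vfs))

if __name__ == "__main__":
    enable_sriov_vfs("eth0", 4)   # requires root and SR-IOV-capable hardware
```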


Smart NIC mode: in this mode the CPU and AI engine take part in data processing, and the chip works in an intelligent offload mode. Packets travel along two paths. On the control path, protocol and control packets are forwarded to the CPU cores, which run the control planes of protocols such as RDMA and OVS. On the data path, the chip handles packet encapsulation/decapsulation and accelerates data processing. The AI engine extracts and analyzes incoming data in real time, detects whether the traffic shows malicious attack behavior, and reports and raises alarms.
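
The control-path/data-path split and the role of the AI engine can be pictured with the toy dispatcher below. Every name in it (Packet, CONTROL_PROTOS, looks_malicious, dispatch) is invented for illustration; it stands in for the chip's real hardware pipelines and SDK, and the "AI engine" is reduced to a trivial heuristic.

```python
# Hypothetical sketch of the smart-NIC packet split described above; not vendor code.
from dataclasses import dataclass

CONTROL_PROTOS = {"ARP", "ICMP", "BGP", "RDMA_CM"}   # assumed control-plane protocols

@dataclass
class Packet:
    proto: str
    payload: bytes

def looks_malicious(pkt: Packet) -> bool:
    # Placeholder for the AI engine: a trivial size heuristic stands in for
    # real traffic classification.
    return len(pkt.payload) > 9000

def dispatch(pkt: Packet) -> str:
    if looks_malicious(pkt):
        return "alert: reported as suspicious traffic"
    if pkt.proto in CONTROL_PROTOS:
        return "control path: forwarded to CPU core for protocol handling"
    # Data path: encapsulation/decapsulation and acceleration happen in hardware.
    return "data path: encap/decap and acceleration offloaded"

if __name__ == "__main__":
    print(dispatch(Packet("BGP", b"open")))         # control path
    print(dispatch(Packet("VXLAN", b"\x00" * 64)))  # data path
```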
