Challenging Nvidia, AMD is becoming an AI chip company
Latest update time: 2024-09-01
Image source | Stills from Douban Movie "Student Zhu Lost His Superpowers in the Third Grade"
Author | Zhang Lianyi
Source | Cybertruck
It doubled in one year.
At the end of July, chip giant AMD (Advanced Micro Devices) released its second-quarter 2024 financial report. Its data center business grew rapidly, with second-quarter revenue of US$2.8 billion, up 115% year-on-year, a record high that accounted for nearly 48% of total revenue.
This is mainly due to one chip: AMD Instinct MI300.
At the end of 2023, AMD released its new generation of dedicated AI/HPC accelerators, the Instinct MI300 series, including the pure-GPU MI300X and the CPU+GPU fusion MI300A, which compete head-on with Nvidia's H100 series.
Since its launch, MI300 sales have grown rapidly thanks to the surge in demand for AI computing power.
Its revenue in the second quarter of this year exceeded US$1 billion, making it the fastest-growing product in AMD's history.
According to AMD Chair and CEO Lisa Su, more than 100 enterprise and AI customers are actively deploying the MI300X. Many OEMs, including Dell, HPE, and Lenovo, are putting MI300X systems into mass production, and cloud service providers including Microsoft and Oracle are expanding their adoption and deployment of MI300X chips.
She expects AMD's AI chip revenue to exceed US$4.5 billion in 2024, up from an April estimate of US$4 billion.
Before the earnings report, AMD's stock had been sliding, approaching a seven-month low. After the report's release, the stock rose more than 7% in after-hours trading and even led a rebound in chip stocks.
Data center revenue of the three major companies (unit: US$100 million). Data source: Caitong Securities
AMD's data center business is still far behind Nvidia's, with the former generating $2.8 billion in a quarter and the latter generating $22.6 billion.
But investors seem to believe that AMD is steadily getting stronger and can grab more market share from Nvidia. In the AI era, the business of "selling shovels" is not Nvidia's alone; it is AMD's too.
01
MI300 chip grabs market share with its cost-effectiveness
There is no doubt that Nvidia is the dominant player in the global data-center AI chip market.
According to a study by semiconductor analysis firm TechInsights, Nvidia shipped approximately 3.76 million data center GPUs in 2023, a 97.7% market share, similar to its share in 2022. AMD and Intel ranked second and third with shipments of 500,000 and 400,000 units, for market shares of 1.3% and 1% respectively.
Data center GPU market share of the three major companies. Data source: Caitong Securities
It is obviously difficult to shake Nvidia's position.
For AMD, the top priority is not to surpass Nvidia, but to win more orders.
On the one hand, as one of Nvidia's few competitors, AMD has long been regarded by the technology giants as "another option."
Technology giants want to break Nvidia's monopoly and diversify their hardware supply, which supports AMD's development to a certain extent; Microsoft is a representative example. Years ago, Microsoft built MI50 and MI100 clusters, using ROCm on AMD GPUs to optimize large-model training and inference.
Lisa Su noted on the conference call that Microsoft has expanded its use of MI300 chips to power GPT-4 Turbo and multiple Copilot services such as Microsoft Word and Teams. Hugging Face was one of the first customers to adopt the new Microsoft Azure instances, which let enterprise and AI customers deploy hundreds of thousands of models on MI300X GPUs with one click.
On the other hand, AMD's products are also more cost-effective.
A hallmark of Nvidia's products is a steep premium, enabled by its technology lead and high market share. Reuters once reported that the Nvidia H100 costs about $3,320 to make and sells for around $30,000, a margin of up to 1,000%.
According to a Citibank report, the H100's price is more than four times that of the MI300X.
But judging by the spec sheet, the MI300 series has surpassed the H100.
Comparison of mainstream chips in the market
In terms of compute, the MI300A delivers 61 TFLOPS of FP64 and 122 TFLOPS of FP32 performance, which AMD says is four times and twice the H100's, respectively.
In terms of memory, the MI300A carries 128GB of HBM3 and the MI300X a larger 192GB, versus the H100's smaller capacity. More memory improves the efficiency of model training and inference.
In terms of memory bandwidth, the MI300A and MI300X offer 5.3TB/s and 5.2TB/s respectively, well above the H100's. Higher bandwidth speeds up data movement and improves compute efficiency.
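As a rough cross-check of the figures above, the capacity and bandwidth gaps can be computed directly. In this sketch, the MI300 numbers come from the article, while the H100 entries (80 GB of HBM3, roughly 3.35 TB/s) are the publicly reported SXM5 specs and are an assumption on my part, not stated in this article:

```python
# Spec ratios for the chips discussed above.
# MI300A/MI300X figures are from the article; the H100 entries
# are the publicly reported SXM5 specs (assumed, not from the article).

specs = {
    "MI300A": {"hbm_gb": 128, "bw_tbs": 5.3},
    "MI300X": {"hbm_gb": 192, "bw_tbs": 5.2},
    "H100":   {"hbm_gb": 80,  "bw_tbs": 3.35},  # assumption
}

h100 = specs["H100"]
for name in ("MI300A", "MI300X"):
    chip = specs[name]
    mem_ratio = chip["hbm_gb"] / h100["hbm_gb"]
    bw_ratio = chip["bw_tbs"] / h100["bw_tbs"]
    print(f"{name}: {mem_ratio:.1f}x H100 memory, {bw_ratio:.1f}x H100 bandwidth")
# MI300A: 1.6x H100 memory, 1.6x H100 bandwidth
# MI300X: 2.4x H100 memory, 1.6x H100 bandwidth
```

On these assumed baselines, the MI300X's headline advantage is capacity (about 2.4x) more than bandwidth (about 1.6x).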
In specific test scenarios, according to AMD, the MI300 series also performs well. For example, in a one-to-one comparison, the MI300 runs the Llama 2 70B model 20% faster than the H100, and is 20% faster on FlashAttention 2; in training performance, the MI300X is comparable to the H100.
Judging from the above data, AMD can be expected to capture a share of the AI chip market on the strength of its cost-effectiveness.
On this basis, Lisa Su raised her forecast for AMD's 2024 AI chip revenue to more than US$4.5 billion, up from the US$4 billion estimated in April.
02
AMD also plans to release new AI chips every year
It should be noted that NVIDIA is still coming up with better products.
At his GTC keynote in March this year, Nvidia founder Jensen Huang announced the next-generation AI chip architecture, Blackwell.
Blackwell Performance Parameters
He said that Blackwell has 208 billion transistors, more than twice the 80 billion transistors of the previous generation chip "Hopper", and can support AI models with up to 10 trillion parameters. "It will become the cornerstone of new computers and other products deployed by the world's largest data center operators such as Amazon, Microsoft, Google, and Oracle."
The first chip built on the Blackwell architecture is called the GB200, which Jensen Huang called "the most powerful AI chip in history".
Compared with the previous-generation Hopper H100, it significantly improves performance while sharply cutting energy consumption and cost. According to the plan, initial production begins in the third fiscal quarter of this year, with large-scale shipments in the fourth fiscal quarter.
If AMD wants to further expand its market share, it must count on the MI350 to withstand the pressure from Nvidia's Blackwell series and the waves that follow.
Judging from Lisa Su's statements and AMD's recent moves, the company is responding actively.
Like Nvidia, AMD also plans to release new AI chips every year.
In June this year, AMD announced its product roadmap, planning to launch the MI325X in the fourth quarter of this year and the MI350 and MI400 series over the following two years. The MI300X and MI325X use the CDNA3 architecture, the MI350 will use CDNA4, and the MI400 a next-generation CDNA architecture.
AMD will launch new product lines every year in the future, a pace that is in line with Nvidia's release plan.
AMD reiterated this plan at this earnings conference.
In June this year, Lisa Su announced new progress on AMD Instinct products: "MI325 will launch later this year, and the MI350 series next year. Just like Nvidia's Blackwell architecture, we are also on a CDNA roadmap. I still think the market needs more compute." She also stressed that the MI350 will be "extremely competitive" with Blackwell.
At the same time, AMD joined technology giants including Google, Meta, Microsoft, Intel, Broadcom, Cisco, and HPE to announce a new alliance promoting an industry standard, Ultra Accelerator Link (UALink), aimed at breaking the barrier of NVLink, Nvidia's proprietary interconnect bus and communication protocol.
In addition, AMD has accelerated its AI push through investment. In July, AMD paid $665 million to acquire Silo AI, Europe's largest private artificial intelligence lab, which provides end-to-end AI-driven solutions. The acquisition is seen as an important step in AMD's pursuit of Nvidia.
Lisa Su said on the conference call that, beyond the Silo AI acquisition, AMD has invested more than $125 million across a dozen-plus AI companies over the past 12 months to expand the AMD ecosystem, and that it will continue to invest in software.
With this combination of moves, Su is confident about AMD's data center business. She predicts the AI chip market could reach $400 billion by 2027, far above the estimate of Gartner, the global IT research and consulting firm, which expects the market to grow 33% in 2024 and approach $100 billion within the next two years.
Global AI chip market size (billion US dollars). Data source: Gartner
Obviously, Lisa Su is very confident in the market, and even more so in AMD itself.
After all, unlike Nvidia, whose CPU presence is small, AMD has the industry's most comprehensive CPU+GPU+FPGA+DPU data center portfolio, covering every AI computing scenario.
To date, AMD's data center lineup spans EPYC server processors, Instinct GPU accelerators, Xilinx data center FPGAs and adaptive SoCs, and Pensando DPUs.
Among them, Ryzen and EPYC CPUs handle training and inference for small to medium models; EPYC CPUs, Radeon GPUs, and Versal chips with AI engines cover medium to large models; and Instinct GPUs and Xilinx adaptive chips cover training and inference for the largest models.
According to Bloomberg research, the generative AI market is expected to grow at a 42% compound annual rate, driven in the near term by AI training and shifting over the medium to long term toward large-language-model inference, digital advertising, and professional software and services.
Therefore, as mid- to long-term demand shifts, AMD's broad product portfolio may let it seize more growth opportunities and compete with Nvidia on differentiation in the data center.
Lisa Su also believes that data center workloads are becoming increasingly specialized, and that AMD's broad data center portfolio can match the right compute to the right workload.
03
Strong second-quarter results, with an even better third quarter ahead
Beyond the data center, AMD has three other segments: client, gaming, and embedded, with products spanning processors, graphics cards, FPGAs, and more.
Among them, the client business covers notebook, desktop, and workstation CPUs and APUs; the gaming business covers Radeon desktop and notebook GPUs and semi-custom game-console SoCs; and the embedded business covers Ryzen and EPYC embedded processors plus Xilinx FPGAs and adaptive SoCs.
AMD's revenue structure continues to change
Of the four product lines, client and gaming were once AMD's main revenue sources, at one point exceeding 85% of revenue and still 75% in 2021.
However, with the Xilinx acquisition and the growth of data centers, AMD's revenue mix has shifted. The data center business is now AMD's largest segment, with second-quarter revenue of US$2.8 billion, about half of company revenue.
AMD has repeatedly emphasized that the AI business is the company's top strategic priority, and that the next goal is to further raise the data center share of revenue.
According to the second-quarter 2024 report, AMD's revenue was US$5.835 billion, up 9% year-on-year from US$5.359 billion in the same period last year.
Among them, client revenue was US$1.5 billion, up 49% year-on-year and 9% quarter-on-quarter, driven mainly by sales of AMD Ryzen processors, and accounted for 25.7% of total revenue.
The gaming segment's revenue was $648 million, down 59% year-on-year and 30% quarter-on-quarter, mainly on lower semi-custom revenue; it accounted for 10.3% of total revenue.
Embedded segment revenue was $861 million, down 41% year-over-year as customers continued to normalize their inventory levels, and accounted for 14.8% of total revenue.
In terms of profit, the report shows AMD's net profit was US$265 million, up 881% year-on-year and 115% quarter-on-quarter; earnings per share were US$0.16, up 700% from US$0.02 in the same period last year and up 129% from US$0.07 in the previous quarter.
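The growth percentages above are plain percent-change arithmetic. A minimal sketch reproducing them (note that the roughly $27 million year-ago net profit is implied by the stated 881% increase, not quoted directly in the article):

```python
# Reproducing the reported growth figures with percent-change arithmetic.
# Values are in USD; the ~$27M year-ago net profit is implied by the
# stated 881% increase rather than quoted in the article.

def pct_change(new, old):
    """Percent change from old to new, in percent."""
    return (new - old) / old * 100

print(round(pct_change(0.16, 0.02)))  # EPS vs a year ago -> 700
print(round(pct_change(0.16, 0.07)))  # EPS vs last quarter -> 129
print(round(pct_change(265, 27)))     # net profit vs a year ago -> 881
```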
AMD Financial Data Information
Excluding certain one-time items, AMD's adjusted net profit in the second quarter was US$1.126 billion, up 19% year-on-year and 11% quarter-on-quarter; adjusted earnings per share were US$0.69, also up 19% year-on-year and 11% quarter-on-quarter.
In terms of operating expenses, AMD's second-quarter operating expenses were US$2.605 billion, up 5% year-on-year and 3% quarter-on-quarter. Of that, R&D expenses were US$1.583 billion, up from US$1.443 billion in the same period last year and US$1.525 billion in the previous quarter.
Operating profit was $269 million, versus an operating loss of $20 million in the same period last year. Excluding certain one-time items, AMD's adjusted operating profit in the second quarter was $1.264 billion, up 18% year-on-year.
In terms of gross profit, AMD's second-quarter gross profit was $2.864 billion, up 17% year-on-year and 12% quarter-on-quarter. Excluding certain one-time items, adjusted gross profit was $3.101 billion, up 16% year-on-year and 8% quarter-on-quarter.
Overall, AMD's performance slightly exceeded analysts' prior expectations.
"The rapid development of generative AI is driving demand for more compute in every market, creating tremendous growth opportunities for us to deliver leading AI solutions across our business." Accordingly, Lisa Su expects AMD's results to improve in the second half of the year.
For the third quarter, AMD expects revenue of about $6.7 billion, plus or minus $300 million, i.e., between $6.4 billion and $7.0 billion, with a non-GAAP gross margin of approximately 53.5%.
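The guidance band is simply the midpoint plus or minus the stated fluctuation, which a one-line check confirms:

```python
# Q3 revenue guidance: $6.7B midpoint with a +/- $300M band (article's figures).
midpoint, band = 6.7, 0.3  # billions of USD
low, high = midpoint - band, midpoint + band
print(f"${low:.1f}B to ${high:.1f}B")  # -> $6.4B to $7.0B
```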
-END-
The content of this article is for communication and learning purposes only and does not constitute any investment advice. Some pictures are from the Internet and the copyright ownership has not been verified. It is not for commercial use. If there is any infringement, please contact us at info@gsi24.com.