GPU sales have exploded, far outstripping CPUs at 50 percent of server revenue
To be blunt, we think that is crazy enough.
If AMD CEO Lisa Su's predictions at this week's Advancing AI event in San Jose are right, we are all going to have to go back and rework our AI models. Because what Su said is that the data center GPU market has reached a tipping point: it has gone through a fission explosion and is now flooding the deuterium-tritium blanket of the bomb with neutrons, and enormous economic pressure will set off the secondary fusion reaction. (With luck, all of this AI will solve the critical problem of generating electricity from nuclear fusion, which will be needed to power all of the AI systems being installed. Not only would that save the planet in many ways, it would also help us reverse the tremendous damage we have already done. And we can make money during the repair process.)
A year ago, when Su and her team first hinted at what the MI300 series of GPUs might look like, the company looked at all of its market research and its own product line and believed that the total addressable market for data center GPUs could be in the $30 billion range in 2023, growing at a compound annual growth rate of about 50 percent to more than $150 billion by 2027.
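For readers who want to check that arithmetic, here is a minimal sketch of the compound-growth math implied by those figures. The $30 billion base, the roughly 50 percent growth rate, and the 2023-to-2027 window come from the paragraph above; the rounding is ours.

```python
# Back-of-envelope check of AMD's original data center GPU TAM forecast:
# a ~$30 billion market in 2023 compounding at ~50 percent per year through 2027.
base_tam_2023 = 30.0   # billions of US dollars (AMD's original 2023 estimate)
cagr = 0.50            # ~50 percent compound annual growth rate
years = 2027 - 2023    # four years of compounding

tam_2027 = base_tam_2023 * (1 + cagr) ** years
print(f"Implied 2027 TAM: ${tam_2027:.0f} billion")  # ~$152 billion, i.e. "more than $150 billion"
```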
"That felt like a big number," Su said during her keynote, "and given what we knew then and now about overall global data center hardware revenue from the likes of IDC and Gartner, we agree that's true."
"However, when we look back at what has happened over the past 12 months and the rates and speed of adoption we've seen across industries, customers and globally, it's clear that demand is growing much, much faster. ," Su continued. "So if you look at now, the AI-enabled infrastructure -- of course, it's starting in the cloud, but it's also going to be in the enterprise. We believe that across the embedded market and in personal computing, we're going to see a lot of artificial intelligence Smart. We now expect data center accelerator TAM to grow over 70% annually over the next four years, to over $400 billion as an industry, and I have to say, does that sound exciting? For someone like me who's been in this industry for a while, this pace of innovation is faster than anything I've seen before.
Now, this is only the TAM for the data center part of the AI accelerator business and does not include the TAM for edge and client AI hardware accelerators. AMD did not say what it thinks the TAM for the broader chip market will be. A few months ago, IDC predicted that the entire server business would be worth just under $200 billion by 2027.
Working backwards, the GPUs in an AI server account for about 53 percent of the overall price, or roughly $200,000 of the $375,000 purchase price of an eight-way Nvidia HGX GPU compute complex. If IDC's guesses about the server market and the split between AI and non-AI servers are correct, then AI accelerator hardware should only drive about $50 billion in GPU sales in 2027. Clearly, someone needs to revise their estimates of how this will all play out, and it is not entirely clear what AMD means by its "data center AI accelerator." It definitely covers GPUs and NNPs, but it could also include some portion of CPU sales.
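Here is a hedged sketch of that back-of-envelope math. The $375,000 system price, the roughly 53 percent GPU share, and IDC's roughly $200 billion 2027 server forecast come from the text above; the 50/50 split between AI and non-AI server revenue is an illustrative assumption of ours, not an IDC figure.

```python
# GPU share of an AI server price, and the implied 2027 accelerator revenue under IDC's forecast.
hgx_server_price = 375_000        # eight-way Nvidia HGX system price cited above, in USD
gpu_share_of_price = 0.53         # GPU content as a fraction of the system price (from the text)
gpu_cost = hgx_server_price * gpu_share_of_price
print(f"GPU content per server: ~${gpu_cost:,.0f}")  # ~$198,750, i.e. about $200,000

idc_server_market_2027 = 200.0    # IDC's ~$200 billion total server forecast for 2027, billions USD
ai_server_share = 0.50            # ASSUMPTION (ours, not IDC's): AI servers as half of server revenue
accelerator_revenue_2027 = idc_server_market_2027 * ai_server_share * gpu_share_of_price
print(f"Implied 2027 accelerator revenue: ~${accelerator_revenue_2027:.0f} billion")  # ~$53 billion
```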
Regardless, what AMD is saying is that the growth is coming sooner than expected and that accelerator revenues will be 9X higher than we were estimating only a few months ago.
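As a rough consistency check on that multiple: the exact baseline behind the 9X figure is not spelled out in this excerpt, so the comparison below simply uses the roughly $50 billion derived above and should be read as approximate.

```python
# Rough ratio of AMD's revised 2027 accelerator TAM to the IDC-derived figure above.
amd_tam_2027 = 400.0       # AMD's revised 2027 data center accelerator TAM, in billions USD
idc_derived_2027 = 50.0    # the ~$50 billion implied by the IDC-based back-of-envelope above
print(f"Ratio: ~{amd_tam_2027 / idc_derived_2027:.0f}x")  # ~8x, in the ballpark of the 9X cited
```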
We think this is absolutely crazy. Even assuming that prices for GPUs and other accelerators come down, along with HBM memory, if this all plays out the way Su and her team expect, it implies truly massive unit sales. With that kind of growth, there is ample room for many suppliers to compete and still earn healthy margins.