Micron HBM, catching up
Source: Content compiled from BusinessKorea by Semiconductor Industry Observation (ID: icbank), thank you.
Micron Technology is joining other leading artificial intelligence (AI) semiconductor companies, such as Intel, in stepping up efforts to challenge the industry leaders. According to semiconductor industry sources on December 24, Micron, the world's third-largest DRAM manufacturer, is currently having its newly developed HBM3E undergo quality evaluation by major customers.
High-bandwidth memory (HBM) is a type of DRAM used in AI servers in which multiple DRAM dies are stacked to increase data-processing capacity and speed. The global HBM market, currently dominated by Samsung Electronics and SK Hynix, is expected to grow from approximately 2.5 trillion won this year to approximately 8 trillion won in 2028.
Micron Technology is expected to hold a market share of approximately 5% in 2023, ranking third. The company is betting big on its next-generation product, HBM3E, to close the gap with the leaders. "We are in the final stages of validating HBM3E for Nvidia's next-generation AI accelerator," Micron CEO Sanjay Mehrotra said.
The AI accelerator market where HBM and GPU are bundled is also facing fierce competition. AI accelerators are semiconductors specialized for large-scale data training and inference and are considered critical in the era of generative AI.
Intel is making a major push in this area. On December 14, Intel unveiled a prototype of its next-generation Gaudi3 AI accelerator, which is four times faster than the previous generation and offers 1.5 times the HBM performance. Intel CEO Pat Gelsinger, a vocal critic of AI-accelerator leader Nvidia, said: "As inference [services] continue to grow in importance, the era in which Nvidia has focused on training will also come to an end."
The third-place counterattack in various AI semiconductor fields is driven by the market’s growth potential. According to AMD, the AI accelerator market is currently worth $30 billion and is expected to expand to $150 billion by 2027. The HBM market is expected to grow at an annual rate of 50% over the next five years.
Customers' desire to check the dominance of the market leaders has also spurred new players to enter the market aggressively. For example, major AI-accelerator customers such as Microsoft are urging companies such as AMD and Intel to develop alternatives to Nvidia. In the HBM market, one customer has made an advance payment of US$600 million to support Micron's new product development.
Market leaders are working to extend their leadership positions with new products. In the HBM market, Samsung Electronics and SK Hynix are developing the sixth generation HBM - HBM4, which is expected to use "hybrid bonding" technology to reduce size while increasing capacity. Nvidia is developing the "X100" artificial intelligence accelerator, expected to be released in 2025, which will increase memory usage to 400 GB.
Micron to supply HBM to Nvidia
Micron reiterated its plan to begin high-volume shipments of HBM3E memory in early 2024, and also revealed that NVIDIA is one of its major customers for the new RAM. At the same time, the company emphasized that its new product has received significant interest from across the industry, suggesting that NVIDIA may not be the only customer to end up using Micron's HBM3E.
"The launch of our HBM3E product family has generated strong customer interest and enthusiasm," Micron President and CEO Sanjay Mehrotra said on the company's earnings call.
The launch of HBM3E (also known by the company as HBM3 Gen2) ahead of rivals Samsung and SK Hynix is a big deal for Micron, which is an underdog in the HBM market with a 10% market share. The company clearly has high hopes for the HBM3E, as it could potentially allow it to deliver a quality product (to boost revenue and profits) ahead of its competitors (to gain market share).
Normally, memory makers tend not to reveal the names of their customers, but this time Micron emphasized that its HBM3E is part of its customer roadmap and specifically mentioned NVIDIA as an ally. Meanwhile, the only HBM3E-enabled product announced by NVIDIA so far is the Grace Hopper GH200 compute platform, which features an H100 compute GPU and a Grace CPU.
"We have worked closely with our customers throughout the development process and are becoming a closely integrated partner on their AI roadmaps," said Mehrotra. "Micron HBM3E has now received NVIDIA compute product qualification, which will drive HBM3E-based artificial intelligence solutions."
Micron's 24 GB HBM3E module is based on eight stacked 24 Gbit memory dies manufactured on the company's 1β (1-beta) process. The modules deliver data rates of up to 9.2 GT/s, bringing peak bandwidth per stack to 1.2 TB/s, a 44% improvement over the fastest existing HBM3 modules. Nor is the company stopping at its 8-Hi, 24 Gbit-based HBM3E components: it plans to launch an ultra-high-capacity 36 GB 12-Hi HBM3E stack in 2024, after mass production of the 8-Hi 24 GB stack begins.
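The stack figures above can be sanity-checked with some quick arithmetic. This is a sketch only; the 1024-bit interface width is the standard per-stack HBM width and is an assumption not stated in the article.

```python
# Back-of-the-envelope check of the HBM3E stack figures cited above.
dies_per_stack = 8        # 8-Hi stack
die_capacity_gbit = 24    # 24 Gbit per die
data_rate_gts = 9.2       # giga-transfers per second per pin
bus_width_bits = 1024     # standard HBM stack interface width (assumption)

# Capacity: 8 dies x 24 Gbit = 192 Gbit = 24 GB
capacity_gb = dies_per_stack * die_capacity_gbit / 8

# Peak bandwidth: transfers/s x bus width, converted from bits to bytes
bandwidth_gbs = data_rate_gts * bus_width_bits / 8

print(f"Stack capacity: {capacity_gb:.0f} GB")             # 24 GB
print(f"Peak bandwidth: {bandwidth_gbs / 1000:.2f} TB/s")  # 1.18 TB/s
```

The 1177.6 GB/s result rounds to the "1.2 TB/s per stack" the article quotes.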
"We expect to begin production of HBM3E in early 2024 and achieve substantial revenue in fiscal 2024," the Micron CEO added.
Micron 128 GB DDR5 memory module sample delivery
Micron Technology said on an earnings call this week that it is sampling 128 GB DDR5 memory modules based on the monolithic (non-stacked) 32 Gb DDR5 devices it introduced earlier this summer. When those devices were announced, the company said they would eventually open the door to 1 TB memory modules for servers.
"We have expanded our high-capacity D5 DRAM module portfolio with monolithic-based 128 GB modules, and we have begun sampling to customers to help support their AI application needs," said Sanjay Mehrotra, Micron's president and CEO. "We expect the product to be revenue-generating in the second quarter of 2024."
Micron's 32 Gb DDR5 chips are manufactured on the company's 1β (1-beta) process, its last production node to rely solely on deep ultraviolet (DUV) multi-patterning lithography without extreme ultraviolet (EUV) tools. That is about all we know about Micron's 32 Gb DDR5 ICs so far: the company has not revealed their maximum speed bins, but we can expect somewhat lower power consumption than two 16 Gb DDR5 ICs running at the same voltage and data transfer rate.
Micron's new 32 Gb memory chips pave the way for standard 32 GB PC modules containing just eight discrete memory chips, as well as 128 GB server modules based on 32 such ICs. They also make it possible to build memory modules with a capacity of 1 TB. Such 1 TB modules may seem like overkill at the moment, but they will benefit areas such as artificial intelligence, big data, and server databases, enabling servers to support up to 12 TB of DDR5 memory per socket (in the case of a 12-channel memory subsystem).
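The module capacities above follow directly from the die density. A minimal sketch of the arithmetic (chip counts per module are illustrative; real modules also vary by rank organization, and one module per channel is assumed for the per-socket figure):

```python
# DDR5 module-capacity arithmetic from 32 Gbit monolithic dies.
chip_capacity_gb = 32 / 8  # one 32 Gbit die = 4 GB

def module_capacity_gb(num_chips: int) -> float:
    """Raw capacity of a module built from num_chips 32 Gbit DDR5 dies."""
    return num_chips * chip_capacity_gb

print(module_capacity_gb(8))    # 32.0  -> 32 GB PC module
print(module_capacity_gb(32))   # 128.0 -> 128 GB server module
print(module_capacity_gb(256))  # 1024.0 -> 1 TB module

# Per-socket ceiling: 12 channels x one 1 TB module each
print(12 * module_capacity_gb(256) / 1024, "TB")  # 12.0 TB
```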
Overall, when it comes to DDR5 memory, it's worth noting that the company expects DDR5 bit production to surpass DDR4 in early 2024, slightly ahead of the industry.
"Micron also has a strong position in the industry transition to D5," Mehrotra said. "We expect Micron's D5 sales to surpass D4 in early 2024, leading the industry."
END
*Disclaimer: This article is original to its author, and its content reflects the author's personal views. Semiconductor Industry Observation reprints it only to convey a different point of view; this does not mean Semiconductor Industry Observation agrees with or endorses the views expressed. If you have any objections, please contact Semiconductor Industry Observation.
Today is the 3627th issue of "Semiconductor Industry Observation" shared with you. Welcome to pay attention.
Recommended reading
★ EUV lithography machine blockbuster report, released in the United States
★ Silicon carbide "surges": catching up, involution, substitution
★ The chip giants all want to “kill” engineers!
★ Apple, playing with advanced packaging
★ Continental Group, developing 7nm chips
★ Latest interview with Zhang Zhongmou: China will find a way to fight back