
HBM3E first exposed, Korean manufacturers are far ahead

Latest update time:2023-05-31

Source: Content compiled by Semiconductor Industry Observer (ID: icbank) from AnandTech; thanks to the original authors.


Driven by the popularity of applications such as artificial intelligence, market demand for HBM has surged, and South Korea's two memory giants, Samsung and SK Hynix, have emerged as the market's only two winners.


Recently, SK Hynix disclosed new information about its HBM3E product. In a note about validating the company's 1b nm fab process, the manufacturer said for the first time that it is working on next-generation HBM3E memory, which will reach speeds of up to 8 Gbps/pin and debut in 2024.


Current HBM3 memory from SK Hynix and other vendors supports data transfer rates of up to 6.4 Gbps/pin, so HBM3E at 8 Gbps/pin will offer a moderate 25% bandwidth advantage over existing devices.


To put this into context, a single HBM3E stack with a 1024-bit memory bus would provide approximately 1 TB/s of bandwidth per known good stacked die (KGSD), up from 819.2 GB/s for current-generation HBM3. Modern HPC-class processors use six (or more) such stacks, which works out to multiple terabytes per second of aggregate bandwidth for these high-end chips.
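The figures above follow from a simple back-of-the-envelope calculation: per-stack bandwidth is the bus width in bits times the per-pin data rate, divided by 8 to convert to bytes. A minimal sketch (the helper function name is ours, not a vendor API):

```python
def stack_bandwidth_gbytes(bus_width_bits: int, pin_rate_gbps: float) -> float:
    """Per-stack bandwidth in GB/s: bus width (bits) x per-pin rate (Gbps) / 8."""
    return bus_width_bits * pin_rate_gbps / 8

# Values from the article: a 1024-bit bus per stack.
hbm3 = stack_bandwidth_gbytes(1024, 6.4)   # 819.2 GB/s (current HBM3)
hbm3e = stack_bandwidth_gbytes(1024, 8.0)  # 1024 GB/s, i.e. ~1 TB/s (HBM3E)

print(f"HBM3:  {hbm3:.1f} GB/s")
print(f"HBM3E: {hbm3e:.1f} GB/s")
print(f"Uplift: {(hbm3e / hbm3 - 1) * 100:.0f}%")          # the 25% advantage
print(f"Six HBM3E stacks: {6 * hbm3e / 1000:.2f} TB/s")    # HPC-class aggregate
```

The 25% uplift falls directly out of the pin-rate ratio (8.0 / 6.4), since the bus width is unchanged between generations.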


According to the company, SK Hynix intends to start sampling its HBM3E memory in the coming months and begin mass production in 2024. The memory manufacturer has not revealed many details about HBM3E (indeed, this first mention discloses little beyond the data rate), so we do not know whether these devices will be drop-in compatible with existing HBM3 controllers and physical interfaces.



Assuming SK Hynix's HBM3E development goes as planned, the company should have no trouble lining up customers for the faster memory. Especially as demand for GPUs used to build AI training and inference systems surges, NVIDIA and other processor vendors are more willing to pay top dollar for the advanced memory they need to produce faster processors during this industry boom.


SK Hynix will produce HBM3E memory using its 1b nm manufacturing technology (its fifth-generation 10 nm-class node), which is currently used to make DDR5-6400 chips and will be validated for Intel's next-generation Xeon Scalable platform. The same process will also be used to manufacture LPDDR5T chips, which combine high performance with low power consumption.


South Korea's duo, far ahead


According to Korean media outlet BusinessKorea, high-bandwidth memory (HBM) orders at Samsung Electronics and SK Hynix have been surging since the beginning of this year. Compared with conventional DRAM, HBM significantly increases data processing speed by stacking multiple DRAM dies vertically. Working in conjunction with the central processing unit (CPU) and graphics processing unit (GPU), it greatly improves a server's training and computing performance.


To date, despite its excellent performance, HBM has seen fewer applications than general-purpose DRAM, because its average selling price (ASP) is at least three times that of DRAM: HBM requires complex production processes and highly advanced technology. The expansion of artificial intelligence services has turned the tables.


Nvidia, the world's largest GPU company, has asked SK Hynix to supply its latest HBM3 chips, and Intel, the world's top server CPU company, is working hard to sell products equipped with SK Hynix HBM3. An industry insider said, "Compared with the highest-performance DRAM, the price of HBM3 has increased fivefold."


Product development competition between Samsung Electronics and SK Hynix is heating up as the high-performance memory semiconductor market is expected to grow rapidly. The HBM market is still in its early stages, since HBM only began entering AI servers in earnest in 2022, but SK Hynix and Samsung Electronics are focusing on securing customers through new product launches.


SK hynix leads the HBM market. It partnered with AMD in 2013 to develop the world's first HBM, and has since released first-generation HBM, second-generation HBM2, third-generation HBM2E, and fourth-generation HBM3, capturing 60-70% of the market share.


In February 2021, Samsung Electronics collaborated with AMD to develop HBM-PIM, which combines a memory semiconductor and an AI processor in a single package. When HBM-PIM chips are installed alongside CPUs and GPUs, server computing speed can be significantly increased. SK hynix also launched product solutions using PIM technology in February 2022.


In the medium to long term, experts predict that AI-specific DRAM such as HBM will bring huge changes to the semiconductor industry. "The era when memory semiconductor companies were busy developing ever-finer manufacturing processes is over," said an official in South Korea's semiconductor industry. "The development of AI semiconductor technology that not only stores data efficiently but can also process it will become so important that it will determine the future of chip manufacturers."
