[News] Seizing the AI Trend! Revealing Samsung and Micron’s HBM Expansion Timetable


2023-11-08 Semiconductors editor

In a subdued environment for consumer electronic applications in the storage market, High Bandwidth Memory (HBM) technology is emerging as a new driving force, gaining significant attention from major players. Recent reports reveal that both Samsung and Micron are gearing up for substantial HBM production expansion.

Major Manufacturers Actively Investing in HBM

Recent reports indicate that Samsung has acquired certain buildings and equipment within the Cheonan facility of Samsung Display in South Korea to expand its HBM production capacity.

It is reported that Samsung plans to establish a new packaging line at the Cheonan facility for large-scale HBM production. The company has already spent 10.5 billion Korean won acquiring the buildings and equipment, and the additional investment is expected to range between 700 billion and 1 trillion Korean won.

Earlier, Hwang Sang-jun, Vice President of Samsung Electronics and Head of the DRAM Product and Technology Team, disclosed that Samsung has developed HBM3E with a pin speed of 9.8Gbps and plans to begin providing samples to customers.

Concurrently, Samsung is developing HBM4 with the goal of making it available by 2025. Samsung Electronics is reportedly working on various technologies for HBM4, including non-conductive adhesive film (NCF) assembly techniques optimized for high-temperature thermal characteristics, as well as hybrid copper bonding (HCB).

On November 6th, Micron Technology opened a new facility in Taichung, Taiwan. Micron has stated that the new facility will integrate advanced testing and packaging functions and will be dedicated to the mass production of HBM3E, along with other products. The expansion aims to meet increasing demand across applications such as artificial intelligence, data centers, edge computing, and cloud services.

Previously, Micron’s CEO, Sanjay Mehrotra, revealed that the company plans to commence substantial shipments of HBM3E in early 2024. Micron’s HBM3E technology is currently undergoing certification by NVIDIA. The initial HBM3E offerings will feature an 8-Hi stack design with a capacity of 24GB and a bandwidth exceeding 1.2TB/s.
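
As a rough back-of-the-envelope check of that bandwidth figure (assuming the standard 1,024-bit HBM stack interface and a pin speed of about 9.6Gbps, typical of announced HBM3E parts rather than a number Micron confirms here):

9.6 Gb/s per pin × 1,024 pins ÷ 8 bits per byte ≈ 1,229 GB/s, i.e. just over 1.2TB/s per stack.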

Furthermore, Micron intends to introduce larger-capacity 12-Hi 36GB HBM3E stacks in 2024. In an earlier statement, Micron anticipated that the new HBM technology would contribute “hundreds of millions” of dollars in revenue in 2024.
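
The capacity figures follow similar arithmetic: if a 24GB 8-Hi stack uses eight DRAM dies of equal density, each die holds roughly 3GB (24Gb), so a 12-Hi stack built from the same dies would reach 12 × 3GB = 36GB. This is an illustrative assumption about die density, not a detail confirmed in the report.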

Shift Toward HBM3 Expected in 2024

According to TrendForce, the current mainstream technology in the HBM market is HBM2e, which is used in NVIDIA’s A100 and A800, AMD’s MI200 series, and various custom system-on-chip designs from CSPs.

Simultaneously, in response to the evolving demand for AI accelerator chips, many manufacturers are planning to introduce new products based on HBM3e technology in 2024. It is anticipated that both HBM3 and HBM3e will become the dominant technologies in the market next year, catering to the requirements of AI accelerator chips.

Regarding demand for the different generations of HBM, TrendForce believes the primary demand is shifting from HBM2e to HBM3 in 2023, with the two expected to account for approximately 50% and 39% of demand, respectively. As the usage of HBM3-based accelerator chips continues to increase, market demand is expected to shift substantially toward HBM3 in 2024.

It is anticipated that in 2024, HBM3 will surpass HBM2e, with an estimated share of 60%. This transition to HBM3 is expected to be accompanied by higher average selling prices (ASP), significantly boosting next year’s HBM revenue.


(Photo credit: Samsung)
