[News] Memory Giants Samsung & Micron Actively Embrace DDR5 and HBM as Favorable Choices

This year, surging demand driven by ChatGPT, along with ongoing innovation in PC and server technologies, has shifted market preference toward high-value DRAM chips such as HBM and DDR5. Memory giants are actively positioning themselves to produce these products.

DDR5: Micron Unveils New Products, Samsung Plans Line Expansion

The current DDR5 process has advanced to 1β DRAM. In October, Micron announced the release of DDR5 memory based on 1β technology, boasting speeds of up to 7200 MT/s. This product is now shipping to customers in the data center and PC markets.

Recently, Micron introduced 128GB DDR5 RDIMM memory built on 32Gb chips. With speeds of up to 8000 MT/s, it is suitable for servers and workstations. It also employs Micron’s 1β technology and achieves a 24% improvement in energy efficiency and a 16% reduction in latency. Micron plans to launch models with speeds of 4800 MT/s, 5600 MT/s, and 6400 MT/s in 2024, with a future model reaching 8000 MT/s.

On the other hand, memory giant Samsung is committed to increasing DDR5 production capacity. Reports suggest that Samsung is planning to expand the production of high-value DRAM, investing in the infrastructure for advanced DRAM and increasing R&D spending to solidify its long-term market dominance.

According to a KED Global News report, Samsung is internally considering expanding its DDR5 production lines. Given the high value of DDR5 and its adoption in the PC and server markets, this year is widely regarded as the “year of widespread DDR5 adoption.”

HBM: Expansion Trend Begins, Significant Revenue Growth Expected

Amid the AI boom, HBM continues to gain popularity, with demand outpacing supply. To meet this demand, memory giants are actively expanding production.

Recent reports indicate that companies like Samsung are planning to increase HBM production by 2.5 times. Additionally, in early November, it was reported that Samsung, to expand HBM capacity, acquired certain buildings and equipment within the Samsung Display Cheonan Factory. Samsung plans to establish a new packaging line at Cheonan for large-scale HBM production, having spent 10.5 billion Korean won on the acquisition and planning additional investments ranging from 700 billion to 1 trillion Korean won.

Micron, for its part, announced the official opening of its Taichung Fab in Taiwan on November 6th. The facility will integrate advanced probing and 3D packaging testing, producing HBM3E and other products to meet growing demand across applications such as AI, data centers, edge computing, and the cloud.

TrendForce indicates that HBM, a memory embedded in high-end AI chips, is primarily supplied by three major vendors: Samsung, SK Hynix, and Micron. With the AI trend driving demand for AI chips, demand for HBM is also expected to increase in 2023 and 2024, prompting manufacturers to ramp up HBM production.

Looking ahead to 2024, the supply of HBM is expected to improve significantly. In terms of specifications, as AI chips demand higher performance, the mainstream for HBM in 2024 is expected to shift to HBM3 and HBM3e. Overall, with increased demand and higher average selling prices for HBM3 and HBM3e compared to the previous generation, HBM revenue is expected to see significant growth in 2024.
(Image: Samsung)
