With demand for AI memory chips remaining strong and prompting major memory makers to accelerate HBM3e and HBM4 qualification, SK hynix CEO Kwak Noh-jung said on August 7 that, driven by high demand for memory chips such as high-bandwidth memory (HBM), the market is expected to remain robust through the first half of 2025, according to a report by the Korea Economic Daily.
However, Kwak noted that the momentum beyond the second half of 2025 “remains to be seen,” indicating that the company needs to study market conditions and the supply-demand situation before commenting further. SK hynix clarified that this was not an indication of a possible downturn.
According to TrendForce’s analysis, HBM’s share of total DRAM bit capacity is estimated to rise from 2% in 2023 to 5% in 2024 and to surpass 10% by 2025. In terms of market value, HBM is projected to account for more than 20% of total DRAM market value starting in 2024, potentially exceeding 30% by 2025.
As the current HBM market leader, SK hynix said in its July earnings call that its HBM3e shipments are expected to surpass those of HBM3 in the third quarter, with HBM3e accounting for more than half of its total HBM shipments in 2024. In addition, the company expects to begin supplying 12-layer HBM3e products to customers in the fourth quarter.
The report notes that, for now, the company’s major focus is on its sixth-generation HBM chip, HBM4, which is under development in collaboration with foundry giant TSMC. Its 12-layer HBM4 is expected to launch in the second half of next year, according to the report.
Samsung, on the other hand, has been working since last year to become a supplier of NVIDIA’s HBM3 and HBM3e. In late July, it was reported that Samsung’s HBM3 had passed NVIDIA’s qualification and would be used in the AI giant’s H20, which was developed for the Chinese market in compliance with U.S. export controls. On August 6, the company denied rumors that its 8-layer HBM3e chips had cleared NVIDIA’s tests.
Notably, per an earlier report from the South Korean newspaper Korea Joongang Daily, Micron, which began mass production of HBM3e in February 2024, has recently secured an order from NVIDIA for its H200 AI GPU.
(Photo credit: SK hynix)