[News] SK hynix, Samsung, and SanDisk Bet on HBF — The Next Battleground in Memory Sector


2025-11-11 Semiconductors editor

Kim Jung-Ho, a professor at KAIST and widely known as the “father of HBM (High Bandwidth Memory),” recently made a striking statement on a YouTube program: “In the AI era, the balance of power is shifting — from GPU to memory.”

According to Kim, memory will play an increasingly pivotal role in the age of artificial intelligence, to the point that NVIDIA may one day acquire a memory company. He also highlighted the emerging significance of HBF (High Bandwidth Flash), predicting new progress by early 2026 and a formal debut between 2027 and 2028.

As the traditional hard disk drive (HDD) industry undergoes a difficult, high-initial-cost transition toward heat-assisted magnetic recording (HAMR) technology, Nearline SSD is gaining traction as a more cost-effective alternative. In parallel, HBF is being viewed as a key technology for overcoming the storage capacity bottleneck in AI clusters.

In the AI inference era, memory capacity is becoming more critical than ever. How leading chipmakers optimize memory usage is now a defining factor in performance. One notable example is key-value (KV) caching, which serves as the short-term memory of AI models and directly affects response speed. Kim therefore believes HBF will emerge as a major next-generation memory technology alongside HBM, driving growth for semiconductor manufacturers.
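
As a rough illustration of why KV caching makes memory capacity the bottleneck, here is a minimal Python sketch of the idea. The class name, model dimensions, and fp16 assumption are illustrative only and are not taken from any vendor's implementation.

```python
# Minimal, illustrative sketch of key-value (KV) caching during
# autoregressive decoding. All names and figures are hypothetical;
# real inference stacks manage this far more efficiently (e.g., with
# grouped-query attention), but the memory arithmetic is the point.

from dataclasses import dataclass


@dataclass
class KVCache:
    num_layers: int
    num_heads: int
    head_dim: int
    bytes_per_value: int = 2          # fp16
    seq_len: int = 0                  # tokens cached so far

    def append_token(self) -> None:
        """Pretend we just decoded one more token and cached its K/V."""
        self.seq_len += 1

    def size_bytes(self) -> int:
        # Per token: one key and one value vector per layer and head.
        per_token = (2 * self.num_layers * self.num_heads
                     * self.head_dim * self.bytes_per_value)
        return per_token * self.seq_len


# Rough numbers loosely resembling a large decoder-only model.
cache = KVCache(num_layers=80, num_heads=64, head_dim=128)
for _ in range(32_000):               # a 32K-token context
    cache.append_token()

print(f"KV cache for one 32K-token sequence: {cache.size_bytes() / 1e9:.1f} GB")
# Multiply by concurrent users and the cache quickly outgrows HBM,
# which is why a larger, cheaper tier such as HBF is attractive.
```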

Conceptually, HBF shares similarities with HBM: both use through-silicon via (TSV) technology to vertically stack multiple chips. However, HBF is built from NAND flash memory, offering far larger capacity at lower cost.

SanDisk and SK hynix signed a memorandum of understanding (MoU) in August 2025 to jointly define HBF technical specifications and promote standardization. Their goal is to release HBF memory samples in the second half of 2026, with the first AI inference systems using HBF expected to debut in early 2027. Notably, at the 2025 OCP Global Summit held in mid-October, SK hynix unveiled its new “AIN Family” of storage products, including the AIN B series powered by HBF technology.

Meanwhile, Samsung Electronics has reportedly begun early concept design work on its own HBF products. Sources indicate that Samsung aims to leverage its prior R&D experience in similar high-performance storage technologies to address the rising demand for data center–oriented high-bandwidth flash memory. However, the project remains in its early stage, with detailed specifications and mass production timelines yet to be finalized.

Kim emphasized that although NAND flash is slower than DRAM, its capacity can exceed DRAM by more than tenfold. By stacking hundreds or even thousands of layers, HBF could meet the massive storage demands of AI models — effectively becoming the NAND-based counterpart to HBM. As he boldly predicted during the program: “The HBM era is ending — the HBF era is coming.”

Looking ahead, Kim envisions a multi-tiered memory hierarchy for future AI architectures, akin to an intelligent library system: SRAM inside the GPU will serve as the notebook on the desk, fastest but smallest in capacity. HBM will act as the nearby bookshelf for rapid access and computation. HBF will be the underground library, storing deep AI knowledge and continuously feeding data to HBM. Cloud storage, meanwhile, will function as the public library, connecting data centers via optical networks. Ultimately, Kim foresees GPUs integrating both HBM and HBF in a complementary configuration, marking a new era in the convergence of computing and memory for AI.
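
To make the tiering concrete, the sketch below walks Kim's four tiers from fastest to slowest and promotes data that scores a hit back toward the GPU. All class names, capacities, and latency figures are assumptions for illustration, not product specifications.

```python
# Illustrative sketch of a tiered memory lookup: consult the fastest,
# smallest tier first, fall back to larger, slower ones, and copy hot
# data back toward the GPU. Capacities and latencies are made up.

from collections import OrderedDict
from typing import Optional


class MemoryTier:
    def __init__(self, name: str, capacity_items: int, latency_ns: float):
        self.name = name
        self.capacity_items = capacity_items
        self.latency_ns = latency_ns
        self.store = OrderedDict()          # key -> value, kept in LRU order

    def get(self, key: str) -> Optional[bytes]:
        if key in self.store:
            self.store.move_to_end(key)     # mark as recently used
            return self.store[key]
        return None

    def put(self, key: str, value: bytes) -> None:
        self.store[key] = value
        self.store.move_to_end(key)
        if len(self.store) > self.capacity_items:
            self.store.popitem(last=False)  # evict least recently used


# Smallest/fastest to largest/slowest: the "notebook", "bookshelf",
# "underground library", and "public library" of Kim's analogy.
hierarchy = [
    MemoryTier("SRAM", capacity_items=4, latency_ns=1),
    MemoryTier("HBM", capacity_items=64, latency_ns=100),
    MemoryTier("HBF", capacity_items=4_096, latency_ns=10_000),
    MemoryTier("cloud", capacity_items=1_000_000, latency_ns=1_000_000),
]


def read(key: str) -> Optional[bytes]:
    """Walk the tiers in order; on a hit, copy the data into the faster tiers."""
    for i, tier in enumerate(hierarchy):
        value = tier.get(key)
        if value is not None:
            for faster in hierarchy[:i]:    # promote toward the GPU
                faster.put(key, value)
            return value
    return None


hierarchy[2].put("layer_42_weights", b"...")  # data resident in the HBF tier
print(read("layer_42_weights"))               # hit in HBF, promoted to HBM and SRAM
```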

(Photo credit: SanDisk)
