DRAM


2024-02-05

[News] SK hynix’s HBM4 Reportedly to Enter Mass Production in 2026

During the “SEMICON Korea 2024” event held recently in Seoul, Chun-hwan Kim, Vice President of global memory giant SK hynix, revealed that the company’s HBM3e has entered mass production, with plans to commence large-scale production of HBM4 in 2026.

According to a report from Business Korea, Chun-hwan Kim stated that SK hynix’s HBM3e memory is currently in mass production, with plans to initiate mass production of HBM4 in 2026.

He noted that with the advent of the AI computing era, generative AI is rapidly advancing, and the market is expected to grow at a rate of 35% annually. The rapid growth of the generative AI market requires a significant number of higher-performance AI chips to support it, further driving the demand for higher-bandwidth memory.

He further commented that the semiconductor industry would face intense survival competition this year to meet the increasing demand and customer needs for memory.

Kim also projected that the HBM market would grow by 40% by 2025, with SK hynix already strategically positioning itself in the market and planning to commence production of HBM4 in 2026.

Meanwhile, previous reports have also indicated that SK hynix plans to establish an advanced packaging facility in the state of Indiana, USA, to meet the demands of American companies, including NVIDIA.

Driven by the wave of AI advancement and demand from China, the Ministry of Trade, Industry and Energy of South Korea recently announced that South Korea’s semiconductor product exports experienced a rebound in 2024. In January, exports reached approximately USD 9.4 billion, marking a year-on-year increase of 56.2% and the largest growth in 73 months.

TrendForce has previously reported on the progress of HBM3e, noting that SK hynix had already provided its 8hi (24GB) samples to NVIDIA in mid-August.


(Photo credit: SK hynix)

Please note that this article cites information from Business Korea.

2024-02-02

[News] Reports Suggest SK Hynix to Establish Advanced Packaging Facility in the US

According to sources cited by the Financial Times, South Korean chip manufacturer SK Hynix is reportedly planning to establish a packaging facility in Indiana, USA. This move is expected to significantly advance the US government’s efforts to bring more artificial intelligence (AI) chip supply chains into the country.

SK Hynix’s new packaging facility will specialize in stacking standard dynamic random-access memory (DRAM) chips to create high-bandwidth memory (HBM) chips. These chips will then be integrated with NVIDIA’s GPUs used to train AI systems such as OpenAI’s ChatGPT.

Per one source close to SK Hynix cited by the report, the increasing demand for HBM from American customers and the need for close collaboration with chip designers have made establishing advanced packaging facilities in the US essential.

Regarding this, SK Hynix reportedly responded, “Our official position is that we are currently considering a possible investment in the US but haven’t made a final decision yet.”

The report quoted Kim Yang-paeng, a researcher at the Korea Institute for Industrial Economics and Trade, as saying, “If SK Hynix establishes an advanced HBM memory packaging facility in the United States, along with TSMC’s factory in Arizona, this means Nvidia can ultimately produce GPUs in the United States.”

The United States was previously reported to be planning to announce substantial chip subsidies by the end of March, aiming to pave the way for chip manufacturers like TSMC, Samsung, and Intel by providing them with billions of dollars to accelerate the expansion of domestic chip production.

These subsidies are a core component of the US 2022 “CHIPS and Science Act,” which allocates a budget of USD 39 billion to directly subsidize and revitalize American manufacturing.


(Photo credit: SK Hynix)

Please note that this article cites information from Financial Times.

2024-02-02

[News] Samsung Reportedly Adjusts DRAM and NAND Flash Capacity to Boost Prices

Samsung’s latest financial report reveals that the fourth-quarter shipments of DRAM and NAND Flash in 2023 exceeded previous expectations, reflecting an improvement in market demand. Samsung will continue selectively adjusting the production capacity of specific DRAM and NAND Flash products to boost prices.

Samsung Electronics’ memory business is expected to return to profit in the first quarter of 2024, signaling a recovery in the memory industry. Commercial Times reports that, thanks to inventory improvements, Samsung’s DRAM utilization rate is projected to increase from 70% in the fourth quarter of 2023 to 81% in the first quarter of 2024, and to rise further to 89% in the second quarter.

According to industry sources cited in the Commercial Times’ report, Samsung’s fourth-quarter shipments of DRAM and NAND Flash in 2023 exceeded previous expectations. This was primarily attributed to Samsung’s memory experiencing a smaller price increase compared to its competitors, thereby accelerating the pace of inventory clearance, particularly in the case of DRAM, where improvements were more significant.

Samsung is expected to continue selectively adjusting the production of DRAM and NAND Flash products. As the first quarter is typically a slow season for the industry, Samsung anticipates a sequential decline in DRAM and NAND Flash shipments in the first quarter of 2024. However, prices are expected to continue rising.

After destocking, Samsung’s DRAM inventory stands at eight to ten weeks and is expected to return to a normal level by the end of the first quarter of 2024, while NAND Flash inventory is projected to reach a normal level within the first half of 2024.

At the same time, Samsung plans to commence production of HBM3e 24GB products in the first half of 2024, with HBM3e 36GB products slated for production in the second half of the year, with progress ahead of schedule. Additionally, the development of the next-generation HBM4 is currently underway, with samples expected to be released in 2025 and mass production in 2026.

As per sources cited by the Commercial Times, HBM3 used in AI servers is still supplied mainly by SK Hynix, which has the highest yield in backend packaging, followed by Micron. The report also indicates that HBM3e is expected to begin mass production in the first quarter of 2024, and that Micron’s outsourcing of backend TSV and stacking work to TSMC has accelerated the product’s time to production.

As for the higher-spec HBM4, TrendForce expects its potential launch in 2026. With the push for higher computational performance, HBM4 is set to expand from the current 12-layer (12hi) to 16-layer (16hi) stacks, spurring demand for new hybrid bonding techniques. HBM4 12hi products are set for a 2026 launch, with 16hi models following in 2027.


(Photo credit: Samsung)

Please note that this article cites information from Commercial Times.

2024-02-02

[News] The Quiet Beginning of the 3D DRAM Market Share Battle

From the current landscape of publicly available DRAM technologies, the industry is expected to perceive 3D DRAM as one of the solutions to the challenges faced by DRAM technology, marking it as a pivotal direction for the future memory market.

Is 3D DRAM similar to 3D NAND? How will the industry address technological bottlenecks such as size limitations? What are the strategies of major players in the field?

  • Understanding 3D DRAM Technology

The circuitry of DRAM consists of a transistor and a capacitor, where the transistor is responsible for transmitting electrical currents to write or read information (bits), while the capacitor stores the bits.
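
To make the 1T1C cell behavior above concrete, here is a minimal Python sketch of a single DRAM cell, with the access transistor modeled by the read/write methods and the capacitor by a decaying charge value; the leak rate, timings, and sense threshold are arbitrary illustrative assumptions, not real device parameters.

```python
# Minimal illustrative model of a 1T1C DRAM cell: the access transistor is
# represented by the write()/read() methods, the capacitor by `charge`.
# Leak rate and sense threshold are arbitrary values chosen for illustration.

class DramCell:
    LEAK_PER_MS = 0.02      # fraction of charge lost per millisecond (illustrative)
    SENSE_THRESHOLD = 0.5   # sense-amplifier decision point (illustrative)

    def __init__(self):
        self.charge = 0.0   # normalized capacitor charge, 0.0 .. 1.0

    def write(self, bit: int) -> None:
        # Driving the bit line through the access transistor sets the capacitor charge.
        self.charge = 1.0 if bit else 0.0

    def leak(self, ms: float) -> None:
        # Charge decays over time, which is why DRAM needs periodic refresh.
        self.charge *= (1.0 - self.LEAK_PER_MS) ** ms

    def read(self) -> int:
        # The sense amplifier compares the remaining charge against a threshold;
        # the read is then followed by a write-back, i.e. a refresh.
        bit = 1 if self.charge >= self.SENSE_THRESHOLD else 0
        self.write(bit)
        return bit


cell = DramCell()
cell.write(1)
cell.leak(30)        # within the refresh window: still reads 1
assert cell.read() == 1
cell.leak(200)       # far beyond the refresh window: the stored 1 is lost
print(cell.read())   # -> 0
```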

DRAM finds wide application in modern digital electronic devices such as computers, graphics cards, portable devices, and gaming consoles, thanks to its low cost and high capacity.

The development of DRAM primarily focuses on increasing integration by reducing circuit line widths. However, as line widths reach the 10nm range, physical limitations such as capacitor current leakage and interference significantly increase.

To address these issues, the industry has introduced new materials and equipment like high dielectric constant (high-K) deposition materials and Extreme Ultraviolet (EUV) devices.

Nevertheless, from the perspective of chip manufacturers, miniaturizing the manufacturing of 10nm or more advanced chips remains a significant challenge in current technology research and development. Additionally, the competition for advanced processes, particularly at 2nm and below, has intensified recently.

Against this backdrop, the semiconductor industry has looked to the evolution of NAND for a reference point: to overcome scaling limitations, NAND transistors transitioned from a planar to a 3D architecture, increasing the number of storage cells per unit area. Applying the same concept to DRAM has brought the idea of a 3D DRAM architecture into public view.

In traditional DRAM, transistors are laid out on a single plane. In 3D DRAM, they are instead stacked in multiple layers, spreading the transistors out vertically. It is believed that adopting a 3D DRAM structure can widen the gaps between transistors, reducing leakage currents and interference.

From a theoretical perspective, 3D DRAM technology breaks the conventional paradigm of memory technology. It is a novel storage method that stacks storage cells above logic units, enabling higher capacities within a unit chip area.

In terms of differentiation, traditional DRAM requires complex operational processes for reading and writing data, whereas 3D DRAM can directly access and write data through vertically stacked storage units, significantly enhancing access speeds. The advantages of 3D DRAM not only include high capacity and fast data access but also low power consumption and high reliability, meeting various application needs.
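
As a purely illustrative sketch of the stacking idea (not any manufacturer's actual addressing scheme), the snippet below shows how capacity per footprint scales with the number of stacked layers and how a flat bit address could be decomposed into (layer, row, column) coordinates; the array dimensions are made-up example values.

```python
# Purely illustrative: stacking layers multiplies capacity for the same
# footprint, and a flat address can be mapped to (layer, row, column)
# coordinates in a vertically stacked cell array. Dimensions are examples.

ROWS, COLS = 1024, 1024          # cells per layer (example values)
CELLS_PER_LAYER = ROWS * COLS

def capacity_bits(layers: int) -> int:
    """Capacity in bits for a given number of stacked layers on one footprint."""
    return layers * CELLS_PER_LAYER

def decode(flat_address: int, layers: int) -> tuple[int, int, int]:
    """Decompose a flat bit address into (layer, row, column) coordinates."""
    if not 0 <= flat_address < capacity_bits(layers):
        raise ValueError("address out of range")
    layer, offset = divmod(flat_address, CELLS_PER_LAYER)
    row, col = divmod(offset, COLS)
    return layer, row, col

print(capacity_bits(1))        # planar reference:   1,048,576 bits
print(capacity_bits(16))       # 16 stacked layers: 16,777,216 bits, same footprint
print(decode(5_000_000, 16))   # -> (4, 786, 832)
```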

In terms of application areas, the high speed and large capacity of 3D DRAM will help improve the efficiency and performance of high-performance computing. The compact size and large capacity of 3D DRAM make it an ideal memory solution for mobile devices. The large capacity and low power consumption characteristics of 3D DRAM can meet the real-time data processing and transmission requirements of the Internet of Things (IoT) field.

Furthermore, since the advent of the AI era with ChatGPT, AI applications have surged, and AI servers are expected to become a strong driving force for the long-term growth in storage demand.

Micron’s chief business officer previously stated in an interview with Reuters that a typical AI server has up to eight times the amount of DRAM and three times the amount of NAND of a normal server.

  • Continued Industry Focus on 3D DRAM

The DRAM market remains highly concentrated, currently dominated by key players such as Samsung Electronics, SK Hynix, and Micron Technology, collectively holding over 93% of the entire market share.

According to a report from TrendForce, as of the third quarter of 2023, Samsung leads the global market with a share of 38.9%, followed by SK Hynix (34.3%) and Micron Technology (22.8%).

Currently, 3D DRAM is in its early stages of development, with companies like Samsung actively joining the research and development battleground. The competition is intense as various players strive to lead in this rapidly growing market.

  • Samsung: 4F2 DRAM

Since 2019, Samsung has been conducting research on 3D DRAM and announced the industry’s first 12-layer 3D-TSV (Through-Silicon Via) technology in October of the same year. In 2021, Samsung established a next-generation process development research team within its DS division, focusing on research in this field.

At the 2022 SAFE Forum, Samsung outlined the overall 3DIC journey of Samsung Foundry and indicated its readiness to address DRAM stacking issues with a logic-stacked chip, SAINT-D. The design aims to integrate eight HBM3 chips onto one massive interposer chip.

In May 2023, as per sources cited by “The Elec,” Samsung Electronics formed a development team within its semiconductor research center to mass-produce 4F2 structured DRAM.

The goal is reportedly to apply 4F2 to DRAM at 10nm processes or more advanced nodes, as DRAM cell scaling has reached its limit. The report suggests that if Samsung’s 4F2 DRAM storage unit structure research is successful, the chip die area can be reduced by around 30% compared to existing 6F2 DRAM storage unit structures without changing the node.
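
The roughly 30% figure is consistent with simple cell-area arithmetic, where F denotes the minimum feature size:

$$
\frac{6F^2 - 4F^2}{6F^2} = \frac{1}{3} \approx 33\%,
$$

which is in the same ballpark as the cited ~30% die-area reduction at an unchanged node.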

In October of the same year, at its “Memory Technology Day” event, Samsung Electronics announced plans to introduce a new 3D structure, rather than the existing 2D planar structure, in next-generation DRAM at 10nm-class or more advanced nodes. The project aims to increase single-chip capacity to more than 100Gb.

At the “VLSI Symposium” held in Japan last year, Samsung Electronics presented a paper containing research results on 3D DRAM and showcased detailed images of 3D DRAM as an actual semiconductor implementation.

According to a report by The Economic Times, Samsung Electronics recently announced the opening of a new R&D laboratory in Silicon Valley, USA, dedicated to the development of next-generation 3D DRAM.

The laboratory operates under Device Solutions America (DSA), the Silicon Valley-based unit responsible for overseeing Samsung’s semiconductor operations in the United States, and will focus on developing new generations of DRAM products.

  • SK Hynix – Introducing IGZO as the Channel Material for Future DRAM

According to SK Hynix’s research, the IGZO (indium gallium zinc oxide) channel is attracting attention as a way to improve the refresh characteristics of DRAM.

IGZO thin-film transistors have reportedly been used in the display industry for a long time thanks to their moderate carrier mobility, extremely low leakage current, and substrate-size scalability, making IGZO a candidate stackable channel material for future DRAM.

  • NEO – 3D X-DRAM Offers 8x Density Boost

NEO Semiconductor, a US memory technology company, has introduced 3D X-DRAM, a technology aimed at overcoming the capacity limitations of DRAM.

3D X-DRAM features what NEO describes as the first DRAM cell-array structure based on Floating Body Cell (FBC) technology, built with a process akin to 3D NAND. Like 3D NAND Flash, it stacks layers to increase memory capacity, and the vertical FBC array can reportedly be formed with the addition of a single mask layer, offering high yield, low cost, and a significant density boost.

According to Neo’s estimates, the 3D X-DRAM technology can achieve a density of 128 Gb across 230 layers, which is eight times the current density of DRAM. NEO proposes a target of an eightfold capacity increase every decade, aiming to achieve a capacity of 1Tb between 2030 and 2035, representing a 64-fold increase compared to the current core capacity of DRAM.
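
As a quick check of these figures, assuming a 16Gb monolithic DRAM die as the baseline for “current density” (our assumption, not stated by NEO):

$$
8 \times 16\,\mathrm{Gb} = 128\,\mathrm{Gb}, \qquad \frac{1\,\mathrm{Tb}}{16\,\mathrm{Gb}} = \frac{1024\,\mathrm{Gb}}{16\,\mathrm{Gb}} = 64,
$$

which matches both the eightfold and the 64-fold claims.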

This expansion is intended to meet the growing demand for high-performance and large-capacity semiconductor storage, especially for AI applications like ChatGPT.

“3D X-DRAM will be the absolute future growth driver for the Semiconductor industry,” said Andy Hsu, Founder and CEO of NEO Semiconductor.

  • Japanese Research Team: BBCube 3D Outperforms DDR5 by 30x

A research team at the Tokyo Institute of Technology in Japan has introduced a groundbreaking 3D DRAM stacking design technology called BBCube, which enables superior integration between processing units and DRAM.

The most significant aspect of BBCube 3D lies in connecting processing units and DRAM in three dimensions rather than through traditional two-dimensional links. The team employs an innovative stacked structure in which the processing-unit (PU) dies sit atop multiple layers of DRAM, all interconnected via through-silicon vias (TSVs).

The overall structure of BBCube 3D is compact, devoid of typical solder microbumps, and utilizes TSVs instead of longer wires, collectively contributing to achieving low parasitic capacitance and low resistance, thereby enhancing the electrical performance of the device in various aspects.

The research team evaluated the speed of the new architecture and compared it with two of the most advanced memory technologies, DDR5 and HBM2E. Researchers claim that BBCube 3D could potentially achieve a bandwidth of 1.6 terabytes per second, which is 30 times higher than DDR5 and 4 times higher than HBM2E.
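
Working backwards from the claimed multiples gives the per-device baselines implied by the comparison (a rough cross-check, not figures stated by the team):

$$
\frac{1.6\,\mathrm{TB/s}}{30} \approx 53\,\mathrm{GB/s}\ \text{(implied DDR5 baseline)}, \qquad \frac{1.6\,\mathrm{TB/s}}{4} = 400\,\mathrm{GB/s}\ \text{(implied HBM2E baseline)},
$$

both of which are in the range of a single DDR5 module and a single HBM2E stack, respectively.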

Furthermore, due to features like low thermal resistance and low impedance in BBCube, potential thermal management and power issues associated with 3D integration could be mitigated. The new technology significantly improves bandwidth while consuming only 1/20 and 1/5 of the bit access energy compared to DDR5 and HBM2E, respectively.

  • Conclusion

The evolution of DRAM technology from 1D to 2D and now to the diverse structures of 3D has offered the industry various solutions to address its challenges. However, optimizing and improving manufacturing costs, durability, and reliability remain significant challenges in advancing 3D DRAM technology. Due to the difficulties in developing new materials and physical limitations, the commercialization of 3D DRAM still requires some time.

Based on current research progress, the industry is actively engaged in the development of 3D DRAM, which is still in its early stages. According to industry insiders, 3D DRAM is predicted to begin emerging around 2025, with actual mass production becoming feasible after 2030.


(Photo credit: Samsung)

Please note that this article cites information from DRAMeXchange, The Economic Times, and The Elec.

2024-01-30

[News] Latest Updates on HBM from the Leading Three Global Memory Manufacturers

Amid the AI trend, the significance of high-value-added DRAM represented by HBM continues to grow.

HBM (High Bandwidth Memory) is a high-value-added DRAM built from vertically stacked DRAM dies, offering advantages such as high bandwidth, high capacity, low latency, and low power consumption compared to traditional DRAM chips. It accelerates AI data processing and is particularly suited to high-performance computing scenarios like ChatGPT, making it highly valued by memory giants in recent years.
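
For a sense of the bandwidth advantage, per-stack HBM bandwidth follows directly from the interface width and per-pin data rate; taking the publicly documented HBM3 figures of a 1024-bit interface at 6.4Gb/s per pin as an example:

$$
\frac{1024\ \mathrm{bits} \times 6.4\ \mathrm{Gb/s}}{8\ \mathrm{bits/byte}} = 819.2\ \mathrm{GB/s\ per\ stack},
$$

roughly 16 times the 51.2 GB/s of a single DDR5-6400 module.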

Memory is also one of Korea’s pillar industries, and to seize the AI opportunity and drive the development of the memory industry, Korea has recently designated HBM as a national strategic technology.

The country will provide tax incentives to companies such as Samsung Electronics: small and medium-sized enterprises in Korea can receive a reduction of up to 40% to 50%, while large enterprises like Samsung Electronics can benefit from a reduction of up to 30% to 40%.

Overview of HBM Development Progress Among Top Manufacturers

The HBM market is currently dominated by three major memory giants: Samsung, SK Hynix, and Micron. Since the introduction of the first silicon-interposer HBM product in 2014, HBM technology has progressed smoothly from HBM, HBM2, and HBM2E to HBM3 and HBM3e through iterative innovation.

According to research by TrendForce, the mainstream HBM in the market in 2023 is HBM2e. This includes specifications used in NVIDIA A100/A800, AMD MI200, and most CSPs’ self-developed acceleration chips. To meet the evolving demands of AI accelerator chips, various manufacturers are planning to launch new products like HBM3e in 2024, expecting HBM3 and HBM3e to become the market norm.

On the progress of HBM3e, TrendForce’s timeline shows that Micron provided its 8hi (24GB) samples to NVIDIA by the end of July, SK hynix in mid-August, and Samsung in early October.

As for the higher-spec HBM4, TrendForce expects its potential launch in 2026. With the push for higher computational performance, HBM4 is set to expand from the current 12-layer (12hi) to 16-layer (16hi) stacks, spurring demand for new hybrid bonding techniques. HBM4 12hi products are set for a 2026 launch, with 16hi models following in 2027.

Meeting Demand, Manufacturers Actively Expand HBM Production

As companies like NVIDIA and AMD continue to introduce high-performance GPU products, the three major manufacturers are actively planning the mass production of HBM with corresponding specifications.

Previously, media reports highlighted Samsung’s efforts to expand HBM production capacity by acquiring certain buildings and equipment within Samsung Display’s Cheonan facility.

Samsung plans to establish a new packaging line at the Cheonan plant dedicated to large-scale HBM production. The company has already invested KRW 10.5 trillion in the acquisition of the mentioned assets and equipment, with an additional investment of KRW 700 billion to KRW 1 trillion.

Micron Technology’s Taichung Fab 4 in Taiwan was officially inaugurated in early November 2023. Micron stated that Taichung Fab 4 would integrate advanced probing and packaging testing functions to mass-produce HBM3e and other products, thereby meeting the increasing demand for various applications such as artificial intelligence, data centers, edge computing, and the cloud. The company plans to start shipping HBM3e in early 2024.

In its latest financial report, SK Hynix stated that in the DRAM sector in 2023, its main products DDR5 DRAM and HBM3 experienced revenue growth of over fourfold and fivefold, respectively, compared to the previous year.

At the same time, in response to the growing demand for high-performance DRAM, SK Hynix plans to proceed with the mass production of HBM3e for AI applications and with the research and development of HBM4.


(Photo credit: SK Hynix)
