

2023-11-21

[News] Memory Giants Samsung & Micron Actively Embrace DDR5 and HBM as Favorable Choices

This year, surging demand driven by ChatGPT, along with ongoing innovations in PC and server technologies, has fueled a growing market preference for high-value DRAM chips such as HBM and DDR5. Memory giants are actively positioning themselves to produce these products.

DDR5: Micron Unveils New Products, Samsung Plans Line Expansion

The current DDR5 process has advanced to 1β DRAM. In October, Micron announced the release of DDR5 memory based on 1β technology, boasting speeds of up to 7200 MT/s. This product is now shipping to all customers in the data center and PC markets.

Recently, Micron introduced a 128GB DDR5 RDIMM built on 32Gb chips. With speeds of up to 8000 MT/s, it is suitable for servers and workstations. It also employs Micron’s 1β technology and achieves a 24% improvement in energy efficiency and a 16% reduction in latency. Micron plans to launch models with speeds of 4800 MT/s, 5600 MT/s, and 6400 MT/s in 2024, with a future model reaching 8000 MT/s.

On the other hand, memory giant Samsung is committed to increasing DDR5 production capacity. Reports suggest that Samsung is planning to expand the production of high-value DRAM, investing in the infrastructure for advanced DRAM and increasing R&D spending to solidify its long-term market dominance.

According to KED Global News, Samsung is internally considering expanding its DDR5 production lines. Given the high value of DDR5 and its adoption in the PC and server markets, this year is widely regarded as the “year of widespread DDR5 adoption.”

HBM: Expansion Trend Begins, Significant Revenue Growth Expected

Amid the AI boom, HBM continues to gain popularity, with demand outpacing supply. To meet this demand, memory giants are actively expanding production.

Recent reports indicate that companies like Samsung are planning to increase HBM production by 2.5 times. Additionally, in early November, it was reported that Samsung, to expand HBM capacity, acquired certain buildings and equipment within the Samsung Display Cheonan Factory. Samsung plans to establish a new packaging line at Cheonan for large-scale HBM production, having spent 10.5 billion Korean won on the acquisition and planning additional investments ranging from 700 billion to 1 trillion Korean won.

Micron, on the other hand, announced the official opening of its Taiwan-based Taichung Fab on November 6th. This facility will integrate advanced probing and 3D packaging testing, producing HBM3E and other products to meet growing demand across applications such as AI, data centers, edge computing, and the cloud.

TrendForce indicates that HBM, a memory embedded in high-end AI chips, is primarily supplied by three major vendors: Samsung, SK Hynix, and Micron. With the AI trend driving demand for AI chips, demand for HBM is also expected to increase in 2023 and 2024, prompting manufacturers to ramp up HBM production.

Looking ahead to 2024, the supply of HBM is expected to improve significantly. In terms of specifications, as AI chips demand higher performance, the mainstream for HBM in 2024 is expected to shift to HBM3 and HBM3e. Overall, with increased demand and higher average selling prices for HBM3 and HBM3e compared to the previous generation, HBM revenue is expected to see significant growth in 2024.
(Image: Samsung)


2023-11-17

[Insights] Signals from the Latest Financial Reports of Top 5 Global Storage Giants

As the memory market faces oversupply and falling prices due to declining demand in 2023, there’s a glimmer of hope in the companies’ Q4 guidance. Memory prices are gradually rising, indicating a potential escape from the market’s low point. The most recent financial reports from the world’s top five memory companies substantiate this positive outlook.

  1. Mixed Results in the Financial Reports of Top 5 Giants

The recent financial reports of Samsung, SK Hynix, Micron, Kioxia, and Western Digital reveal a slowdown in the rate of revenue decline, even though some companies are still reporting losses. Some express optimism, noting a gradual recovery in certain downstream demand.

Samsung: Anticipating Q4 Demand Recovery

Samsung Electronics’ Q3 financial report shows revenue of 67.4 trillion Korean won, a YoY decrease, but with net profit exceeding expectations at 5.5 trillion won.

During their earnings call on October 31, Samsung highlighted the uncertainty in the recovery of the storage chip market. However, they remain optimistic about increased demand in Q4, driven by year-end promotions, new product releases from major clients, and growing demand for generative AI.

SK Hynix: Positive Signs in Market Conditions

SK Hynix’s Q3 2023 report indicates improving market conditions, driven by increased demand for high-performance memory, especially AI-related products. DRAM and NAND Flash sales have grown, with a significant 20% QoQ increase in DRAM shipments, and rising average prices also lifted the results. In the second half of the year, customers with reduced inventories are progressively increasing their procurement, supporting a steady recovery in product prices.

The company predicts continued improvement in the DRAM market and positive trends in NAND.

Micron: Storage Market Expected to Recover Next Year

Micron’s fiscal Q4 2023 results show revenue of $4.01 billion, a 40% year-on-year decrease but better than market expectations. The DRAM business accounts for 69% of revenue, at $2.8 billion, while NAND Flash revenue is $1.2 billion; both segments saw higher bit shipments but lower average selling prices (ASP).

Micron expects revenue of $4.2 billion to $4.6 billion for Q1 of fiscal year 2024, anticipating a recovery in the storage market in 2024 and further improvement in 2025.

Kioxia: Rebound in NAND Prices

Kioxia released its financial report for July to September 2023, with revenue of 241.4 billion yen, down 3.9% QoQ and 38.3% YoY. Due to declining demand for smartphone and PC memory chips, the operating loss for the quarter was 100.8 billion yen. However, benefiting from an improved supply-demand balance, an optimized product portfolio, and a favorable yen exchange rate, the operating loss has narrowed.

Although NAND shipments have decreased, the situation has improved thanks to the rebound in NAND prices: NAND bit shipments fell by approximately 13%, while NAND ASP rose by about 8%. Looking ahead to 2024, Kioxia expects NAND prices to continue rising as manufacturers maintain their production cuts and customer inventories normalize. The company is confident the NAND market will recover after the first half of 2024, especially in data center and enterprise SSD demand.

Western Digital: Cloud Market Continues to Grow

Western Digital announced Q1 revenue for the 2024 fiscal year, totaling $2.75 billion, a 3% increase QoQ and a 26% YoY decrease. In the end market, the decline in flash memory prices was offset by the growth in flash memory shipments, driving some business growth on a QoQ basis.

CEO David Goeckeler stated that Q1 performance exceeded expectations, with profit margins for flash memory and HDD business continuously improving. He pointed out that the consumer and end-user markets performed well, and the cloud market is expected to continue growing. With market improvement, an improved cost structure enables the company to increase profitability.

  2. Changing Supply and Demand Dynamics: Some Applications Boosting Demand

Storage companies are adapting to the market by reducing capital expenditures and adjusting inventory, leading to a more normalized market inventory. Simultaneously, increased demand in AI servers, high-performance computing, and automotive intelligence instills confidence in the market.

In the second half of the year, there are clear signs of improvement in the supply and demand dynamics of storage chips. Demand for smartphones, laptops, and new product releases is driving positive trends. Some companies are witnessing strengthened customer demand, even accepting price increases.

In the server sector, AI servers are boosting demand for high-bandwidth memory (HBM), and DDR5 adoption is accelerating. In the automotive storage sector, electric vehicles, intelligence, and networking are propelling in-car storage demand, indicating promising developments in the automotive storage market. Other applications such as big data, cloud computing, and wearable devices related to high-speed storage, reliability, and data security also present growth potential, benefiting storage companies.

  3. Comprehensive Rise in Storage Chips: Is a Turning Point Near?

According to TrendForce, the global NAND Flash market has seen a comprehensive price increase in Q4, driven by suppliers’ aggressive production cuts throughout 2023. TrendForce data indicates a general rise in Q4 NAND Flash contract prices of about 8-13%.

TrendForce estimates NAND Flash supply growth of -2.8% in 2023, the first annual decline in several years. This has pushed the overall sufficiency ratio to -3.7%, forming the basis for stabilizing NAND Flash prices in the second half. However, the sustainability of the current upward trend remains unclear given the lack of substantial end demand.

If demand recovers as expected in the second half of 2024, particularly on the momentum of AI-related orders for server SSDs, and suppliers remain cautious in restoring capacity utilization, the overall sufficiency ratio is expected to be held at -9.4%, accelerating the rebalancing of supply and demand, and NAND Flash prices may trend upward throughout the year.
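
For readers unfamiliar with the metric, here is a minimal sketch of how such a sufficiency ratio could be computed, assuming it is defined as the gap between supply bits and demand bits relative to demand; both the definition and the bit volumes below are illustrative assumptions, not figures from TrendForce.

```python
# Illustrative sketch only: assumes the sufficiency ratio is defined as
# (supply - demand) / demand, so negative values mean supply falls short.
# The bit volumes are hypothetical placeholders, not TrendForce data.

def sufficiency_ratio(supply_bits: float, demand_bits: float) -> float:
    """Return the supply gap relative to demand (negative = undersupply)."""
    return (supply_bits - demand_bits) / demand_bits

# Hypothetical example: supply of 96.3 units against demand of 100 units
# reproduces the -3.7% ratio cited for NAND Flash in 2023.
print(f"{sufficiency_ratio(96.3, 100.0):.1%}")  # -3.7%
```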

For DRAM, TrendForce predicts a seasonal increase of about 3-8% in contract prices in Q4. Whether this upward trend continues depends on whether suppliers maintain their production cuts and on the actual recovery of demand, particularly in general servers.

During the MTS 2024 Storage Industry Trends Seminar, TrendForce highlighted three concerns for the memory market in 2024:

(1) Despite the reduction in inventory levels, it is essential to observe whether this reduction can be sustained and effectively transferred to buyers.

(2) With production capacity expected to rise, a premature recovery in utilization rates driven by improving market conditions may lead to another supply-demand imbalance.

(3) Whether demand from various end markets will recover as expected, particularly the sustainability of AI-related orders.
(Image: Samsung)

2023-11-14

[News] H200 Unveiled: NVIDIA Integrates HBM3e for Enhanced AI Performance

On November 13, NVIDIA unveiled the AI computing platform HGX H200, featuring the Hopper architecture, equipped with H200 Tensor Core GPU and high-end memory to handle the vast amounts of data generated by AI and high-performance computing.

This marks an upgrade from the previous generation H100, with a 1.4x increase in memory bandwidth and a 1.8x increase in capacity, enhancing its capabilities for processing intensive generative AI tasks.

The memory changes in the H200 represent a significant upgrade, as it adopts HBM3e for the first time. This results in a notable increase in GPU memory bandwidth, soaring from 3.35TB per second in the H100 to 4.8TB per second.

The total memory capacity also sees a substantial boost, rising from 80GB in H100 to 141GB. When compared to H100, these enhancements nearly double the inference speed for the Llama 2 model.
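
As a quick back-of-the-envelope check, the sketch below recomputes the multipliers quoted above from the H100 and H200 figures cited in this article; it is illustrative arithmetic, not an NVIDIA benchmark.

```python
# Recompute the quoted bandwidth and capacity multipliers from the
# figures cited in the article (illustrative arithmetic only).
h100_bandwidth_tb_s, h200_bandwidth_tb_s = 3.35, 4.8  # memory bandwidth, TB/s
h100_capacity_gb, h200_capacity_gb = 80, 141          # memory capacity, GB

print(f"Bandwidth gain: {h200_bandwidth_tb_s / h100_bandwidth_tb_s:.2f}x")  # ~1.43x, the quoted 1.4x
print(f"Capacity gain:  {h200_capacity_gb / h100_capacity_gb:.2f}x")        # ~1.76x, the quoted 1.8x
```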

H200 is designed to be compatible with systems that already support H100, according to NVIDIA. The company states that cloud service providers can seamlessly integrate H200 into their product portfolios without the need for any modifications.

This implies that NVIDIA’s server manufacturing partners, including ASRock, ASUS, Dell, Eviden, GIGABYTE, HPE, Ingrasys, Lenovo, Quanta Cloud, Supermicro, Wistron, and Wiwynn, have the flexibility to replace existing processors with H200.

The initial shipments of H200 are expected in the second quarter of 2024, with cloud service giants such as Amazon, Google, Microsoft, and Oracle anticipated to be among the first to adopt H200.

What is HBM?

“The integration of faster and more extensive HBM memory serves to accelerate performance across computationally demanding tasks including generative AI models and [high-performance computing] applications while optimizing GPU utilization and efficiency,” said Ian Buck, the Vice President of High-Performance Computing Products at NVIDIA.

What is HBM? HBM refers to stacking DRAM layers like building blocks and encapsulating them through advanced packaging. This approach increases density while maintaining or even reducing the overall volume, leading to improved storage efficiency.

TrendForce reported that the HBM market’s dominant product for 2023 is HBM2e, employed by the NVIDIA A100/A800, AMD MI200, and most CSPs’ (Cloud Service Providers) self-developed accelerator chips.

As the demand for AI accelerator chips evolves, in 2023, the mainstream demand is projected to shift from HBM2e to HBM3, with estimated proportions of approximately 50% and 39%, respectively.

As the production of acceleration chips utilizing HBM3 increases gradually, the market demand in 2024 is expected to significantly transition to HBM3, surpassing HBM2e directly. The estimated proportion for 2024 is around 60%.

Since manufacturers plan to introduce new HBM3e products in 2024, HBM3 and HBM3e are expected to become mainstream in the market next year.

TrendForce clarifies that the so-called HBM3 in the current market should be subdivided into two categories based on speed. One category includes HBM3 running at speeds between 5.6 to 6.4 Gbps, while the other features the 8 Gbps HBM3e, which also goes by several names including HBM3P, HBM3A, HBM3+, and HBM3 Gen2.

HBM3e will be stacked with 24Gb mono dies; in an 8-layer (8Hi) configuration, the capacity of a single HBM3e stack reaches 24GB.
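
As a rough illustration of where such capacity and speed figures come from, the sketch below multiplies die density by stack height and converts per-pin data rate into per-stack bandwidth, assuming the standard 1024-bit HBM interface (an assumption not stated in the article).

```python
# Rough illustration of per-stack HBM capacity and bandwidth.
# Assumes the standard 1024-bit interface per HBM stack; pin speeds and
# die densities are the figures quoted in the surrounding text.

def stack_capacity_gb(die_density_gbit: int, layers: int) -> float:
    """Capacity of one HBM stack in GB: die density (Gbit) x stacked dies / 8."""
    return die_density_gbit * layers / 8

def stack_bandwidth_gb_s(pin_speed_gbps: float, bus_width_bits: int = 1024) -> float:
    """Per-stack bandwidth in GB/s from per-pin data rate (Gb/s) and bus width."""
    return pin_speed_gbps * bus_width_bits / 8

print(stack_capacity_gb(24, 8))    # 24.0 GB  -> 8Hi HBM3e built from 24Gb dies
print(stack_capacity_gb(24, 12))   # 36.0 GB  -> the 12-Hi stack mentioned later in this section
print(stack_bandwidth_gb_s(6.4))   # 819.2 GB/s for 6.4 Gbps HBM3
print(stack_bandwidth_gb_s(8.0))   # 1024.0 GB/s for 8 Gbps HBM3e
```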

According to TrendForce’s previous news release, the three major manufacturers currently leading the HBM competition – SK hynix, Samsung, and Micron – have made the following progress.

SK hynix and Samsung began their efforts with HBM3, which is used in NVIDIA’s H100/H800 and AMD’s MI300 series products. The two manufacturers were previously expected to sample HBM3e in Q1 2024. Meanwhile, Micron chose to skip HBM3 and develop HBM3e directly.

However, according to the latest TrendForce survey, Micron had already provided NVIDIA with HBM3e samples for verification by the end of July this year, while SK hynix did so in mid-August, and Samsung in early October.

(Image: Nvidia)

 

2023-11-13

[News] YMTC Files Lawsuit Against Micron Alleging Patent Infringement Over 3D NAND Technology Battle

Mainland China’s 3D NAND flash memory manufacturer, Yangtze Memory Technologies Co. (YMTC), filed a lawsuit against the U.S. memory chip leader Micron Technology on November 9th in the U.S. District Court for the Northern District of California. The lawsuit accuses Micron of infringing upon eight of YMTC’s U.S. patents related to 3D NAND technology.

According to ICsmart, the YMTC patents involved in this case include US10,950,623 (3D NAND memory device and method of forming the same), US11,501,822 (Non-volatile storage device and control method), US10,658,378 (Through-array contact [TAC] for three-dimensional memory devices), US10,937,806 (Through-array contact [TAC] for three-dimensional memory devices), US10,861,872 (Three-dimensional memory device and method for forming the same), US11,468,957 (Architecture and method for NAND memory operation), US11,600,342 (Method for reading three-dimensional flash memory), and US10,868,031 (Multiple-stack three-dimensional memory device and fabrication method thereof).

In the complaint, YMTC alleges that Micron’s 128-layer, 176-layer, and other series of 3D NAND products violate eight patents owned by YMTC. Micron is accused of using YMTC’s patented technology without authorization to compete with YMTC, protect its market share, and harm YMTC’s interests, thereby inhibiting innovation.

In recent years, with the stacking of 3D NAND technology reaching 128 layers and even higher, the chip area occupied by peripheral CMOS circuits may exceed 50%. To address this issue, YMTC introduced its proprietary innovative Xtacking technology in 2018.

Established in July 2016 and headquartered in Wuhan, Hubei, YMTC is an IDM (Integrated Device Manufacturer) specializing in the design and manufacturing of 3D NAND flash memory. It also provides comprehensive memory solutions.

Under the shadow of the ongoing US-China tech rivalry, Micron Technology adopted a low-key approach at this year’s Import Expo in Shanghai. During a meeting with Micron CEO Sanjay Mehrotra on November 1st, Chinese Minister of Commerce Wang Wentao welcomed Micron’s continued presence and expansion in the Chinese market, emphasizing the importance of adhering to Chinese laws and regulations for sustainable development. Mr. Mehrotra expressed the company’s willingness to further invest in China.

However, on May 21st this year, China’s Cyberspace Administration announced that Micron’s products sold in China had serious cybersecurity issues and had failed its security review, leading Chinese operators to halt purchases of Micron’s products. This amounts to a potential ban on Micron’s products in the Chinese market.

In October 2022, the US imposed export restrictions on advanced chip manufacturing equipment, including placing 36 Chinese companies such as YMTC on an entity list.

(Photo credit: iStock)

2023-11-08

[News] Seizing the AI Trend! Revealing Samsung and Micron’s HBM Expansion Timetable

In a subdued environment for consumer electronic applications in the storage market, High Bandwidth Memory (HBM) technology is emerging as a new driving force, gaining significant attention from major players. Recent reports reveal that both Samsung and Micron are gearing up for substantial HBM production expansion.

Major Manufacturers Actively Investing in HBM

Recent reports indicate that Samsung has acquired certain buildings and equipment within the Cheonan facility of Samsung Display in South Korea to expand its HBM production capacity.

It is reported that Samsung plans to establish a new packaging line at the Cheonan facility for large-scale HBM production. The company has already spent 10.5 billion Korean won on the acquisition of the mentioned buildings and equipment, with an additional investment expected to range between 700 billion and 1 trillion Korean won.

Earlier, it was disclosed by Mr. Hwang Sang-jun, the Vice President of Samsung Electronics and Head of the DRAM Product and Technology Team, that Samsung has developed HBM3E with a speed of 9.8Gbps and plans to commence providing samples to customers.

Concurrently, Samsung is developing HBM4 with the objective of making it available by 2025. It is reported that Samsung Electronics is actively working on various technologies for HBM4, including non-conductive adhesive film (NCF) assembly techniques optimized for high-temperature thermal characteristics and hybrid copper bonding (HCB).

On November 6th, Micron Technology opened a new facility in Taichung. Micron has stated that this new facility will integrate advanced testing and packaging functions and will be dedicated to the mass production of HBM3E, along with other products. This expansion aims to meet the increasing demand across various applications such as artificial intelligence, data centers, edge computing, and cloud services.

Previously, Micron’s CEO, Sanjay Mehrotra, revealed that the company plans to commence substantial shipments of HBM3E in early 2024. Micron’s HBM3E technology is currently undergoing certification by NVIDIA. The initial HBM3E offerings will feature an 8-Hi stack design with a capacity of 24GB and a bandwidth exceeding 1.2TB/s.

Furthermore, Micron intends to introduce a larger-capacity 36GB 12-Hi stack HBM3E in 2024. In an earlier statement, Micron had anticipated that the new HBM technology would contribute “hundreds of millions” of dollars in revenue by 2024.

Shift Toward HBM3 Expected in 2024

According to TrendForce, the current mainstream technology in the HBM market is HBM2e. This specification is utilized by prominent players like NVIDIA with their A100 and A800, AMD with the MI200 series, and various custom system-on-chip designs by CSPs.

Simultaneously, in response to the evolving demand for AI accelerator chips, many manufacturers are planning to introduce new products based on HBM3e technology in 2024. It is anticipated that both HBM3 and HBM3e will become the dominant technologies in the market next year, catering to the requirements of AI accelerator chips.

Regarding the demand for different generations of HBM, TrendForce believes that the primary demand is shifting from HBM2e to HBM3 in 2023, with an anticipated demand ratio of approximately 50% and 39%, respectively. As the usage of HBM3-based accelerator chips continues to increase, the market demand is expected to see a substantial shift towards HBM3 in 2024.

It is anticipated that in 2024, HBM3 will surpass HBM2e, with an estimated share of 60%. This transition to HBM3 is expected to be accompanied by higher average selling prices (ASP), significantly boosting next year’s HBM revenue.


(Photo credit: Samsung)
