HBM


2023-11-08

[News] Seizing the AI Trend! Revealing Samsung and Micron’s HBM Expansion Timetable

While demand from consumer electronics applications in the memory market remains subdued, High Bandwidth Memory (HBM) technology is emerging as a new driving force and gaining significant attention from major players. Recent reports reveal that both Samsung and Micron are gearing up for substantial HBM production expansion.

Major Manufacturers Actively Investing in HBM

Recent reports indicate that Samsung has acquired certain buildings and equipment within the Cheonan facility of Samsung Display in South Korea to expand its HBM production capacity.

It is reported that Samsung plans to establish a new packaging line at the Cheonan facility for large-scale HBM production. The company has already spent 10.5 billion Korean won on the acquisition of the mentioned buildings and equipment, with an additional investment expected to range between 700 billion and 1 trillion Korean won.

Earlier, Hwang Sang-jun, Vice President of Samsung Electronics and Head of the DRAM Product and Technology Team, disclosed that Samsung has developed HBM3E with a speed of 9.8Gbps and plans to begin providing samples to customers.

Concurrently, Samsung is in the process of developing HBM4 with the objective of making it available by 2025. It is reported that Samsung Electronics is actively working on various technologies for HBM4, including non-conductive adhesive film (NCF) assembly techniques optimized for high-temperature thermal characteristics and hybrid bonding (HCB).

On November 6th, Micron Technology opened a new facility in Taichung. Micron has stated that this new facility will integrate advanced testing and packaging functions and will be dedicated to the mass production of HBM3E, along with other products. This expansion aims to meet the increasing demand across various applications such as artificial intelligence, data centers, edge computing, and cloud services.

Previously, Micron’s CEO, Sanjay Mehrotra, revealed that the company plans to commence substantial shipments of HBM3E in early 2024. Micron’s HBM3E technology is currently undergoing certification by NVIDIA. The initial HBM3E offerings will feature an 8-Hi stack design with a capacity of 24GB and a bandwidth exceeding 1.2TB/s.
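The quoted per-stack figures line up with the standard HBM interface math. As a rough sanity check, here is a minimal Python sketch, assuming the usual 1024-bit interface per HBM stack and an illustrative per-pin rate of about 9.6 Gbps (the exact pin speed of Micron's parts is not stated above):

```python
# Back-of-the-envelope check of the quoted HBM3E per-stack figures.
# Assumptions (illustrative, not an official spec sheet):
#   - a standard HBM stack exposes a 1024-bit interface
#   - a per-pin data rate of ~9.6 Gbps, in the range reported for HBM3E parts

interface_width_bits = 1024          # bits per HBM stack interface
pin_rate_gbps = 9.6                  # assumed per-pin data rate (Gbps)
die_capacity_gbit = 24               # 24Gb per DRAM die
stack_height = 8                     # 8-Hi stack

bandwidth_gbps = interface_width_bits * pin_rate_gbps   # gigabits per second
bandwidth_tbs = bandwidth_gbps / 8 / 1000               # terabytes per second
capacity_gb = die_capacity_gbit * stack_height / 8      # gigabytes per stack

print(f"~{bandwidth_tbs:.2f} TB/s per stack")  # ~1.23 TB/s, i.e. "exceeding 1.2TB/s"
print(f"{capacity_gb:.0f} GB per 8-Hi stack")  # 24 GB
```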

Furthermore, Micron intends to introduce larger-capacity 36GB 12-Hi stack HBM3E in 2024. In an earlier statement, Micron had anticipated that the new HBM technology would contribute “hundreds of millions” of dollars in revenue by 2024.

Shift Toward HBM3 Expected in 2024

According to TrendForce, the current mainstream technology in the HBM market is HBM2e. This specification is utilized by prominent players like NVIDIA with their A100 and A800, AMD with the MI200 series, and various custom system-on-chip designs by CSPs.

Simultaneously, in response to the evolving demand for AI accelerator chips, many manufacturers are planning to introduce new products based on HBM3e technology in 2024. It is anticipated that both HBM3 and HBM3e will become the dominant technologies in the market next year, catering to the requirements of AI accelerator chips.

Regarding the demand for different generations of HBM, TrendForce believes that the primary demand is shifting from HBM2e to HBM3 in 2023, with anticipated demand shares of approximately 50% and 39%, respectively. As the usage of HBM3-based accelerator chips continues to increase, the market demand is expected to see a substantial shift towards HBM3 in 2024.

It is anticipated that in 2024, HBM3 will surpass HBM2e, with an estimated share of 60%. This transition to HBM3 is expected to be accompanied by higher average selling prices (ASP), significantly boosting next year’s HBM revenue.


(Photo credit: Samsung)

2023-11-03

TrendForce Foresees China’s Mature Wafer Processes to Expand to 33% by 2027, Japan Secures Advanced Processes

The research institution TrendForce held its Annual Forecast 2024 Seminar on November 3, where it delved into discussions about global wafer foundry trends, the applications of AI, the dynamics of AI servers, and the demand for High Bandwidth Memory (HBM).

Joanne Chiao, analyst at TrendForce, observed that while AI servers have experienced robust growth over the past two years, AI chips account for just 4% of wafer consumption, limiting their impact on the overall wafer industry. Nevertheless, both advanced and mature processes offer business opportunities: the former benefits from the desire of companies like CSPs to develop customized chips, leading them to seek the assistance of design service providers, while the latter can consider venturing into sectors such as power management ICs and I/O solutions.

Persistent US export restrictions continue to affect China’s foundries, causing delays in their expansion plans. Furthermore, the regionalization of wafer foundry services is exacerbating issues related to uneven resource distribution.

Due to lackluster end-market demand and fierce market competition, the capacity utilization rate of 8-inch wafer foundries is expected to continue declining into the first quarter of the upcoming year. Inventory adjustments are underway in the fields of industrial control and automotive electronics. Chinese foundries are more willing to offer competitive prices and are outperforming their counterparts in Taiwan and Korea in terms of order performance.

In the realm of 12-inch wafer foundry services, success relies on technological leadership and exclusivity, and competition isn’t as intense as with 8-inch wafers. A moderate recovery is expected in the latter part of this year, driven by inventory replenishment, demand for the iPhone 15 and select Android smartphone brands, and the need for AI chips.

TrendForce indicates that, with the expansion of processes beyond 28nm, mature process capacity is expected to occupy less than 70% of the capacity of the top ten foundries by 2027. Under the pressure to transition towards mature processes, China is anticipated to account for 33% of mature process capacity by 2027, with the possibility of further increases.

It’s noteworthy that Japan is actively promoting the revival of its semiconductor industry and, through incentives for foreign companies establishing fabs, may secure 3% of advanced process capacity.

TrendForce analyst Frank Kung predicts that shipments of Nvidia’s high-end GPUs will exceed 1.5 million units this year, representing YoY growth of over 70%, with growth expected to reach 90% in 2024. Starting from the latter half of this year, Nvidia’s high-end GPU lineup will transition primarily to the H100. As for AMD, its high-end AI solutions are mainly targeted at CSPs and supercomputers, and AI servers equipped with the MI300 are expected to see significant expansion in the latter half of this year.

In the 2023-2024 period, major CSPs are poised to become the primary drivers of AI server demand, with Microsoft, Google, and AWS ranking among the top three. Additionally, the robust demand for cloud-based AI training is expected to propel the growth of advanced AI chips, which may, in turn, stimulate growth in power management or high-speed transmission-related ICs in the future.

Lastly, concerning HBM, TrendForce’s senior research vice president, Avril Wu, mentioned that as Nvidia’s H100 gradually gains momentum, HBM3 is set to become the industry standard in the latter half of this year. With the launch of B100 next year, HBM3e is poised to replace HBM3 as the mainstream memory in the latter half of the following year. Overall, HBM plays a pivotal role in DRAM revenue, with expectations of an increase from 9% in 2023 to 18% in 2024, potentially leading to higher DRAM prices in the coming year.

(Image: TechNews)

2023-10-26

[News] Thanks to AI demand, SK hynix’s Q3 DRAM business turned profitable

SK hynix today reported the financial results for the third quarter ended September 30, 2023. The company recorded revenues of 9.066 trillion won, operating losses of 1.792 trillion won and net losses of 2.185 trillion won in the three-month period. The operating and net margins were a negative 20% and 24%, respectively.

After bottoming out in the first quarter, the business has been on a steady recovery track, helped by growing demand for products such as high-performance memory chips, the company said.

“Revenues grew 24%, while operating losses narrowed 38%, compared with the previous quarter, thanks to strong demand for high-performance mobile flagship products and HBM3, a key product for AI applications, and high-capacity DDR5,” the company said, adding that a turnaround of the DRAM business following two quarters of losses is a particularly encouraging sign.

SK hynix attributed the growth in sales to increased shipments of both DRAM and NAND and a rise in the average selling price.

By product, DRAM shipments increased 20% from the previous quarter, thanks to strong sales of high-performance products for server applications such as AI, with the average selling price also recording a 10% rise. NAND shipments also rose, led by high-capacity mobile products and solid-state drive products.

Following the turnaround, the improvement in the DRAM business is forecast to gain speed, backed by the popularity of generative AI technology, while there are emerging signs of a steady recovery in the NAND space as well.

With the effects of production cuts by global memory providers starting to be seen, and with customers placing new orders after working to reduce inventories, semiconductor prices are starting to stabilize, the company said.

To meet new demands, SK hynix plans to increase investments in high-value flagship products such as HBM, DDR5, and LPDDR5. The company will increase the share of products manufactured on the 1a nm and 1b nm nodes, the fourth and fifth generations of the 10nm-class process, respectively, while also increasing investments in HBM and TSV.

(Image: SK hynix)

2023-10-17

2024 Tech Trends Projection Revealed, TrendForce: AI Continues as the Main Focus

As 2023 draws to a close, TrendForce has revealed its tech trend projections for every sector. AI clearly remains the main focus, setting the direction for the tech supply chain over the next few years. Here are the key takeaways:

CSPs increase AI investment, driving a 38% growth in AI server shipments by 2024

  • Major CSPs like Microsoft, Google, and AWS are driving this growth due to the rising popularity of AI applications, pushing AI server shipments to 1.2 million units in 2023.

HBM3e set to drive an annual increase of 172% in HBM revenue

  • Major memory suppliers are set to introduce HBM3e with faster speeds (8 Gbps) to enhance the performance of AI accelerator chips in 2024–2025.
  • HBM integration is becoming common among GPU manufacturers like NVIDIA and AMD, and it is expected that HBM will significantly contribute to memory suppliers’ revenues in 2024, with an annual growth rate of 172%.

Rising demand for advanced packaging in 2024, the emergence of 3D IC technology

  • Leading semiconductor firms like TSMC, Samsung, and Intel are emphasizing advanced packaging technology’s importance in boosting chip performance, conserving space, reducing power usage, and minimizing latency. They’re establishing 3D IC research centers in Japan to underscore this role.
  • Generative AI is driving increased demand for 2.5D packaging technology, which integrates computing chips and memory with a silicon interposer layer. Additionally, 3D packaging solutions like TSMC’s SoIC, Samsung’s X-Cube, and Intel’s Foveros are gaining traction.

NTN set to begin small-scale commercial tests, with broader applications of the technology on the way in 2024

  • Collaboration between satellite operators, semiconductor firms, telecom operators, and smartphone makers is growing due to increased satellite deployments by operators. This collaboration focuses on mobile satellite communication applications and bidirectional data transmission under specific conditions.
  • Major semiconductor manufacturers are ramping up efforts in satellite communication chips, leading top smartphone manufacturers to integrate satellite communication into high-end phones using the SoC model, which is expected to drive small-scale commercial testing of NTN networks and promote widespread adoption of NTN applications.

6G communication to begin in 2024, with satellite communication taking center stage

  • 6G standardization begins around 2024-2025, with initial technologies expected by 2027-2028. This enables novel applications like Reconfigurable Intelligent Surfaces (RIS), terahertz bands, Optical Wireless Communication (OWC), NTN for high-altitude comms, and immersive Extended Reality (XR) experiences.
  • Low-orbit satellites will play a key role in 6G as its standards solidify, peaking around the time of 6G commercialization. The use of drones for 6G communication and environmental sensing is also set to surge in the 6G era.

Innovative entrants drive cost optimization for Micro LED technology in 2024

  • In 2023, the focus in Micro LED display technology is on cost reduction through chip downsizing, aiming for at least a 20-25% annual reduction. A hybrid transfer approach, combining stamping and laser bonding, is gaining attention for efficient mass production.
  • Micro LED holds potential in micro-projection displays for transparent AR lenses. Challenges include achieving ultra-high PPI with 5 µm or smaller chips, particularly given red LEDs’ low efficiency. Various innovative approaches are being explored, such as InGaN-based red LEDs and vertically stacked RGB LEDs.

Intensifying competition in AR/VR micro-display technologies

  • Increasing AR/VR headset demand drives demand for ultra-high PPI near-eye displays, with Micro OLED technology at the forefront, poised for broader adoption.
  • Challenges in brightness and efficiency impact Micro OLED displays, and their dominance in the head-mounted display market depends on the development of various micro-display technologies.

Advancements in material and component technologies are propelling the commercialization of gallium oxide

  • Gallium oxide (Ga₂O₃) is gaining prominence for next-gen power semiconductor devices due to its potential in high-voltage, high-temperature, and high-frequency applications in EVs, electrical grids, and aerospace.
  • The industry is already producing 4-inch gallium oxide mono-crystals and advancing Schottky diode and transistor fabrication processes, with the first Schottky diode products expected by 2024.

Solid-state batteries poised to reshape the EV battery landscape over the next decade

  • Major automakers and battery manufacturers are investing in solid-state and semi-solid-state battery technologies, aiming for a new cycle of technological iteration by 2024.
  • Beyond Li-ion batteries, sodium-ion batteries, with their lower energy density, are suitable for budget-friendly EVs, while hydrogen fuel cells offer long range and zero emissions, primarily for heavy-duty commercial vehicles; despite remaining challenges, widespread adoption is expected after 2025.

BEVs in 2024 rely on power conversion efficiency, driving range, and charging efficiency

  • Automakers are optimizing battery pack structures and material ratios to increase energy density and driving range. Solid-state batteries, with high energy density, may see limited installations in vehicles as semi-solid batteries in 2H23.
  • The 800V platform will enable high-power fast charging, leading to the expansion of high-power charging stations. AI advancements are driving EVs toward advanced autonomous driving, with Tesla’s Dojo supercomputer investing in neural network training to maintain its position in the intelligent driving market.

Green solutions with AI simulations emerging as a linchpin for renewable energy and decarbonized manufacturing

  • Against the backdrop of optimizing energy consumption, creating interconnected data ecosystems, and visualizing energy flow and consumption, carbon auditing tools and AI are key for organizations aiming to reduce carbon emissions and enhance sustainability.
  • The IEA predicts global renewable energy capacity to reach 4,500 GW by 2024, driven by policy support, rising fossil fuel prices, and energy crises. The adoption of AI-driven smart technologies in peripheral systems is expected to support stable energy generation.

OLED set to expand across various applications, driven by foldable phone innovation

  • OLED folding phones are improving in design by using lightweight materials, innovative hinge structures, and cost-reduction efforts to approach the thickness and weight of traditional smartphones.
  • In the IT sector, industry players like Samsung, BOE Technology, JDI, and Visionox are making significant investments and developments in OLED technology to expand into various markets. Anticipated advancements in technology and materials are expected to increase OLED market penetration by 2025.

2023-10-12

Continuous Rise in HBM Demand, Memory Giants Expecting HBM4 Delivery in 2025

Amidst the AI boom, HBM technology steps into the spotlight as market demand continues to surge. Global market research firm TrendForce anticipates a 58% year-on-year increase in HBM demand in 2023, with a potential additional growth of approximately 30% in 2024.

Compared to traditional DRAM, HBM (High Bandwidth Memory) boasts advantages such as high bandwidth, high capacity, low latency, and low power consumption. These attributes accelerate AI data processing and make it particularly well-suited for high-performance computing scenarios like ChatGPT. As a result, it has gained popularity, and major storage manufacturers are actively driving HBM technology upgrades.

Leading memory manufacturers are intensifying their efforts, with Samsung set to introduce HBM4.

Since the inception of the first HBM products utilizing TSV packaging technology in 2014, HBM technology has seen multiple upgrades, including HBM, HBM2, HBM2E, HBM3, and HBM3e.

SK hynix and Samsung, the two major South Korean memory companies, have been at the forefront of HBM3 development; NVIDIA’s H100/H800 and AMD’s MI300 series represent HBM3’s progress. Both SK hynix and Samsung are expected to offer HBM3e samples by the first quarter of 2024. On the other hand, Micron, a U.S.-based memory company, is bypassing HBM3 and directly pursuing HBM3e.

HBM3e will feature 24Gb mono die stacks, and with an 8-layer (8Hi) configuration, a single HBM3e chip’s capacity will soar to 24GB. This advancement is expected to be incorporated into NVIDIA’s GB100 in 2025, leading the three major OEMs to plan HBM3e sample releases in the first quarter of 2024 and enter mass production in the latter half of the year.
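For reference, the per-stack capacities quoted here and for the 12-Hi parts mentioned earlier follow directly from the die density and the stack height. A minimal sketch of the arithmetic:

```python
# Per-stack capacity from a 24Gb mono die: gigabits per die times the number
# of stacked dies, divided by 8 to convert bits to bytes.
die_gbit = 24                              # 24Gb per DRAM die

for layers in (8, 12):                     # 8-Hi and 12-Hi configurations
    capacity_gb = die_gbit * layers / 8
    print(f"{layers}-Hi stack: {capacity_gb:.0f} GB")
# 8-Hi stack: 24 GB
# 12-Hi stack: 36 GB
```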

In addition to HBM3 and HBM3e, the latest updates indicate that storage giants are planning the launch of the next generation of HBM—HBM4.

Samsung recently announced that it has developed 9.8Gbps HBM3E and is planning to provide samples to customers. Furthermore, Samsung is actively working on HBM4 with a goal to begin supply in 2025. It’s reported that Samsung Electronics is developing technologies such as non-conductive adhesive film (NCF) assembly for optimizing high-temperature thermal characteristics, as well as hybrid bonding (HCB), for HBM4 products.

In September, Korean media reported that Samsung is gearing up to revamp its production process and launch HBM4 products to capture the rapidly growing HBM market. HBM4 memory stacks will feature a 2048-bit memory interface, a significant departure from the previous 1024-bit interface for all HBM stacks. This enhanced interface width holds great significance for the evolution of HBM4.
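To see why the wider interface is significant: at a fixed per-pin data rate, per-stack bandwidth scales linearly with interface width, so doubling the bus from 1024 to 2048 bits roughly doubles throughput. A small illustrative sketch, borrowing the 9.8Gbps HBM3E pin speed quoted above purely as a placeholder (HBM4 pin rates are not specified here):

```python
# Bandwidth scaling with interface width at a fixed per-pin rate.
# The 9.8 Gbps value is the HBM3E speed mentioned above, used only for
# illustration; actual HBM4 pin rates are not given in this article.
pin_rate_gbps = 9.8

for name, width_bits in (("HBM3e-style, 1024-bit", 1024), ("HBM4-style, 2048-bit", 2048)):
    tb_per_s = width_bits * pin_rate_gbps / 8 / 1000   # bits -> bytes, G -> T
    print(f"{name}: ~{tb_per_s:.2f} TB/s per stack")
# HBM3e-style, 1024-bit: ~1.25 TB/s per stack
# HBM4-style, 2048-bit: ~2.51 TB/s per stack
```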

While HBM4 promises a major breakthrough, it is still a ways off, making it too early to discuss its practical applications and widespread adoption. Industry experts emphasize that the current HBM market is dominated by HBM2e. However, HBM3 and HBM3e are poised to take the lead in the near future.

According to TrendForce’s research, HBM2e currently accounts for the mainstream market share, being used in various products like NVIDIA A100/A800, AMD MI200, and many AI accelerators developed by CSPs. To keep pace with the evolving demands of AI accelerator chips, OEMs are planning to introduce new HBM3e products in 2024, with HBM3 and HBM3e expected to become the market’s primary players next year.

In terms of the demand transition between different HBM generations, TrendForce estimates that in 2023, mainstream demand will shift from HBM2e to HBM3, with estimated demand shares of approximately 50% and 39%, respectively. As more HBM3-based accelerator chips enter the market, demand will substantially shift toward HBM3 in 2024, surpassing HBM2e and accounting for an estimated 60% of the market. This transition, coupled with higher average selling prices (ASP), is poised to significantly drive HBM revenue growth next year.

(Photo credit: Samsung)
