GPU


2023-07-31

High-Tech PCB Manufacturers Poised to Gain from Remarkable Increase in AI Server PCB Revenue

Looking at the impact of AI server development on the PCB industry, mainstream AI servers incorporate 4 to 8 GPUs, far more than general servers. The need for high-frequency, high-speed data transmission increases the number of PCB layers and drives an upgrade in the grade of CCL (copper-clad laminate) adopted. This surge in GPU integration pushes AI server PCB output value to several times that of general servers. However, the advancement also raises technological barriers, presenting an opportunity for high-tech PCB manufacturers to benefit.

TrendForce’s perspective: 

  • The increased value of AI server PCBs primarily comes from GPU boards.

Taking the NVIDIA DGX A100 as an example, its PCBs can be divided into CPU boards, GPU boards, and accessory boards. The overall PCB value is about 5 to 6 times that of a general server, with approximately 94% of the incremental value attributed to the GPU boards. This is mainly because general servers typically do not include GPUs, while the NVIDIA DGX A100 is equipped with 8 GPUs.

Further analysis reveals that the CPU section, consisting of CPU boards, CPU mainboards, and functional accessory boards, makes up about 20% of the overall AI server PCB value. The GPU section, which includes GPU boards, NVSwitch, OAM (OCP Accelerator Module), and UBB (Universal Baseboard), accounts for around 79%. Accessory boards, covering components such as power supplies, HDDs, and cooling systems, contribute only about 1%.
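To see why the GPU section dominates the incremental value, a rough back-of-the-envelope check helps. The baseline general-server PCB value of 1 unit and the 5.5x multiplier below are illustrative assumptions; only the section shares come from the figures above.

```python
# Rough consistency check of the AI server PCB value breakdown (illustrative).
# Assumptions: a general server's PCB value is normalized to 1 unit, the AI
# server multiplier is taken as 5.5x (midpoint of the 5-6x range above), and
# general servers carry no GPU-board value at all.
general_server_pcb = 1.0
ai_server_pcb = 5.5 * general_server_pcb          # midpoint of the 5-6x range

shares = {"CPU section": 0.20, "GPU section": 0.79, "accessory boards": 0.01}
ai_values = {k: v * ai_server_pcb for k, v in shares.items()}

increment = ai_server_pcb - general_server_pcb    # added value vs. a general server
gpu_share_of_increment = ai_values["GPU section"] / increment

print(ai_values)
print(f"GPU boards' share of the incremental value: {gpu_share_of_increment:.0%}")
# -> roughly 95-97% under these assumptions, in the same ballpark as the ~94% cited above
```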

  • The technological barriers of AI servers are rising, leading to a decrease in the number of suppliers.

Since AI servers require multi-card interconnections with more extensive and denser wiring than general servers, and AI GPUs have more pins and a larger number of memory chips, GPU board assemblies may reach 20 layers or more. As the layer count rises, the yield rate falls.

Additionally, due to the demand for high-frequency and high-speed transmission, CCL materials have evolved from Low Loss grade to Ultra Low Loss grade. As the technological barriers rise, the number of manufacturers capable of entering the AI server supply chain also decreases.

The current AI server supply chain breaks down as follows:

  • CPU boards: Ibiden, AT&S, Shinko, and Unimicron; mainboard PCBs: GCE and Tripod.
  • GPU boards: Ibiden.
  • OAM PCBs: Unimicron and Zhending, with GCE, ACCL, and Tripod currently undergoing certification; CCL supplied by EMC.
  • UBB PCBs: GCE, WUS, and ACCL; CCL supplied by TUC and Panasonic.

Regarding ABF boards, Taiwanese manufacturers have not yet won ABF substrate orders for NVIDIA AI GPUs. The main reason is the limited production volume of NVIDIA AI GPUs, with an estimated output of only about 1.5 million units in 2023. Additionally, Ibiden’s yield rate for ABF boards with 16 layers or more is approximately 10% to 20% higher than that of Taiwanese manufacturers. However, with TSMC’s continued expansion of CoWoS capacity, NVIDIA AI GPU production volume is expected to exceed 2.7 million units in 2024, and Taiwanese ABF board manufacturers are likely to gain a low single-digit percentage of market share.


2023-06-14

AI Servers: The Savior of the Supply Chain, Examining Key Industries

NVIDIA’s robust financial report reveals the true impact of AI on the technology industry, particularly in the AI server supply chain.

2023-06-02

Can MediaTek and NVIDIA Collaborate on Smartphone Chips?

Recently, there has been news of collaboration between NVIDIA and MediaTek. Speculation suggests that the future collaboration may extend to smartphone SoCs, allowing MediaTek to enhance the graphical computing and AI performance of Dimensity smartphone SoCs through NVIDIA’s GPU technology licensing.

Currently, the focus of this collaboration is primarily on NB SoC development, with some progress in the automotive-related chip sector. As for the scope of smartphone SoC collaboration, it is still under discussion, but the potential for related partnerships is worth noting.

In the announced collaboration between NVIDIA and MediaTek on NB SoC products, MediaTek is mainly responsible for the CPU, while other parts such as the GPU, DSP, ISP, and interface IP are provided by NVIDIA or external partners. NVIDIA holds the leading position, with MediaTek playing a supporting role in this collaboration.

Regarding the industry’s speculation about possible collaboration in smartphone SoC development, it is estimated that MediaTek will take the lead in the design. Therefore, it is necessary to explore the motivations behind MediaTek’s adoption of related technologies.

Firstly, since the Armv9 instruction set era, Arm’s reference GPU, Immortalis, has incorporated ray-tracing functionality, helping MediaTek’s flagship SoCs improve gaming performance. This indicates that optimizing gaming scenarios is a key development focus for SoC manufacturers.

However, for high-end gaming applications, the GPU performance of current smartphone SoCs still cannot sustain high frame rates at native resolution during gameplay. Simply stacking more GPU cores does raise computational power, but it puts pressure on device power consumption. In light of this, Qualcomm introduced Snapdragon Game Super Resolution (GSR) technology this year, which reduces power consumption while enhancing game graphics quality. MediaTek has not yet introduced a comparable technology, nor has Arm released one for Immortalis. Therefore, when it comes to GPU performance computing, MediaTek has an incentive to seek external collaboration.

Furthermore, with GPUs in smartphone SoCs upgrading rapidly, PC-level games are now being brought to smartphones, and industry players are promoting compatibility with graphics APIs, opening the door for NVIDIA, AMD, and even Intel to enter the mobile gaming market. Samsung has already partnered with AMD for its Exynos SoC GPU, while NVIDIA, which offers technology similar to Qualcomm’s Snapdragon GSR, becomes a logical cooperation partner for MediaTek.

TrendForce believes that if MediaTek integrates NVIDIA GPUs into Dimensity SoCs and leverages TSMC’s process power efficiency advantages, it could bring a new wave of excitement to MediaTek in the flagship or gaming device market, attracting consumer interest. However, despite the potential technical benefits of collaboration, considering the influence of geopolitical factors, MediaTek, which primarily sells its smartphone SoCs to Chinese customers, may ultimately abandon this collaboration option due to related policy risks.

2023-05-25

Server Specification Upgrade: A Bountiful Blue Ocean for ABF Substrates

ChatGPT’s debut has sparked a thrilling spec upgrade in the server market, which has breathed new life into the supply chain and unlocked unparalleled business opportunities. Amidst all this, the big winners look set to be the suppliers of ABF (Ajinomoto Build-up Film) substrates, who are poised to reap enormous benefits.

In the previous article, “AI Sparks a Revolution Up In the Cloud,” we explored how the surge in data volumes is driving AI server spec upgrades, as well as the cost issues that come with them. This time around, we’ll take a closer look at the crucial GPU and CPU platforms, focusing on how they can transform the ABF substrate market.

NVIDIA’s Dual-Track AI Server Chip Strategy Fuels ABF Consumption

In response to the vast data demands of fast-evolving AI servers, NVIDIA is leading the pack in defining the industry-standard specs.

Unlike standard GPU servers, where one CPU typically backs 2 to 6 GPUs, NVIDIA’s AI servers, geared toward deep learning (DL) and machine learning (ML), typically pair 2 CPUs with 4 to 8 GPUs, roughly doubling the ABF substrate usage of conventional GPU servers.

NVIDIA has devised a dual-track chip strategy, tailoring its offerings for the international and Chinese markets. The primary chip behind ChatGPT is NVIDIA’s A100. For China, in line with U.S. export regulations, NVIDIA introduced the A800, reducing interconnect speed from 600 GB/s (as on the A100) to 400 GB/s.

Their latest H100 GPU, manufactured on TSMC’s 4nm process, boasts AI training performance 9 times greater than its A100 predecessor and inference performance 30 times higher. To pair with the new H100, the H800 was also released, with interconnect speed capped at 300 GB/s. Notably, Baidu’s pioneering AI model, Wenxin, employs the A800 chip.

To stay competitive globally in AI, Chinese manufacturers are expected to pursue computational prowess on par with the H100 and A100 by integrating more A800 and H800 chips, a move that will further boost overall ABF substrate consumption.

With the chatbot boom, TrendForce predicts a 38.4% YoY increase in AI server shipments for 2023 and a robust CAGR of 22% from 2022 to 2026 – significantly outpacing typical single-digit server growth.
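As a quick illustration of what those growth rates imply, the sketch below compounds them over the forecast window; the 2022 shipment index of 1.0 is an arbitrary normalization, not a TrendForce figure.

```python
# Illustrative projection of AI server shipment growth from the rates cited above.
# Only the 38.4% YoY and 22% CAGR figures come from the text; the baseline is
# normalized to 1.0 and the rest is simple compounding.
baseline_2022 = 1.0
shipments_2023 = baseline_2022 * 1.384            # 38.4% YoY growth in 2023
shipments_2026 = baseline_2022 * (1 + 0.22) ** 4  # 22% CAGR over 2022-2026

print(f"2023 vs 2022: {shipments_2023:.2f}x")     # ~1.38x
print(f"2026 vs 2022: {shipments_2026:.2f}x")     # ~2.22x, i.e. more than double
```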

AMD, Intel Server Platforms Drive ABF Substrate Demand

Meanwhile, examining AMD and Intel’s high-end server platforms, we can observe how spec upgrades are propelling ABF substrate consumption forward.

  • AMD Zen 4:

Since 2019, AMD’s EPYC Zen 2 server processors have used Chiplet multi-chip packaging, which due to its higher conductivity and cooling demands, has consistently bolstered ABF substrate demand.

  • Intel Eagle Stream:

Intel’s advanced Eagle Stream Sapphire Rapids platform boasts 40-50% higher computation speed than its predecessor, the Whitley, and supports PCIe5, which triggers a 20% uptick in substrate layers. This platform employs Intel’s 2.5D EMIB tech and Silicon Bridge, integrating various chips to minimize signal transmission time.

The Sapphire Rapids lineup includes SPR XCC and the more advanced SPR HBM, with the latter’s ABF substrate area being 30% larger than the previous generation’s. The incorporation of EMIB’s Silicon Bridge within the ABF substrate increases lamination complexity and reduces overall yield. Simply put, for every 1% increase in Eagle Stream’s server market penetration, ABF substrate demand is projected to rise by 2%.

As server-grade ABF substrate upgrades continue to advance, production complexity, layer count, and area all increase correspondingly, and the average yield rate may fall from 60-70% to 40-50%. Therefore, the actual ABF substrate capacity required for future server CPU platforms will likely be more than double that of previous generations.
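A rough calculation shows how larger substrate area and falling yields compound into a capacity requirement of more than 2x. The midpoint yields and the assumption that a ~20% higher layer count proportionally reduces panel throughput are illustrative, not TrendForce figures.

```python
# Back-of-the-envelope estimate of required ABF capacity for a next-gen server platform.
# Required capacity scales with substrate area and inversely with yield; the layer
# throughput penalty is an illustrative assumption (more layers = fewer panels per line).
area_factor = 1.3                      # SPR HBM substrate area is ~30% larger (per the text)
layer_throughput_factor = 1.2          # assumed penalty from ~20% more layers
yield_prev, yield_next = 0.65, 0.45    # midpoints of the 60-70% and 40-50% ranges

capacity_multiple = area_factor * layer_throughput_factor * (yield_prev / yield_next)
print(f"Required capacity vs. previous generation: {capacity_multiple:.2f}x")  # ~2.25x
```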

ABF Substrate Suppliers Riding the Tide

By our estimates, the global ABF substrate market size is set to grow from $9.3 billion in 2023 to $15 billion in 2026 – a CAGR of 17%, underscoring the tremendous growth and ongoing investment potential in the ABF supply chain.
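For reference, the cited CAGR follows directly from the standard compound-growth formula; the quick check below uses only the start and end values above, with nothing else assumed.

```python
# Verify the cited CAGR from the start/end market sizes and the number of years.
start, end, years = 9.3, 15.0, 3           # USD billions, 2023 -> 2026
cagr = (end / start) ** (1 / years) - 1
print(f"CAGR 2023-2026: {cagr:.1%}")        # ~17.3%, consistent with the ~17% cited
```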

Currently, Taiwanese and Japanese manufacturers account for about 80% of global ABF substrate capacity. Major players – Japan’s Ibiden and Shinko, Austria’s AT&S, and Taiwan’s Unimicron, Nan Ya, and Kinsus – all treat expanding ABF substrate production capacity as a long-term strategy.

As we analyzed in another piece, “Chiplet Design: A Real Game-Changer for Substrates,” despite the recent economic headwinds, ABF substrate capacity expansion remains a solid trend, secured by the robust growth of high-end servers. Hence, the ability to precisely forecast capacity needs while improving production yields will be the key to competitiveness for all substrate suppliers.



2022-11-14

Scale of Global Market for ABF Substrates Will Grow at a CAGR of 16.4% in 2022~2026 Period; US Export Restrictions Will Influence Supply-Demand Dynamics of This Material

Shipments of CPUs, GPUs, and chipsets have been falling due to the weakening demand for PCs, gaming devices, and cryptocurrency mining machines. This recent development has also constrained the growth of the market for ABF substrates. Currently, the demand situation for this material is exhibiting signs of uncertainty.

Regarding the distribution of the demand for ABF substrates, applications that are driving growth are cloud services, AI, and automotive electronics. CPUs, GPUs, FPGAs, and switch ICs are chips that are deployed in servers purposed for a wide range of applications related to cloud services and endpoint AI technologies. Meanwhile, other AI-related applications require high-end ASICs. At the same time, more and more high-end SoCs and MCUs are embedded in vehicles. All in all, these aforementioned applications will spur the demand for ABF substrates. Additionally, package size continues to increase for high-performance ICs. This trend, too, will sustain the demand for ABF substrates over the long haul. By contrast, the PC market has matured, so the related demand is shrinking. From a long-term perspective, the influence of the PC market on the demand for ABF substrates will gradually wane.

TrendForce forecasts that the scale of the global market for ABF substrates will expand from US$9.3 billion in 2022 to US$17.1 billion in 2026, thus showing a CAGR of 16.4%. Due to the influence of the US technology export restrictions against China, the demand for ICs purposed for HPC will be higher than expected for the period from 4Q22 to 3Q23. This, in turn, will also further raise the demand for ABF substrates.

Then, starting from 4Q23, exports of HPC chips to China will start to slow down. However, demand will continue to grow for ASICs, AI chips, SoCs, and MCUs at that time. The growth in these application segments will offset some of the negative effect of the US export restrictions on the market for ABF substrates. In terms of the supply-demand dynamics of ABF substrates, a balance will gradually be attained in 2024. However, demand will get stronger in 2025 and 2026, so supply could tighten during that two-year period.
