Search Results

Press Releases
Demand for NVIDIA’s Blackwell Platform Expected to Boost TSMC’s CoWoS Total Capacity by Over 150% in 2024, Says TrendForce

2024/04/16

Semiconductors

However, the GH200 accounted for only approximately 5% of NVIDIA’s high-end GPU shipments. The supply chain has high expectations for the GB200, with projections suggesting that its shipments could exceed a million units in 2025, potentially making up nearly 40 to 50% of NVIDIA’s high-end GPU market.

Press Releases
Global Server Shipments Expected to Increase by 2.05% in 2024, with AI Servers Accounting For Around 12.1%, Says TrendForce

2024/02/29

Semiconductors

Global server shipments are estimated to reach approximately 13.654 million units in 2024, an increase of about 2.05% YoY. Meanwhile, the market continues to focus on the deployment of AI servers, with their shipment share estimated at around 12.1%.
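
A rough, back-of-the-envelope reading of these figures (an illustrative sketch, not part of TrendForce’s release; the variable names and the assumption that the 12.1% share applies to the full-year total are mine):

    # Back-of-the-envelope check of the quoted 2024 server figures.
    total_servers_2024_m = 13.654            # million units, 2024 estimate
    yoy_growth = 0.0205                      # ~2.05% YoY growth
    ai_server_share = 0.121                  # ~12.1% of shipments

    implied_2023_base_m = total_servers_2024_m / (1 + yoy_growth)
    implied_ai_servers_m = total_servers_2024_m * ai_server_share
    print(f"Implied 2023 base: ~{implied_2023_base_m:.2f} million units")        # ~13.38
    print(f"Implied 2024 AI servers: ~{implied_ai_servers_m:.2f} million units")  # ~1.65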

Press Releases
High-End AI Server Demand from North America’s Top Four CSPs Expected to Exceed 60% in 2024, Says TrendForce

2024/02/27

Semiconductors

TrendForce’s newest projections spotlight a 2024 landscape where demand for high-end AI servers—powered by NVIDIA, AMD, or other top-tier ASIC chips—will be heavily influenced by North America’s cloud service powerhouses. Microsoft (20.2%), Google (16.6%), AWS (16%), and Meta (10.8%) are predicted to collectively command over 60% of global demand, with NVIDIA GPU-based servers leading the charge.
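
The quoted shares can be summed to verify the “over 60%” claim (an illustrative check only; the dictionary below is my own construction):

    # Sum of the four CSPs' quoted shares of global high-end AI server demand.
    csp_shares = {"Microsoft": 20.2, "Google": 16.6, "AWS": 16.0, "Meta": 10.8}
    combined = sum(csp_shares.values())
    print(f"Combined share of the top four CSPs: {combined:.1f}%")   # ~63.6%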

Press Releases
2024 Commercialization of Copilot to Boost AI Server and AI PC Development, Says TrendForce

2024/01/17

Consumer Electronics, Semiconductors

TrendForce anticipates that 2024 will mark a significant expansion in edge AI applications, leveraging the groundwork laid by AI servers and branching into AI PCs and other terminal devices. The global AI server market—encompassing AI Training and AI Inference—is projected to exceed 1.6 million units, growing at an impressive rate of 40%. Additionally, CSPs are expected to ramp up their involvement in this sector.
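
Reading the 1.6-million-unit and 40%-growth figures together gives a rough sense of the implied 2023 base (an illustrative sketch; treating the 40% as simple YoY growth is my assumption):

    # If 2024 exceeds 1.6 million AI servers at ~40% growth, back out the 2023 base.
    ai_servers_2024_m = 1.6                  # million units, 2024 projection
    growth_rate = 0.40                       # ~40% YoY growth
    implied_2023_m = ai_servers_2024_m / (1 + growth_rate)
    print(f"Implied 2023 AI server base: ~{implied_2023_m:.2f} million units")   # ~1.14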

Press Releases
China Continues to Enhance AI Chip Self-Sufficiency, but High-End AI Chip Development Remains Constrained, Says TrendForce

2023/12/11

Semiconductors

Huawei’s subsidiary HiSilicon has made significant strides in the independent R&D of AI chips, launching the next-gen Ascend 910B. These chips are not only utilized in Huawei's public cloud infrastructure but also sold to other Chinese companies.

Press Releases
NVIDIA Experiences Strong Cloud AI Demand but Faces Challenges in China, with High-End AI Server Shipments Expected to Be Below 4% in 2024, Says TrendForce

2023/11/23

Semiconductors

Despite strong shipments of NVIDIA’s high-end GPUs—and the rapid introduction of compliant products such as the H20, L20, and L2—Chinese cloud operators are still in the testing phase, making substantial revenue contributions to NVIDIA unlikely in Q4. Gradual shipment increases are expected from the first quarter of 2024.

Press Releases
Strong Cloud AI Server Demand Propels NVIDIA’s FY2Q24 Data Center Business to Surpass 76% for the First Time, Says TrendForce

2023/08/24

Semiconductors

TrendForce believes that the primary driver behind NVIDIA’s robust revenue growth is its data center segment’s AI server-related solutions. Key products include AI-accelerated GPUs and the AI server HGX reference architecture, which serve as the foundational AI infrastructure for large data centers.

Press Releases
AI and HPC Demand Set to Boost HBM Volume by Almost 60% in 2023, Says TrendForce

2023/06/28

Semiconductors

TrendForce forecasts that global demand for HBM will experience almost 60% growth annually in 2023, reaching 290 million GB, with a further 30% growth in 2024.
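
Taken together, the quoted growth rates imply rough 2022 and 2024 figures (an illustrative projection; the simple compounding around the 2023 figure is my assumption, not TrendForce’s):

    # Project HBM demand around the quoted 2023 figure of 290 million GB.
    hbm_2023_m_gb = 290.0                    # million GB, 2023 forecast
    growth_into_2023 = 0.60                  # ~60% YoY growth in 2023
    growth_into_2024 = 0.30                  # ~30% YoY growth in 2024

    implied_2022_m_gb = hbm_2023_m_gb / (1 + growth_into_2023)
    implied_2024_m_gb = hbm_2023_m_gb * (1 + growth_into_2024)
    print(f"Implied 2022 base: ~{implied_2022_m_gb:.0f} million GB")      # ~181
    print(f"Implied 2024 demand: ~{implied_2024_m_gb:.0f} million GB")    # ~377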

Press Releases
Major CSPs Aggressively Constructing AI Servers and Boosting Demand for AI Chips and HBM, Advanced Packaging Capacity Forecasted to Surge 30~40% by 2024, Says TrendForce

2023/06/21

Semiconductors

TrendForce highlights that to augment the computational efficiency of AI servers and enhance memory transmission bandwidth, leading AI chip makers such as Nvidia, AMD, and Intel have opted to incorporate HBM. Presently, Nvidia’s A100 and H100 chips carry up to 80 GB of HBM2e and HBM3, respectively.