The global NAND Flash industry continued to benefit from AI infrastructure build-outs in 4Q25, according to TrendForce’s latest research. Combined revenue of the top five NAND Flash suppliers rose sharply by 23.8% QoQ to US$21.17 billion.
TrendForce’s latest findings reveal that the expansion of AI applications from LLM training to inference has prompted CSPs to broaden data center build-outs beyond AI servers to include general-purpose servers.
Global CSPs are accelerating investment in AI servers and infrastructure to support expanding AI deployment and upgrades, according to TrendForce’s latest findings on the AI server market. Combined capital expenditures by the world’s eight leading CSPs—Google, AWS, Meta, Microsoft, Oracle, Tencent, Alibaba, and Baidu—are projected to exceed US$710 billion in 2026, representing approximately 61% YoY growth.
TrendForce’s latest analysis of the HBM industry reveals that as the ongoing expansion of AI infrastructure continues to fuel GPU demand, NVIDIA’s upcoming Rubin platform is expected to become a major catalyst for HBM4 adoption once mass production begins. The three leading memory suppliers—Samsung, SK hynix, and Micron—are in the final stages of HBM4 validation, with completion anticipated by 2Q26.
Google’s next-generation TPU, Ironwood, integrates a 3D Torus network topology with the Apollo optical circuit switch (OCS) all-optical network, marking a major step forward in AI data-center interconnect design. TrendForce’s latest research on the high-speed interconnect market indicates that this architecture will directly address the surging compute and bandwidth demands driven by large-scale AI workloads.