In 2025, the global AI chip market centers on high-end HBM memory; NVIDIA's new Blackwell platform drives growth amid geopolitical constraints and steady AI server demand, while HBM technology evolves rapidly toward HBM4 in 2026.
3Q25 memory prices rise above forecasts as supply tightens: DDR4 surges, newer products rise moderately, and NAND Flash gains only in the enterprise segment.
DDR4 prices rise on EOL effects and buying demand concentrated among certain players, while DDR5 sees slight growth as CSP orders return, with pricing holding steady. AMD's growing server market penetration benefits high-speed DRAM and new process nodes, improving profit potential for manufacturers.
North American CSPs and OEMs drive AI market growth, and shipments of the new Blackwell platform will expand. China's market faces uncertainty as geopolitics affects the supply of AI solutions. Overall AI server shipments are expected to maintain double-digit growth.
Cloud giants are accelerating self-developed AI ASICs, expanding both market scale and project counts. Driven by internal needs and geopolitics, ASIC share will rise, with next-generation ASICs from major cloud providers expected to ramp up in 2026, a key growth year.
HBM4 pricing shows a premium over HBM3e due to increased complexity and cost in manufacturing and design.
Amazon is rapidly expanding its self-developed AI ASIC servers to bolster AWS's competitiveness in cloud AI training. The new-generation Trainium chips feature diversified designs catering to various training needs. On the NVIDIA platform, Delta leads as the main power supplier, advancing high-power AI server development.
Comprehensive quarterly reports detail market trends, supplier comparisons, and forecast data, providing critical analysis of the HBM production overview, specification outlook, and shipment forecasts.
As 2025 progresses, CSPs continue to build out AI infrastructure, with Meta driving growth in demand for the B300 and GB300...
Global market intelligence firm TrendForce reports that NVIDIA achieved a total revenue of US$44.1 billion for FY1Q26...