TrendForce has revised its 2025 server shipment forecast upward to nearly 7% YoY growth, driven by CSPs' expansion of cloud data centers and AI demand. North American CSPs are increasing orders for general-purpose servers, while Chinese CSPs are actively procuring AI servers. As a result, server ODMs' motherboard SMT line utilization rates have risen significantly. The development of technologies such as DeepSeek will further drive growth in the AI server market.
On January 30, 2025, during Tesla's Q4 2024 earnings call, the company revealed that its humanoid robot production line is expected to reach a monthly output of 10,000 units by 2026, with deliveries set to begin that same year. The first production line will initially have a capacity of 1,000 units per month, which will then be scaled up to 10,000 units per month. Tesla anticipates releasing the next-generation version of Optimus in H1 2026, with market sales commencing in H2 2026. Despite the expected high price, Elon Musk remains confident in strong demand. Once mass production reaches 1 million units annually, Tesla expects the production cost of Optimus to fall below $20,000 per unit, though the final price will depend on market feedback. On February 6, Tesla posted several job openings for the robot project at its Fremont, California facility. According to Tesla's official website, at least 12 roles are currently available, including positions for manufacturing engineers, production managers, and process supervisors. TrendForce notes that this move aligns with Tesla's goal of mass-producing robots by 2025, signaling that the company is accelerating its humanoid robot production plans.
According to research by global market research firm TrendForce, AI has experienced rapid development in recent years, with enterprises actively exploring ways to use related technologies to...
TrendForce's analysis indicates that DeepSeek reduces AI deployment costs through optimized algorithms, creating new opportunities for SMEs. While the short-term impact on the memory market is limited, in the long term DeepSeek will drive the popularization of edge AI applications, increasing DRAM capacity in PCs and smartphones and boosting demand for high-performance NAND Flash such as UFS 4.0. At the same time, DeepSeek may reduce demand for high-end GPUs and HBM, but it will increase demand for general-purpose server DRAM and enterprise SSDs.
According to the latest research by TrendForce, the NAND Flash market is still experiencing oversupply challenges in 1Q25, resulting in ongoing price declines...
TrendForce forecasts that despite continued oversupply in 1Q25, the NAND Flash market will see significant improvement in supply-demand dynamics in 2H25, leading to a price rebound. This is driven by suppliers' proactive production cuts, growing AI-related demand, and faster inventory depletion. Chinese government subsidies, AI server demand, and DeepSeek-spurred demand for AI PCs and smartphones will all contribute to the NAND Flash market recovery.
DeepSeek’s rapid rise not only highlights the role of algorithm optimization in balancing model performance and hardware efficiency but also signals a shift in AI development toward inference-driven applications. AI agents are expected to become the primary form of such applications, gradually evolving into systematic Agentic AI, enabling greater automation and goal-oriented task execution.
According to the latest supply chain survey by TrendForce, CSPs, OEMs, and other enterprises maintained strong growth momentum for AI servers in 2024...
TrendForce sees considerable uncertainty in its 2025 AI server shipment forecast, influenced by US export controls, the rise of DeepSeek, and GB rack preparation progress. The base case projects 28% growth; the bear case, 20-25%; and the bull case, nearly 35%. While DeepSeek may reduce demand for high-end AI training servers, it will drive the development of edge AI inference servers and encourage CSP investment in custom ASICs, benefiting long-term AI market growth. AI inference is expected to gradually approach a 50% market share.