[Insights] NVIDIA Maintains Confidence Despite Export Restrictions, Continues Unveiling New Products


2023-12-06 Semiconductors editor

On November 22, 2023, NVIDIA released its financial report for the third quarter of 2023 (FY3Q24: August 2023 to October 2023), reporting revenue of USD 18.1 billion. This represents a quarterly increase of 34% and a yearly increase of 206%.

NVIDIA also provided revenue guidance for the fourth quarter of 2023 (FY4Q24: November 2023 to January 2024), with a midpoint of USD 20 billion. This reflects a quarterly increase of 10.5% and a yearly increase of 231%.
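
For reference, these growth rates can be reproduced from NVIDIA's headline figures. The minimal sketch below assumes prior-period revenues (roughly USD 13.51 billion for FY2Q24, USD 5.93 billion for FY3Q23, and USD 6.05 billion for FY4Q23) drawn from NVIDIA's earlier public filings rather than from this article; small rounding differences come from the rounded inputs.

```python
# Sketch: reproducing the cited growth rates. Prior-period revenues are
# assumptions taken from NVIDIA's earlier public filings, not from this article.

def growth(current: float, prior: float) -> float:
    """Percentage change from prior to current."""
    return (current / prior - 1) * 100

FY3Q24_REVENUE = 18.12   # USD billion, reported this quarter
FY2Q24_REVENUE = 13.51   # USD billion, prior quarter (assumed)
FY3Q23_REVENUE = 5.93    # USD billion, year-ago quarter (assumed)
FY4Q24_GUIDANCE = 20.0   # USD billion, guidance midpoint
FY4Q23_REVENUE = 6.05    # USD billion, year-ago quarter for the guidance (assumed)

print(f"FY3Q24 QoQ: {growth(FY3Q24_REVENUE, FY2Q24_REVENUE):.1f}%")            # ~34%
print(f"FY3Q24 YoY: {growth(FY3Q24_REVENUE, FY3Q23_REVENUE):.1f}%")            # ~206%
print(f"FY4Q24 guidance QoQ: {growth(FY4Q24_GUIDANCE, FY3Q24_REVENUE):.1f}%")  # ~10.5%
print(f"FY4Q24 guidance YoY: {growth(FY4Q24_GUIDANCE, FY4Q23_REVENUE):.1f}%")  # ~231%
```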

TrendForce’s Insight:

  1. Downgraded Chips Expected by Year-End 

NVIDIA continued its robust performance in the third quarter of 2023 (FY3Q24). The Datacenter division reported revenue of USD 14.5 billion, marking a 41% quarterly increase and a staggering 279% annual increase. The segment now accounts for 80% of overall revenue, up roughly 4 percentage points from the previous quarter.
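
As a quick check, the 80% share and the roughly 4-point gain follow from the segment and total revenues; the sketch below assumes prior-quarter figures (about USD 10.32 billion for Datacenter and USD 13.51 billion overall) from NVIDIA's FY2Q24 filing, which are not stated in this article.

```python
# Sketch: Datacenter share of total revenue and its quarter-over-quarter change.
# Prior-quarter figures are assumptions taken from NVIDIA's FY2Q24 filing.

DC_FY3Q24 = 14.5       # USD billion, Datacenter revenue this quarter
TOTAL_FY3Q24 = 18.12   # USD billion, total revenue this quarter
DC_FY2Q24 = 10.32      # USD billion, Datacenter revenue prior quarter (assumed)
TOTAL_FY2Q24 = 13.51   # USD billion, total revenue prior quarter (assumed)

share_now = DC_FY3Q24 / TOTAL_FY3Q24 * 100    # ~80%
share_prev = DC_FY2Q24 / TOTAL_FY2Q24 * 100   # ~76%
print(f"Datacenter share: {share_now:.0f}%, up {share_now - share_prev:.0f} pp")
```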

The growth is primarily driven by the HGX Hopper GPU, along with the commencement of shipments and revenue recognition for L40S and GH200. Approximately 50% of Datacenter revenue comes from CSP customers, while the remaining 50% is contributed by Consumer Internet and enterprise clients.

In terms of revenue outlook, the China region accounts for approximately 20-25% of NVIDIA’s Datacenter revenue. The management acknowledges the significant impact of the U.S. restrictions on China’s revenue for the fourth quarter of 2023 but expresses confidence that revenue from other regions can offset this impact.

This confidence stems from the current combination of high demand and tight supply for AI chips. Notably, NVIDIA's downgraded chips for the China market, including the HGX H20, L20 PCIe, and L2 PCIe, were originally slated for release on November 16, 2023, but are now expected to be delayed until the end of 2023, presumably due to ongoing negotiations with the U.S. Department of Commerce.

  2. NVIDIA Unveils HGX H200, a High-Capacity Version of H100; Next-Gen B100 Expected to Double Performance

In a significant product announcement, NVIDIA introduced HGX H200 on November 14, 2023. The new offering is a high-capacity version of H100 and is fully compatible with HGX H100 systems. This compatibility ensures that manufacturers using H100 don’t need to modify server systems or software specifications when transitioning to H200.

HGX H200 is slated for release in the second quarter of 2024, with initial customers including AWS, Microsoft Azure, Google Cloud, and Oracle Cloud. Additionally, CoreWeave, Lambda, and Vultr are expected to adopt HGX H200 in their setups.

Comparing the H200 SXM with the H100 SXM, the H200 SXM offers 76% more memory capacity and 43% more memory bandwidth, and it upgrades from HBM3 to HBM3e while the remaining specifications stay the same. This indicates that the H200 SXM is essentially a high-capacity version of the H100 SXM.
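
For context, the two percentages can be reproduced from NVIDIA's publicly listed specifications. The figures in the sketch below (80 GB of HBM3 at roughly 3.35 TB/s for the H100 SXM versus 141 GB of HBM3e at roughly 4.8 TB/s for the H200 SXM) are assumptions drawn from those spec sheets rather than from this article.

```python
# Sketch: where the 76% and 43% gains come from. Absolute specifications are
# assumptions based on NVIDIA's public spec sheets, not stated in this article.

H100_SXM = {"memory_gb": 80, "bandwidth_tb_s": 3.35}   # HBM3
H200_SXM = {"memory_gb": 141, "bandwidth_tb_s": 4.8}   # HBM3e

mem_gain = (H200_SXM["memory_gb"] / H100_SXM["memory_gb"] - 1) * 100
bw_gain = (H200_SXM["bandwidth_tb_s"] / H100_SXM["bandwidth_tb_s"] - 1) * 100

print(f"Memory capacity: +{mem_gain:.0f}%")    # ~76%
print(f"Memory bandwidth: +{bw_gain:.0f}%")    # ~43%
```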

Given the sensitivity of inference performance to memory capacity and bandwidth, inference on the Llama2 model with 70 billion parameters shows that the H200 SXM can reach 1.9 times the performance of the H100 SXM.

Moreover, NVIDIA plans to launch the B100 in 2024, a true next-generation product built on the new Blackwell architecture. Using chiplet technology, the design is speculated to transition from the single-die + 6 HBM3 configuration of the H100 to a dual-die + 8 HBM3e configuration in the B100, potentially doubling performance compared with the H100.

(Photo credit: NVIDIA)