Micron, SK hynix, and Samsung Submit HBM3e Samples to NVIDIA, Accelerating Development
A few days ago, NVIDIA announced its financial report for the third quarter of fiscal year 2024 (ended October 29, 2023), in which its data center business once again emerged as a highlight. Revenue stood at $14.51 billion, substantially surpassing last year's $3.8 billion and exceeding market expectations of $12.7 billion, a staggering year-over-year increase of 324% and a sequential increase of 38%. According to Omdia's statistics, NVIDIA sold approximately 500,000 A100 and H100 compute cards in the third quarter of 2023.
Historically, NVIDIA's cornerstone businesses have been gaming and data centers. The latter now significantly outpaces the former in revenue and is evidently NVIDIA's focal point for the coming years. NVIDIA's high-performance compute cards require large quantities of HBM chips, and to manage its supply chain more efficiently and comprehensively, NVIDIA plans to bring in additional suppliers. Research by TrendForce indicates that Samsung's HBM3 (24GB) is expected to complete validation with NVIDIA in December this year.
TrendForce's compilation reveals that Micron provided samples of its eight-layer vertically stacked HBM3 Gen2 (24GB), its branding for HBM3E, in late July this year; SK hynix followed with eight-layer vertically stacked HBM3E (24GB) samples in mid-August; and Samsung offered its version in early October.
Because HBM validation is intricate, typically spanning two quarters, the earliest validation results are expected by the end of this year, with all three manufacturers completing the process in the first quarter of 2024.
Notably, NVIDIA's validation results will significantly influence how it allocates procurement among suppliers in 2024, substantially impacting the order volumes of the major manufacturers. Next year, NVIDIA's compute card lineup will become more segmented, introducing the H200 with six HBM3E chips and the B100 with eight. Concurrently, NVIDIA will pair its own Arm-based CPUs with its GPUs, launching the GH200 and GB200.
Furthermore, HBM4 is slated for release in 2026. With clients demanding higher computational efficiency, HBM4 will evolve beyond the current 12-layer vertical stacking to 16 layers (planned for 2027), likely spurring demand for new stacking methods such as hybrid bonding. Future HBM development will lean toward greater customization: rather than sitting only beside the SoC main chip, HBM may transition to being stacked directly on top of it.