Samsung's HBM3 passes AMD verification, will supply the Instinct MI300 series

In the high-bandwidth memory (HBM) market in 2024, HBM3 remains the prevailing standard, although Nvidia's forthcoming H200 and B100 are set to transition to HBM3E. Soaring demand for artificial intelligence (AI) has kept supply tight for Nvidia and other vendors: not only is Chip-on-Wafer-on-Substrate (CoWoS) packaging a production bottleneck, but HBM itself has increasingly become a constraint. Compared with conventional DRAM, HBM has a longer production cycle, requiring more than two quarters from wafer start to final packaging.

TrendForce's deputy director, Avril Wu, notes that SK Hynix is the principal supplier of HBM3, yet its output falls short of the AI market's surging demand. By the end of 2023, Samsung had joined Nvidia's supply chain with products built on the 1Z nm process, albeit in minimal volume, marking Samsung's first HBM3 order.

Samsung, a longstanding strategic supplier to AMD, had its HBM3 pass verification for the Instinct MI300 series in the first quarter of 2024, covering both 8-layer and 12-layer stacked products. Samsung is expected to gradually increase its supply volume in the following quarter, aiming to close the gap with SK Hynix.

As the year progresses, the market's focus will shift from HBM3 to HBM3E, with the latter's supply volume gradually increasing until it becomes the mainstream product in the HBM market. Both SK Hynix and Micron are expected to have passed Nvidia's validation for use in the H200 by the end of the second quarter of 2024. Samsung, which has not yet passed Nvidia's validation, is anticipated to complete the process by the end of the first quarter of 2024, with supply starting in the second quarter.

With Samsung and Micron ramping up HBM shipments, the market is poised to move beyond SK Hynix’s dominance, heralding a more competitive landscape.