Micron Gains Ground in HBM3E Race

Driven by the rapid growth of Artificial Intelligence (AI) and High-Performance Computing (HPC), development of High Bandwidth Memory (HBM) products has accelerated over the past two years, in turn fueling revenue growth for memory manufacturers. As Nvidia’s leading high-bandwidth memory partner, SK Hynix currently dominates the HBM market, supplying a large share of the HBM3 used in Nvidia’s various AI chips.

According to Korea JoongAng Daily, Micron Technology secured early HBM3E orders for Nvidia’s H200 ahead of its rivals, a proactive win attributed to the maturity of its manufacturing process technology. SK Hynix commands a 54% share of the HBM market, whereas Micron holds only about 10%. With a supply agreement with Nvidia in hand, however, that balance could shift quickly.

Unlike HBM3, where SK Hynix holds a significant lead, the picture appears to have shifted with HBM3E, as the entry of Micron and Samsung intensifies competition. Last year, Micron, SK Hynix, and Samsung each sent HBM3E samples to Nvidia for qualification testing for its next-generation AI GPUs. Nvidia’s move to bring in more suppliers underscores its strategy of securing supply for upcoming products.

In July of last year, Micron launched the industry’s first HBM3E, offering bandwidth exceeding 1.2 TB/s, a per-pin data rate above 9.2 Gb/s, and an 8-high stack configuration with 24 GB of capacity, manufactured on its 1β (1-beta) process. Micron later said its HBM3E samples exceeded expectations in performance and power efficiency, outpacing competitors and impressing its customers. Micron has also prepared a 12-high, 36 GB HBM3E configuration, which increases capacity by 50% within the same stack height.
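As a rough sanity check, the quoted stack bandwidth follows from multiplying the per-pin data rate by the width of the HBM interface, and the 50% capacity increase follows directly from moving from 8 to 12 DRAM layers. The short sketch below illustrates the arithmetic; it assumes the standard 1024-bit interface of an HBM stack and 24 Gb (3 GB) DRAM dies per layer, neither of which is stated in the article itself.

```python
# Back-of-the-envelope check of the quoted HBM3E figures.
# Assumptions (not from the article): a 1024-bit interface per stack
# and 24 Gb (3 GB) DRAM dies per layer.

PIN_RATE_GBPS = 9.2          # per-pin data rate, in gigabits per second
INTERFACE_WIDTH_BITS = 1024  # I/O width of a single HBM stack

# Stack bandwidth = per-pin rate x interface width, converted to TB/s
bandwidth_tbps = PIN_RATE_GBPS * INTERFACE_WIDTH_BITS / 8 / 1000
print(f"Stack bandwidth: ~{bandwidth_tbps:.2f} TB/s")
# ~1.18 TB/s at exactly 9.2 Gb/s; rates slightly above that push past 1.2 TB/s

DIE_CAPACITY_GB = 3  # one 24 Gb die = 3 GB
for layers in (8, 12):
    print(f"{layers}-high stack: {layers * DIE_CAPACITY_GB} GB")
# 8-high -> 24 GB, 12-high -> 36 GB: the 12-high part adds 50% capacity
```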

As the frontrunner in the HBM market, SK Hynix is not resting on its laurels, having recently sent new 12-layer stack HBM3E samples to Nvidia for product validation tests. Samsung has also officially announced a similar product, a 12-layer HBM3E with 36 GB of capacity. The next generation, HBM4, is anticipated to arrive by 2026, pointing to an even more competitive HBM market.