Micron Introduces HBM3 Gen2: 24GB capacity, over 1.2TB/s bandwidth

Micron has announced the industry’s first HBM3 Gen2 memory, delivering bandwidth exceeding 1.2TB/s, pin speeds above 9.2Gb/s, and 24GB of capacity in an 8-high vertical stack. This represents a 50% improvement over currently shipping HBM3 solutions.
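As a rough sanity check, the per-pin data rate and the total bandwidth figures are consistent if one assumes the standard 1024-bit HBM data interface (the interface width is not stated in the article):

```python
# Back-of-envelope check of the quoted bandwidth, assuming the standard
# 1024-bit HBM interface width (an assumption, not stated in the article).
PINS = 1024            # HBM3 data interface width in bits
PIN_SPEED_GBPS = 9.2   # per-pin data rate in Gb/s (quoted figure)

bandwidth_gbs = PINS * PIN_SPEED_GBPS / 8   # convert gigabits to gigabytes
print(f"{bandwidth_gbs:.1f} GB/s")          # just under 1.2 TB/s at exactly 9.2Gb/s
```

Since Micron quotes pin speeds *above* 9.2Gb/s, the effective bandwidth edges past the 1.2TB/s mark.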

Micron claims the HBM3 Gen2 product offers 2.5 times the performance per watt of previous generations, setting new benchmarks for the key metrics of performance, capacity, and power efficiency in AI data centers. These gains can shorten training times for large language models such as GPT-4 and deliver a strong total cost of ownership (TCO). At the heart of Micron’s HBM3 Gen2 solution is its 1β (1-beta) process node, which stacks eight 24Gb DRAM dies into a 24GB cube within the industry-standard package dimensions.

Moreover, Micron is preparing a 12-high vertical stack offering 36GB of capacity in a single cube. Compared with existing competitor solutions, Micron’s offering delivers 50% more capacity at a given stack height. Micron’s process advances also improve energy efficiency: the company has doubled the number of through-silicon vias (TSVs) and increased metal density five-fold, reducing thermal impedance, alongside an energy-efficient data path design.
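The capacity figures follow directly from the die density and stack height quoted above; a minimal sketch of the arithmetic:

```python
# Cube capacity from die density and stack height, using the article's
# figures: 24Gb dies in 8-high and 12-high stacks.
DIE_GBIT = 24  # density of one DRAM die in gigabits

for stack_height in (8, 12):
    capacity_gb = stack_height * DIE_GBIT / 8  # gigabits -> gigabytes
    print(f"{stack_height}-high stack: {capacity_gb:.0f} GB")
```

An 8-high stack yields the 24GB cube shipping now, and a 12-high stack the 36GB cube in preparation.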

According to Micron’s latest technology roadmap, the “HBMNext” generation planned for 2026 will increase capacity to 36GB–64GB and push bandwidth from 1.5TB/s to beyond 2TB/s. In addition, Micron will introduce GDDR7 next year, with densities ranging from 16Gb to 24Gb and a data I/O rate of 32Gbps, on par with Samsung’s recently announced first GDDR7.