Samsung launches HBM3E “Shinebolt” for the next generation of AI applications
Samsung recently held ‘Samsung Memory Tech Day 2023’, unveiling a range of new technologies and products aimed at the coming era of large-scale artificial intelligence (AI). Highlights included the HBM3E ‘Shinebolt’, LPDDR5X CAMM2, and the modular AutoSSD, all designed to accelerate advances across the cloud, edge devices, and future automotive applications.
Samsung introduced its next-generation HBM3E DRAM, named ‘Shinebolt’, for fast-growing AI applications. It is designed to improve Total Cost of Ownership (TCO) in data centers and to speed up AI model training and inference.
Each pin of the HBM3E runs at 9.8Gbps, yielding an aggregate bandwidth of more than 1.2TB/s per stack. To reach taller stacks and better thermal characteristics, Samsung refined its Non-Conductive Film (NCF) technology to eliminate gaps between chip layers and improve thermal conductivity. Mass production of the 8-layer and 12-layer stacked HBM3 is already underway, and HBM3E samples are now shipping to customers. Samsung also intends to leverage its position as a comprehensive semiconductor solutions provider, offering customized services that combine next-generation HBM, advanced packaging technologies, and foundry products.
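The jump from the per-pin rate to the aggregate figure follows from the width of the HBM interface. A minimal sketch of the arithmetic, assuming the standard 1,024-bit HBM data interface (the pin count is not stated in the article):

```python
# Sketch: aggregate bandwidth of one HBM3E stack from its per-pin speed.
# Assumption: the standard 1,024-bit HBM interface width (not given above).
PINS = 1024          # data pins per HBM stack
PER_PIN_GBPS = 9.8   # per-pin transfer rate, gigabits per second

total_gbit_per_s = PINS * PER_PIN_GBPS           # 10,035.2 Gb/s
total_tbyte_per_s = total_gbit_per_s / 8 / 1000  # bits -> bytes, G -> T

print(f"{total_tbyte_per_s:.3f} TB/s")  # ~1.254 TB/s, i.e. "more than 1.2TB/s"
```

The result, roughly 1.25TB/s, is consistent with the "surpassing 1.2TB/s" figure quoted for Shinebolt.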
Beyond the HBM3E, other products showcased at the event included the industry’s highest-capacity 32Gb DDR5 DRAM, the industry’s first 32Gbps GDDR7, and the petabyte-scale PBSSD for substantially larger server storage. Also on display were the 7.5Gbps LPDDR5X CAMM2, the 9.6Gbps LPDDR5X DRAM, the LLW DRAM designed for on-device AI, the next-generation UFS, and the QLC SSD BM9C1 for personal computers.