Samsung demonstrates its latest HBM-PIM and LPDDR-PIM research results

At this year's Hot Chips 2023 symposium, Samsung Electronics presented its latest research results on HBM-PIM and LPDDR-PIM, positioning the two technologies as next-generation memory tailored for the fast-growing artificial intelligence industry.

Before this announcement, Samsung Electronics and AMD had already collaborated on PIM technology: Samsung fitted AMD's commercial GPU accelerator, the Instinct MI100, with HBM-PIM memory and used it for generative AI workloads. According to Samsung's research, GPUs equipped with HBM-PIM more than doubled both performance and energy efficiency compared with conventional HBM. To validate the technology on an MoE (Mixture of Experts) model, Samsung also built an HBM-PIM cluster of 96 MI100 GPUs fitted with HBM-PIM. On the MoE workload, the HBM-PIM GPUs delivered twice the performance and three times the energy efficiency of conventional HBM.
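For readers unfamiliar with MoE, the following is a minimal, framework-free Python sketch of Mixture-of-Experts routing; the dimensions and expert count are illustrative and are not Samsung's benchmark configuration. It also hints at why MoE inference stresses memory: each token activates only a few experts, so runtime is dominated by streaming expert weights out of memory, which is exactly the traffic PIM targets.

```python
import numpy as np

# Minimal Mixture-of-Experts (MoE) routing sketch. Sizes are illustrative
# assumptions, not Samsung's configuration.
rng = np.random.default_rng(0)
d_model, n_experts, top_k = 512, 8, 2

# Each expert is a simple feed-forward weight matrix kept in memory.
experts = [rng.standard_normal((d_model, d_model)) * 0.02 for _ in range(n_experts)]
gate = rng.standard_normal((d_model, n_experts)) * 0.02

def moe_layer(x):
    """Route each token to its top-k experts and combine their outputs."""
    scores = x @ gate                              # (tokens, n_experts)
    top = np.argsort(scores, axis=1)[:, -top_k:]   # indices of top-k experts
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        # Softmax over the selected experts' scores only.
        w = np.exp(scores[t, top[t]])
        w /= w.sum()
        for weight, e in zip(w, top[t]):
            out[t] += weight * (x[t] @ experts[e])
    return out

tokens = rng.standard_normal((4, d_model))
print(moe_layer(tokens).shape)  # -> (4, 512)
```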

HBM-PIM and similar next-generation memory solutions have drawn broad attention as a way to attack the memory bottleneck that has emerged in AI workloads in recent years. HBM-PIM (processing-in-memory) integrates an AI compute unit inside each memory bank, moving processing directly into the HBM and relieving the costly task of shuttling data between memory and the host processor. This improves both performance and energy efficiency.
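As a back-of-the-envelope illustration of that data-movement saving, the toy model below compares memory-bus traffic for a matrix-vector product (the core operation of LLM inference) computed conventionally versus in memory. The matrix size and data width are assumptions chosen for illustration, not Samsung's figures.

```python
# Toy bus-traffic model: conventional GEMV vs. a PIM-style offload.
# All constants are illustrative assumptions, not measured numbers.

def conventional_gemv(rows, cols, bytes_per_elem=2):
    # The whole weight matrix, the input vector, and the result vector
    # all cross the memory bus to the host processor.
    return (rows * cols + cols + rows) * bytes_per_elem

def pim_gemv(rows, cols, bytes_per_elem=2):
    # Weights never leave the DRAM: only the input vector goes in and
    # the result vector comes back out.
    return (cols + rows) * bytes_per_elem

rows = cols = 4096  # hypothetical layer size
conv, pim = conventional_gemv(rows, cols), pim_gemv(rows, cols)
print(f"conventional bus traffic: {conv / 2**20:.1f} MiB")
print(f"PIM bus traffic:          {pim / 2**10:.1f} KiB ({conv / pim:.0f}x less)")
```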

Beyond HBM-PIM, Samsung Electronics also showcased LPDDR-PIM, which combines mobile DRAM with PIM so that data can be processed and computed directly on mobile devices. Because it is designed for mobile applications, its bandwidth is a relatively modest 102.4 GB/s, but it reduces power consumption by 72% compared with conventional LPDDR.
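To put those two figures in perspective, here is a rough calculation. Only the 102.4 GB/s bandwidth and the 72% power saving come from the announcement; the model size and baseline power are hypothetical assumptions.

```python
# Rough on-device inference numbers from the quoted LPDDR-PIM figures.
bandwidth_gbs = 102.4      # quoted LPDDR-PIM bandwidth
model_bytes = 3.8e9        # hypothetical ~3.8 GB quantized on-device model

# A memory-bound decode step must stream the weights once per token.
seconds_per_pass = model_bytes / (bandwidth_gbs * 1e9)
print(f"time to stream weights once: {seconds_per_pass * 1e3:.1f} ms "
      f"(~{1 / seconds_per_pass:.0f} tokens/s ceiling)")

baseline_mw = 500.0                 # hypothetical baseline memory power
pim_mw = baseline_mw * (1 - 0.72)   # quoted 72% reduction
print(f"memory power: {baseline_mw:.0f} mW -> {pim_mw:.0f} mW")
```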