SK hynix displays prototype of generative AI accelerator card AiMX

Last year, SK hynix announced a next-generation memory semiconductor technology with built-in computational capabilities. The first product based on this technology is GDDR6-AiM (Accelerator-in-Memory), which combines processing functions with 16 Gbps GDDR6 memory.

The prototype AiMX card utilizes multiple GDDR6-AiM chips for enhanced performance

At the AI Hardware & Edge AI Summit 2023, SK hynix unveiled a prototype of its AiMX accelerator card, built on this in-memory computing technology. The card, based on GDDR6-AiM chips, is designed for generative AI workloads such as ChatGPT. SK hynix also demonstrated Meta's OPT-13B model running on a server equipped with the AiMX card: according to the company, data processing time fell by more than a factor of ten compared with a conventional GPU system, at one fifth the power consumption.

SK hynix presented AiMX as a solution that beats traditional GPUs on cost, power efficiency, and performance, and said it will continue to develop memory technologies suited to the AI era. As a fast, power-efficient memory solution capable of handling large volumes of data, AiMX is expected to play a key role in data-heavy generative AI systems. Since the performance of generative AI improves with the amount of training data, demand for high-performance products tailored to such systems is growing.

Samsung is pursuing a similar approach. It envisions AI processors integrated into memory that can execute operations otherwise handled by CPUs, GPUs, ASICs, or FPGAs. Samsung's roadmap extends this technology, called PIM (processing-in-memory), across DDR4, DDR5, LPDDR5X, GDDR6, and HBM3 memory. At Hot Chips 2023, Samsung presented its latest research on HBM-PIM and LPDDR-PIM, including an AMD Instinct MI100 fitted with HBM-PIM memory for generative AI applications.
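The appeal of AiM/PIM for generative AI comes from a simple bottleneck: during token-by-token inference, every model weight must be streamed from memory for each generated token, so latency is bounded by memory bandwidth rather than compute. A rough back-of-the-envelope sketch illustrates this; all numbers here (FP16 weights, an assumed 1 TB/s of bandwidth) are illustrative assumptions, not figures from SK hynix or Samsung:

```python
# Illustrative sketch: why autoregressive LLM inference is memory-bandwidth-bound,
# which is the bottleneck that in-memory compute (AiM/PIM) targets.
# All numeric values are assumptions for illustration only.

def weight_traffic_per_token(n_params: float, bytes_per_param: int = 2) -> float:
    """Bytes of weight data streamed from memory to generate one token.

    In autoregressive decoding each weight is read once per token, so
    memory traffic scales with model size, not with compute throughput.
    """
    return n_params * bytes_per_param


def decode_time_lower_bound_s(n_params: float,
                              bandwidth_bytes_per_s: float,
                              bytes_per_param: int = 2) -> float:
    """Per-token latency floor set purely by memory bandwidth."""
    return weight_traffic_per_token(n_params, bytes_per_param) / bandwidth_bytes_per_s


# A 13-billion-parameter model (similar in scale to the OPT-13B demo) in FP16:
traffic = weight_traffic_per_token(13e9)            # 26 GB read per token
latency = decode_time_lower_bound_s(13e9, 1e12)     # assumed 1 TB/s bus

print(f"{traffic / 1e9:.0f} GB per token, "
      f"{latency * 1e3:.0f} ms/token lower bound")  # 26 GB, 26 ms
```

Because every generated token must pull the full weight set across the memory bus, performing the multiply-accumulate work inside the memory devices themselves removes most of that traffic, which is the design rationale both companies cite.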