Samsung Electronics and SK Hynix are seeing a surge in HBM orders

Amid the recent semiconductor downturn, DRAM prices have been falling steadily, and major memory chip manufacturers such as Samsung and SK Hynix have suffered. SK Hynix's financial statements for the fourth quarter of 2022 show its first quarterly loss in ten years.

However, ChatGPT has recently set off a wave of enthusiasm around the world, attracting widespread attention since its launch. Nvidia founder and CEO Jensen Huang said in a recent speech that ChatGPT is the iPhone moment of artificial intelligence and one of the greatest technologies ever created in computing.

In fact, the emergence of ChatGPT looks like a turning point for memory manufacturers. According to Business Korea, since the beginning of this year Samsung's and SK Hynix's orders for HBM (high-bandwidth memory) have surged and prices have risen, injecting some vitality into the sluggish DRAM market.

Compared with ordinary DRAM, HBM achieves much higher bandwidth by stacking memory dies and using a very wide interface; working alongside CPUs and GPUs, it improves machine learning and computing performance. Beyond Nvidia and AMD compute cards, Intel recently released Xeon Max, the first x86 processor with on-package HBM (it uses HBM2e), and more and more high-performance AI chips are choosing HBM.
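The bandwidth advantage of the wide stacked interface can be sketched with some back-of-the-envelope arithmetic (the per-pin data rates below are commonly cited peak figures and vary by vendor and speed bin):

```python
# Rough per-stack peak bandwidth for HBM generations vs. a conventional
# DDR5 channel. Per-pin rates are illustrative published figures, not
# guarantees for any specific product.

def stack_bandwidth_gbs(pins: int, gbps_per_pin: float) -> float:
    """Peak bandwidth in GB/s: pins * Gbit/s per pin / 8 bits per byte."""
    return pins * gbps_per_pin / 8

# An HBM stack exposes a 1024-bit interface, versus the 64-bit
# interface of one conventional DDR channel.
hbm2e = stack_bandwidth_gbs(1024, 3.6)   # ~460.8 GB/s per stack
hbm3  = stack_bandwidth_gbs(1024, 6.4)   # ~819.2 GB/s per stack
ddr5  = stack_bandwidth_gbs(64, 6.4)     # ~51.2 GB/s for a DDR5-6400 channel

print(f"HBM2e: {hbm2e:.1f} GB/s, HBM3: {hbm3:.1f} GB/s, DDR5 channel: {ddr5:.1f} GB/s")
```

Even at the same per-pin rate, the 16x wider interface is what lets a single HBM3 stack outrun a whole DDR5 channel by an order of magnitude.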

Although HBM offers stronger performance, its shipment volume is far lower than that of ordinary DRAM, because its complex production process and advanced packaging push its average selling price to roughly three times that of ordinary DRAM. With ChatGPT making AI a hot topic again, demand for high-performance compute chips has risen, and HBM has benefited. Nvidia's latest H100 compute card uses the newest HBM3; reportedly, HBM3 now sells for as much as five times the price of the highest-performance conventional DRAM.

In addition, memory manufacturers are integrating AI engines into HBM itself. Samsung previously launched the HBM-PIM (Aquabolt-XL) chip, which provides up to 1.2 TFLOPS of embedded computing power, letting the memory chip perform operations that would otherwise run on a CPU, GPU, ASIC, or FPGA. By placing an AI processor inside each memory module, processing is offloaded to the HBM itself, reducing the burden of moving data between memory and the host CPU; Samsung plans to extend the technology to DDR4, DDR5, LPDDR5X, GDDR6, and HBM3 in the future. SK Hynix also announced last year that it had developed GDDR6-AiM (Accelerator-in-Memory), which adds compute functions to 16 Gbps GDDR6.
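The processing-in-memory idea above can be sketched as a toy model (hypothetical classes, not Samsung's or SK Hynix's actual programming interface): instead of shipping an entire array across the bus for the host to reduce, the memory computes locally and returns only the small result.

```python
# Toy model of processing-in-memory (PIM). The classes are invented
# for illustration; the point is the difference in bus traffic, not
# the arithmetic itself.

class PlainMemory:
    def __init__(self, data):
        self.data = list(data)
        self.bytes_moved = 0  # tally of traffic over the memory bus

    def read_all(self):
        # Host must pull every element across the bus before computing.
        self.bytes_moved += 8 * len(self.data)  # assume 8-byte words
        return list(self.data)

class PIMMemory(PlainMemory):
    def dot(self, weights):
        # The embedded engine computes next to the data; only the
        # scalar result crosses the bus.
        self.bytes_moved += 8
        return sum(x * w for x, w in zip(self.data, weights))

data = range(1024)
weights = [0.5] * 1024

plain = PlainMemory(data)
host_result = sum(x * w for x, w in zip(plain.read_all(), weights))

pim = PIMMemory(data)
pim_result = pim.dot(weights)

assert host_result == pim_result
print(plain.bytes_moved, pim.bytes_moved)  # 8192 bytes vs. 8 bytes
```

The result is identical either way; what changes is that the conventional path moves the whole working set while the PIM path moves one scalar, which is exactly the data-movement burden the in-memory AI engines aim to remove.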

Industry observers say the era when memory manufacturers competed only on manufacturing process has passed; AI semiconductor technology that adds data-processing capability will become critical and may determine chip makers' futures. In the medium to long term, the development of dedicated DRAM chips such as HBM-PIM will bring great changes to the semiconductor industry.