Data Storage Revolution: Device Withstands 600°C Heat

Researchers at the University of Pennsylvania have developed a novel high-temperature storage device that remains stable at temperatures up to 600°C, keeping stored data intact. The device overcomes a key limitation of current silicon-based flash memory, which begins to fail at around 200°C, and could be used to build memory for large-scale AI systems.

According to Deep Jariwala, one of the developers, the device is made primarily from aluminum scandium nitride (AlScN), a material whose stable, robust chemical bonds make it highly durable at high temperatures. The device itself uses a “metal–insulator–metal” structure, sandwiching a 45 nm-thick layer of AlScN between nickel and platinum electrodes. Jariwala noted that the thickness of the AlScN layer was critical to manufacturing the device. “If it’s too thin, the increased activity can drive diffusion and degrade the material. If too thick, there goes the ferroelectric switching we were looking for, since the switching voltage scales with thickness and there is a limitation to that in practical operating environments. So, my lab and Roy Olsson’s lab worked together for months to find this Goldilocks thickness,” he explained.
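The thickness constraint Jariwala describes can be sketched with a simple back-of-the-envelope relation. For a ferroelectric film of thickness d and coercive field E_c, the voltage needed to flip the polarization grows roughly linearly with thickness; the coercive-field value below is an illustrative assumption, not a figure from the article:

```latex
% Switching voltage scales linearly with film thickness:
%   V_sw ≈ E_c · d
% With an illustrative coercive field E_c ~ 5 MV/cm (an order of
% magnitude often cited for AlScN) and the 45 nm layer used here:
V_{\mathrm{sw}} \approx E_c \cdot d
  = (5\ \mathrm{MV/cm}) \times (45\ \mathrm{nm})
  \approx 22\ \mathrm{V}
```

A thicker film would push this voltage beyond what practical operating environments allow, while a thinner one risks the diffusion and degradation Jariwala mentions, hence the search for a “Goldilocks thickness.”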

Jariwala stated that this high-temperature storage device could address a fundamental flaw in current chip architecture: the separation of the central processing unit and memory, which forces data to be transferred between the two and is especially costly for AI applications that process vast amounts of data. The new storage device allows memory and processors to be integrated more tightly, reducing data-transfer time and thereby improving computing speed, complexity, and efficiency, a concept the research team refers to as “memory-augmented computing.” Such memory could significantly improve the stability and computational efficiency of future large-scale AI systems.
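The processor–memory bottleneck described above can be illustrated with a toy latency model (the function, bandwidths, and workload sizes below are hypothetical, chosen purely for illustration): total runtime is compute time plus the time spent shuttling data, so co-locating memory with the processor shrinks the transfer term.

```python
# Toy model of the processor-memory bottleneck (illustrative numbers only).

def total_time(bytes_moved: float, bandwidth_gbs: float, compute_time_s: float) -> float:
    """Runtime = data-transfer time + compute time.

    When memory is far from the processor, the transfer term dominates.
    """
    transfer_s = bytes_moved / (bandwidth_gbs * 1e9)  # seconds spent moving data
    return transfer_s + compute_time_s

# Hypothetical AI workload: 8 GB of model data, 0.5 s of pure compute.
data_bytes = 8e9
compute_s = 0.5

# Distant off-chip memory vs. memory integrated close to the processor
# (bandwidth figures are made up to show the trend, not measured values).
far_memory = total_time(data_bytes, bandwidth_gbs=50, compute_time_s=compute_s)
near_memory = total_time(data_bytes, bandwidth_gbs=500, compute_time_s=compute_s)

print(f"far memory:  {far_memory:.3f} s")   # transfer term dominates
print(f"near memory: {near_memory:.3f} s")  # transfer term nearly vanishes
```

Even in this crude model, the same computation finishes noticeably faster once the transfer term shrinks, which is the intuition behind memory-augmented computing.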