Nvidia AI GPU demand surges

Over the past few months, the fever for artificial intelligence tools like ChatGPT has spread globally, prompting a growing number of tech firms to commit to related research and development. Multimodal large language models demand enormous computational resources, turning Nvidia's GPUs, including the A100 and H100 compute cards, into sought-after commodities, to the point that supply may struggle to keep pace.

Over a decade ago, Nvidia bet on artificial intelligence as the next booming sector, and years of substantial investment now appear to be paying off handsomely. According to Digitimes, orders for Nvidia's AI GPUs have recently increased, in turn boosting wafer start rates at TSMC.

GH100 GPU packs 80 billion transistors

Both the A100 and H100 are manufactured by TSMC, the former on the 7nm process and the latter on a customized 5nm process known as 4N. Despite Nvidia's efforts to meet demand, the artificial intelligence wave has had an enormous impact. To work around export restrictions, Nvidia offers the A800 and H800 specifically for the Chinese market, priced roughly 40% above the original suggested retail price. Even so, in the face of abundant orders, delivery times have begun to slip, and Nvidia aims to prioritize meeting demand outside China.

Delivery times have stretched from the previous three months to six months, and in some cases the wait could be even longer. Some new orders are not expected to be fulfilled until December this year, meaning a wait of more than six months. Demand for high-performance computing has, to a certain extent, disrupted the supply of gaming chips, as Nvidia plans to allocate more resources to artificial intelligence.