With the explosive popularity of ChatGPT, Microsoft has further expanded its partnership with OpenAI. A few days ago, Microsoft announced a new Bing search service based on ChatGPT, though it is not yet open to everyone and access is granted through a waitlist. Demand on the application side will drive the upstream industry: according to Wccftech, the NVIDIA compute cards used in the AI field may soon be in short supply.
Language, image, and video generation tools like ChatGPT rely heavily on AI computing power. According to FierceElectronics, the beta version of ChatGPT was trained on 10,000 Nvidia compute cards, but since gaining public attention the system has been overwhelmed and unable to keep up with its large user base. OpenAI therefore announced the ChatGPT Plus subscription plan, which offers access even during peak hours, along with faster response times and priority access to new features and improvements. Some analysts estimate that ChatGPT now uses about 25,000 Nvidia compute cards for training, 15,000 more than the beta version.
As giants such as Microsoft and Google integrate chatbots into their search engines, Forbes estimates that if a chatbot handled every Google search, about 512,820 HGX A100 servers would be needed, for a total of 4,102,568 A100 compute cards and roughly $100 billion in spending on servers and networking alone.
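The server-to-card ratio behind the Forbes estimate can be sanity-checked with a quick calculation. This is only a sketch: it assumes the standard 8-GPU configuration of an HGX A100 baseboard, while the cited figures themselves come from Forbes.

```python
# Sanity check on the Forbes estimate: servers needed if every Google
# search went through a chatbot, assuming 8 A100 GPUs per HGX server.
servers = 512_820          # HGX A100 servers (Forbes figure)
gpus_per_server = 8        # assumption: one HGX A100 baseboard holds 8 GPUs
total_gpus = servers * gpus_per_server
print(f"{total_gpus:,}")   # ~4.1 million, in line with the cited 4,102,568 cards
```

The product comes out to roughly 4.1 million cards, consistent with the figure quoted in the article (the small discrepancy versus 4,102,568 is likely rounding in the original source).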
The strong market demand for compute cards may be good news for Nvidia, which is expected to report results for the fourth quarter of fiscal year 2023 on February 22, 2023. Nvidia previously guided fourth-quarter revenue of $6 billion, plus or minus 2%.