AI’s Growing Power Drain: Unveiling ChatGPT’s Energy Consumption
As generative artificial intelligence technologies proliferate, concerns about privacy and security have emerged alongside a less conspicuous issue: the considerable energy these services consume to operate, which is likely to draw growing attention.
According to traffic-monitoring website Similarweb, OpenAI's ChatGPT service drew approximately 1.6 billion visits worldwide in March alone, a 60% increase over February and triple the January figure. Much like a Google search, every ChatGPT request triggers a cascade of computation and a corresponding draw of energy, which compounds as usage grows.
Assuming an average of five queries per visit, roughly 8 billion questions were posed to ChatGPT in March, an average of about 270 million per day. If each query runs around 30 words, ChatGPT processes over 8 billion words daily. With an NVIDIA A100 accelerator requiring approximately 0.35 seconds to infer a single word, that workload amounts to at least 780,000 A100-hours of computation per day.
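The arithmetic above can be sketched as a quick back-of-envelope calculation; all inputs are the article's stated assumptions, not measured values:

```python
# Back-of-envelope estimate of ChatGPT's daily compute load.
# Every input below is an assumption quoted from the article.
visits = 1.6e9              # Similarweb visits for March
queries_per_visit = 5       # assumed average queries per visit
words_per_query = 30        # assumed average words per query
seconds_per_word = 0.35     # stated A100 inference time per word
days = 30

queries = visits * queries_per_visit                # ~8 billion queries in March
queries_per_day = queries / days                    # ~270 million per day
words_per_day = queries_per_day * words_per_query   # ~8 billion words per day
a100_hours_per_day = words_per_day * seconds_per_word / 3600  # ~780,000 A100-hours

print(f"{queries:.1e} queries in March")
print(f"{queries_per_day:.2e} queries per day")
print(f"{a100_hours_per_day:,.0f} A100-hours per day")
```

Note how the daily word count, not the raw visit count, is what drives the accelerator-hour figure.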
These calculations exclude peak-period traffic and queries that demand heavier computation, not to mention other computational overhead. To stay responsive, ChatGPT therefore relies on more than 32,400 A100 accelerators, far beyond what any single unit could deliver.
Serving 32,400 A100 accelerators would take approximately 4,000 NVIDIA DGX A100 supercomputers, each equipped with eight A100s. At an average price of $199,000 per DGX A100 and a maximum power draw of 6.5 kW per system, the total hardware cost approaches $800 million, while energy consumption for March reaches 18.72 million kWh.
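Continuing the estimate, the fleet size, hardware cost, and monthly energy figures follow directly from the numbers above (the 4,000-system count reflects the article's own rounding of 32,400 / 8):

```python
# Fleet size, cost, and monthly energy under the article's assumptions.
accelerators = 32_400
dgx_systems = 4_000                  # ~32,400 / 8 A100s per DGX, rounded as in the text
price_per_dgx_usd = 199_000          # stated average price per DGX A100
power_per_dgx_kw = 6.5               # stated maximum draw per DGX A100

cost_usd = dgx_systems * price_per_dgx_usd        # ~$800 million
power_kw = dgx_systems * power_per_dgx_kw         # 26,000 kW of draw
energy_kwh_march = power_kw * 24 * 30             # 30-day month, running flat out

print(f"{dgx_systems:,} DGX A100 systems")
print(f"${cost_usd / 1e6:,.0f}M total hardware cost")
print(f"{energy_kwh_march / 1e6:.2f} million kWh in March")
```

The energy figure assumes every system runs at its maximum rated draw around the clock, so it is an upper-bound style estimate.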
Based on these calculations, each ChatGPT query consumes enough electricity to light a 60 W bulb for about 140 seconds, which adds up to a substantial electricity bill. And this figure still excludes the energy consumed by cooling and other supporting infrastructure.
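The light-bulb comparison falls out of dividing the monthly energy by the monthly query count, a sketch under the same assumptions as above:

```python
# Per-query energy expressed as runtime of a 60 W light bulb.
energy_kwh_march = 18.72e6          # monthly energy from the fleet estimate
queries_march = 8e9                 # monthly query estimate from above
bulb_watts = 60

wh_per_query = energy_kwh_march * 1000 / queries_march  # ~2.34 Wh per query
joules_per_query = wh_per_query * 3600                  # ~8,400 J per query
bulb_seconds = joules_per_query / bulb_watts            # ~140 seconds

print(f"{wh_per_query:.2f} Wh per query")
print(f"{bulb_seconds:.0f} seconds of 60 W bulb runtime")
```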
Nevertheless, as with today's search services, continuing advances in artificial intelligence algorithms, semiconductor manufacturing processes, and operational efficiency are expected to ease the energy burden of AI computation.