Analysts say NVIDIA shipped over 900 tons of H100 compute cards in Q2

The old adage, “during a gold rush, sell shovels,” fits NVIDIA’s current position well. As generative artificial intelligence sweeps across the tech industry, NVIDIA’s H100 compute card has become the coveted ticket into AI, sought after by every major tech company. Demand for the H100 far outstrips supply, helping push NVIDIA’s data center revenue to a staggering $10.32 billion in the second quarter of fiscal year 2024. Exactly how many H100s the company has shipped, however, remains unclear.

NVIDIA H100 compute card

Research firm Omdia estimates that NVIDIA shipped more than 900 tons of H100 compute cards in the second quarter, most of them destined for artificial intelligence (AI) and high-performance computing (HPC) workloads. Since a single H100 compute card, including its cooler, weighs over 3 kilograms, that works out to more than 300,000 H100 units shipped in Q2. Omdia stresses that this is a rough calculation, not an official figure. Why graphics card shipments are being measured by weight at all is unclear; the number may come from shipment data obtained from an unnamed supplier.
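As a quick sanity check of Omdia’s math, here is a back-of-envelope sketch using only the figures quoted above (900 metric tons total and roughly 3 kg per card including the cooler, which is an approximation rather than an official spec):

```python
# Back-of-envelope check of Omdia's shipment estimate.
# Figures come from the article; the ~3 kg per-card weight is approximate.
total_weight_kg = 900 * 1_000      # 900 metric tons converted to kilograms
weight_per_card_kg = 3             # H100 compute card plus cooler, roughly

estimated_units = total_weight_kg / weight_per_card_kg
print(f"Estimated H100 cards shipped in Q2: ~{estimated_units:,.0f}")
# -> Estimated H100 cards shipped in Q2: ~300,000
```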

While 900 tons of H100s weighs about as much as 4.5 Boeing 747 aircraft, it is still far from enough to meet global demand for data center GPUs. The relentless growth in AI workloads has driven NVIDIA’s card shipments sharply upward. Sources familiar with the matter say NVIDIA plans to raise production of GH100 chips from half a million this year to 1.5 to 2 million, with the lion’s share earmarked for the H100 compute card.
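The aircraft comparison and the production plan can be unpacked with the same kind of rough arithmetic. This sketch uses only the numbers quoted above; the implied per-aircraft weight is simply what the “4.5 Boeing 747s” comparison assumes, not an official Boeing specification:

```python
# Unpack the "4.5 Boeing 747s" comparison: implied weight per aircraft.
total_weight_tons = 900
implied_747_weight_tons = total_weight_tons / 4.5   # 200 tons per aircraft

# Planned scale-up of GH100 chip production, per the reported figures.
current_output = 500_000
planned_output_range = (1_500_000, 2_000_000)
scale_up = tuple(p / current_output for p in planned_output_range)

print(f"Implied weight per 747: {implied_747_weight_tons:.0f} tons")
print(f"Planned GH100 production increase: {scale_up[0]:.0f}x to {scale_up[1]:.0f}x")
# -> Implied weight per 747: 200 tons
# -> Planned GH100 production increase: 3x to 4x
```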