2024 will be an important year for AMD in the field of AI

In a recent interview, AMD CEO Dr. Lisa Su said she expects artificial intelligence (AI) to grow rapidly over the coming decade, forecasting that the market for AI-related products could reach approximately $150 billion within three to five years. Reflecting that outlook, AMD has already designated AI as its top priority for resources and investment, with the goal of integrating AI technology across all of its products.

As reported by Wccftech, industry analysts are optimistic about AMD's AI business and expect strong growth in the coming years. AMD's AI chip shipments are anticipated to rise sharply between 2024 and 2025, driving corresponding gains in revenue.

When the conversation turns to AI chips, NVIDIA is the name that dominates. Thanks to heavy investment over the past decade and a deep stack of software assets, NVIDIA has built a formidable ecosystem advantage in AI. With AMD now investing heavily in the field, however, the landscape could begin to shift. To strengthen its AI capabilities, AMD recently acquired Mipsology and Nod.ai, two AI software companies. Even so, closing the gap with NVIDIA will likely take AMD several more years.

Survey data suggests that in 2024, AMD's AI chip shipments, led primarily by the Instinct MI300A, will amount to roughly 10% of NVIDIA's volume, with Microsoft as the largest customer, accounting for more than 50% of that demand. Amazon is next in line, while Meta and Google are currently evaluating AMD's samples. By 2025, AMD's AI chip shipments are projected to grow to 30% of NVIDIA's volume, if not more.

According to AMD's official materials, the Instinct MI300A combines CPU and GPU in a single package. It features six XCDs (up to 228 compute units, CDNA 3 architecture), three CCDs (up to 24 cores, Zen 4 architecture), and eight HBM3 stacks totaling 128GB. This APU design has evidently resonated with customers' needs. During the Q2 2023 earnings call, Dr. Lisa Su also revealed that development of the next-generation Instinct MI400 series is already underway.
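
For context on what such an APU looks like from the software side, below is a minimal HIP sketch (not taken from AMD's materials; the program and its output are illustrative) that queries each device's reported compute-unit count and memory size via the standard hipGetDeviceProperties call. On an MI300A-class system, the memory figure would reflect the unified HBM3 pool shared by the CPU and GPU.

```c
#include <hip/hip_runtime.h>
#include <stdio.h>

int main(void) {
    int count = 0;
    if (hipGetDeviceCount(&count) != hipSuccess || count == 0) {
        printf("No HIP devices found.\n");
        return 1;
    }
    for (int i = 0; i < count; ++i) {
        hipDeviceProp_t prop;
        hipGetDeviceProperties(&prop, i);
        // Device name and architecture string reported by the runtime
        printf("Device %d: %s (%s)\n", i, prop.name, prop.gcnArchName);
        // Compute units visible to this device
        printf("  Compute units : %d\n", prop.multiProcessorCount);
        // Total device-visible memory; on MI300A this is the shared HBM3 pool
        printf("  Memory        : %.1f GiB\n",
               prop.totalGlobalMem / (1024.0 * 1024.0 * 1024.0));
    }
    return 0;
}
```

Built with hipcc on a ROCm system, a sketch like this would simply print the device name, compute-unit count, and memory capacity for each accelerator it finds.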