NVIDIA (NVDA) vs. Advanced Micro Devices (AMD): Which Stock Is Proving to Be the Better Long-Term AI Buy?

After its earnings release on May 24, the Santa Clara-based graphics chip maker NVIDIA Corporation (NVDA) stole the spotlight by becoming the first semiconductor company to hit a valuation of $1 trillion.

NVDA also looks set to blow away Street expectations at its quarterly earnings release on August 23, with profits for the current quarter expected to come in at least 50% above analyst estimates and the momentum expected to continue for the foreseeable future.

Advanced Micro Devices, Inc. (AMD), meanwhile, has come a long way since its humble beginnings as a supplier for Intel Corporation (INTC). In its second-quarter earnings release, the company topped analyst estimates despite persistent weakness in the PC market.

NVDA has carved out its niche and cornered a significant share of the GPU domain through advancements in parallel (and consequently accelerated) computing, a push that began back in 2006 with the release of its CUDA programming platform. At AMD, Chair and CEO Dr. Lisa Su is widely credited with the company’s turnaround: a transition from being widely dismissed over performance issues and delayed releases to being the only company in the world to design both CPUs and GPUs at scale.

The New (Perhaps Only) Game in Town

Like the steam engine and electricity before it, Artificial Intelligence (AI) is a general-purpose technology, and it already touches and influences all facets of our lives, including how we shop, drive, date, entertain ourselves, manage our finances, take care of our health, and much more.

However, late in November of last year, when OpenAI opened its artificial intelligence chatbot, ChatGPT, to the general public, all hell broke loose. The application took the world by storm. It amassed 1 million users in five days and 100 million monthly active users only two months into its launch to become the fastest-growing application in history.

The generative AI-powered application’s capability to provide (surprisingly) human-like responses to user requests has both fascinated and concerned individuals, businesses, and institutions about the possibilities of the technology. ChatGPT is powered by a large language model (LLM), which gives the application the ability to understand human language and generate responses based on the large body of text on which the model has been trained.
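For readers curious about what being “powered by an LLM” looks like in practice, the sketch below uses the open-source Hugging Face transformers library and the small GPT-2 model to show the basic mechanic: the model continues a prompt based on patterns learned from its training text. This is not ChatGPT itself (which is accessible only through OpenAI’s service); the model and parameters here are illustrative choices on our part.

```python
# Minimal illustration of LLM-style text generation with a small open model.
# This is NOT ChatGPT; it only demonstrates the same underlying idea:
# predicting a continuation from patterns learned during training.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")  # ~124M-parameter model

prompt = "Graphics processors are useful for AI because"
result = generator(prompt, max_new_tokens=40, num_return_sequences=1)

print(result[0]["generated_text"])
```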

NVDA is now reaping the rewards of years of largely invisible work in parallel computing, which proved ideal for the deep learning behind artificial neural networks. As a result of that head start in the AI tech race, its A100 chips, which power LLMs like ChatGPT, have become indispensable to Silicon Valley tech giants.

To put things into context, the supercomputer behind OpenAI’s ChatGPT reportedly needed 10,000 of NVDA’s famous chips. With each chip costing around $10,000, a single application that’s fast becoming ubiquitous is powered by semiconductors worth roughly $100 million.
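As a quick back-of-the-envelope check of that figure, treating the article’s chip count and per-chip price as rough estimates rather than disclosed pricing:

```python
# Back-of-the-envelope cost of the GPUs behind ChatGPT's supercomputer,
# using the article's rough figures (estimates, not disclosed pricing).
num_gpus = 10_000         # reported A100 count
price_per_gpu = 10_000    # approximate price per A100, in USD

total_cost = num_gpus * price_per_gpu
print(f"Estimated GPU hardware cost: ${total_cost:,}")  # $100,000,000
```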

However, AMD isn’t too far behind either. According to Dr. Su, the data center is the company’s most strategic piece of business as far as high-performance computing is concerned. AMD underscored this commitment with the recent acquisition of data center optimization startup Pensando for $1.9 billion.

At its recent Data Center & AI Technology Premiere, AMD’s ambitions to capitalize on the AI boom were loud and clear, with the launch of the MI300X (a GPU-only chip) as a direct competitor to NVDA’s H100. The chip packs eight GPU chiplets (5nm GPUs with 6nm I/O) alongside 192GB of HBM3 and 5.2TB/s of memory bandwidth.

AMD believes this will allow memory-hungry LLM inference workloads to run on fewer GPUs, which could improve the TCO (Total Cost of Ownership) compared to the H100.
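To make the memory argument concrete, here is a rough sketch built entirely on our own illustrative assumptions (FP16 weights at 2 bytes per parameter, weights only, ignoring KV cache and runtime overhead, hypothetical model sizes) of how many accelerators would be needed just to hold a model’s weights at 192GB per MI300X versus 80GB per H100:

```python
# Rough sketch: why more memory per accelerator can mean fewer GPUs for LLM inference.
# Assumptions (illustrative, not vendor figures): FP16 weights at 2 bytes per
# parameter, weights only (KV cache and activations ignored), perfect sharding.
import math

def gpus_needed(params_billions: float, mem_per_gpu_gb: int, bytes_per_param: int = 2) -> int:
    weights_gb = params_billions * bytes_per_param  # e.g., 70B params * 2 bytes = 140 GB
    return math.ceil(weights_gb / mem_per_gpu_gb)

MODEL_SIZES_B = [70, 180, 530]            # hypothetical model sizes, in billions of parameters
MEMORY_GB = {"MI300X": 192, "H100": 80}   # HBM capacity per accelerator, in GB

for size in MODEL_SIZES_B:
    counts = {name: gpus_needed(size, mem) for name, mem in MEMORY_GB.items()}
    print(f"{size}B parameters -> {counts}")
```

Needing fewer accelerators per deployment is the crux of AMD’s TCO argument, although real-world sizing also depends on batch size, context length, and software maturity.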

The Road Ahead

The optimism surrounding both companies is justified.

Beyond its presence in data centers, cloud computing, and AI, NVDA’s chips are making their way into self-driving cars and into Omniverse, its platform for building digital twins that can be used to run simulations and train AI algorithms for various applications.

On the other hand, AMD has also set its sights on the burgeoning AI accelerator market, projected to be worth over $30 billion in 2023 and to potentially exceed $150 billion in 2027.
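Those endpoints imply a steep growth trajectory; a quick calculation, taking the article’s $30 billion (2023) and $150 billion (2027) figures at face value, gives the implied compound annual growth rate:

```python
# Implied compound annual growth rate (CAGR) of the AI accelerator market,
# using the article's projections as endpoints: ~$30B in 2023 and ~$150B in 2027.
start_value = 30e9   # 2023 market size, USD
end_value = 150e9    # 2027 market size, USD
years = 2027 - 2023

cagr = (end_value / start_value) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # roughly 49.5% per year
```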

AMD is one of the few companies making the high-end GPUs needed for artificial intelligence. With AI seen as a tailwind that could drive PC sales, the company used its quarterly earnings release to announce plans to launch new Radeon 7000 desktop GPUs. The card is speculated to come with two 8-pin PCIe power connectors and four video output ports: three DisplayPort 2.1 and one HDMI 2.1.

Caveats

AMD operated as both a chip designer and a manufacturer until 2009. However, the significant capex requirements of manufacturing, amid financial troubles in the wake of the Great Recession, compelled the company to spin off its fabs to form GlobalFoundries Inc. (GFS), which has focused on manufacturing less advanced chips ever since.

Today, both NVDA and AMD operate as fabless chip companies. Hence, both face the risk that customers such as Apple Inc. (AAPL), Amazon.com, Inc. (AMZN), and Tesla, Inc. (TSLA), which have the wherewithal to develop the intellectual capital to design their own chips, will integrate backward into chip design.

Moreover, almost all of their manufacturing has been outsourced to Taiwan Semiconductor Manufacturing Company Ltd. (TSM), which has yet to diversify significantly outside Taiwan and has become a bone of contention between the world’s two leading superpowers.

With geopolitical risk the potential Achilles’ heel for both companies, their efforts toward geographical diversification are receiving much-needed political encouragement through the CHIPS and Science Act.

Dr. Su, who also serves on the President’s Council of Advisors on Science and Technology, pushed hard for the passage of the Act. It is aimed at on-shoring and de-risking semiconductor manufacturing in the interest of national security by setting aside $52 billion to incentivize companies to manufacture semiconductors domestically.

Bottom Line

Given its massive importance and cornucopia of applications, it’s hardly surprising that Zion Market Research forecasts the global AI industry to grow to $422.37 billion by 2028. Hence, this field has understandably garnered massive attention from investors who are reluctant to miss the bus on such a watershed development in the history of humankind.

Therefore, in view of its product diversification, increasing traction in the GPU segment, and greater valuation comfort, investors in AMD could benefit from more sustained upside potential compared to NVDA.

Chips and AI: Advanced Micro Devices, Inc. (AMD)'s Next-Level Breakthroughs!

Last month, we gauged the prospects of two semiconductor giants, NVIDIA Corporation (NVDA) and Intel Corporation (INTC), which have carved out their niches and cornered significant shares of the GPU and CPU domains, respectively. In this article, we look at another chip company and its agile efforts to grab the best of both worlds while building a widespread following of its own.

Founded in 1969 by a group of eight men led by the larger-than-life Jerry Sanders, Advanced Micro Devices, Inc. (AMD) released its first product in 1970 and went public in 1972. Despite starting life as a supplier for INTC, AMD parted ways with its client in the mid-80s, and by the late 80s, it had reverse-engineered INTC’s products to make its own chips that were compatible with INTC’s software.

AMD operated as both a chip designer and a manufacturer until 2009. However, the significant capex requirements of manufacturing, amid financial troubles in the wake of the Great Recession, compelled the company to spin off its fabs to form GlobalFoundries Inc. (GFS), which has focused on manufacturing less advanced chips ever since.

With the acquisition of ATI, a major fabless chip company, in 2006, AMD began shifting its focus toward chip design and turned to Taiwan Semiconductor Manufacturing Company Ltd. (TSM) as its primary chip manufacturer.

With manufacturing no longer weighing it down, AMD started catching up with INTC through its Zen line of CPUs, and it went on to make history by surpassing the latter’s market cap for the first time. Chair and CEO Dr. Lisa Su is widely credited with the turnaround: a transition from being widely dismissed over performance issues and delayed releases to being the only company in the world to design both CPUs and GPUs at scale.

We look at how Dr. Su and her team’s unwavering focus on great products, strong customer relationships, and a simplified company structure that can respond to a dynamic business with agility is shaping AMD’s offerings in each product category.

CPU Portfolio

Despite a conservative outlook, AMD believes its Genoa server CPUs are superior to competing offerings in terms of performance and efficiency across diverse workloads, including AI. During the recent AMD Data Center & AI Technology Premiere, the company expanded its EPYC server CPU portfolio by launching the highly anticipated Bergamo EPYC CPUs, optimized for cloud environments.

Given Bergamo’s focus on core density and energy efficiency, Meta Platforms, Inc. (META), which has collaborated with AMD to customize the design of its Bergamo servers, reported roughly 2.5 times greater performance than AMD’s previous-generation Milan CPUs along with notable improvements in total cost of ownership (TCO).

AMD also introduced Genoa-X, another workload-optimized alternative to Genoa aimed at faster general-purpose and technical computing tasks. The company further shared that its upcoming server CPU, Turin, has shown promising initial results and remains on schedule for a 2024 release.

Data Center Portfolio

According to Dr. Su, the data center is the company’s most strategic piece of business as far as high-performance computing is concerned. AMD underscored this commitment with the recent acquisition of data center optimization startup Pensando for $1.9 billion.

At the premiere, AMD’s ambitions to capitalize on the AI boom were loud and clear, with the launch of the MI300X (a GPU-only chip) as a direct competitor to NVDA’s H100. The chip packs eight GPU chiplets (5nm GPUs with 6nm I/O) alongside 192GB of HBM3 and 5.2TB/s of memory bandwidth.

AMD believes this will allow memory-hungry LLM inference workloads to run on fewer GPUs, which could improve the TCO compared to the H100.

Lastly, the company aims to address the growing AI accelerator market, projected to be over $30 billion in 2023 and potentially exceed $150 billion in 2027.

Gaming and Other Applications

While INTC and NVDA control most of the CPU and GPU markets, respectively, AMD dominates gaming, designing 83% of gaming console processors.

The recently launched AMD Ryzen 5 5600X3D is equipped with AMD’s revolutionary 3D V-Cache technology. Although its specifications sit close to both the Ryzen 7 5800X3D and the non-3D Ryzen 5 5600X, its much larger L3 cache gives it an edge over the latter, improving gaming performance.

Moreover, with Moore’s Law, the engine of computer chip advancement, showing visible signs of slowing, and the five-decade-old x86 architecture facing a slow but steady challenge from Arm, general-purpose computing on CPUs is making way for more customized solutions.

That prompted AMD to acquire Xilinx for $49 billion in one of the biggest acquisitions in semiconductor history. Xilinx is known for its reprogrammable adaptive chips, called Field-Programmable Gate Arrays (FPGAs), which have diverse applications in robotics, telecommunications, agriculture, and space exploration.

As a result, AMD is expanding its footprint from PCs and supercomputers to Teslas and Mars rovers.

Road Ahead

Despite its future readiness, geopolitical tensions between the U.S. and China could prove to be AMD’s Achilles’ heel, since virtually all of its chips are made in Taiwan and China. Moreover, Mainland China accounts for roughly 30% of the company’s revenues.

Dr. Su also serves on the President’s Council of Advisors on Science and Technology, and she pushed hard for the recent passage of the CHIPS and Science Act, which is aimed at on-shoring and de-risking semiconductor manufacturing in the interest of national security by setting aside $52 billion to incentivize companies to manufacture semiconductors domestically.

Geographical diversification, as a result of this Act, could act as a hedge against geopolitical tensions for AMD by reducing reliance on Asian manufacturing.

Bottom Line

As AMD continues to advance its core x86 computing chips while diversifying into high-performance and customized computing, its more than 70% stock price gain since the beginning of the year (coinciding with the AI wave) could be indicative of a company poised to gain market share and capitalize on the expanding demand for AI technology across industries.