For the last three years, Nvidia (NASDAQ: NVDA) has been the most dominant company in the artificial intelligence (AI) landscape. The company's graphics processing units (GPUs) are the backbone on which generative AI is developed.
AI leaders such as OpenAI, Oracle, and Meta Platforms, along with cloud hyperscalers including Microsoft Azure and Amazon Web Services (AWS), have collectively spent hundreds of billions of dollars clustering Nvidia GPUs inside data centers to build out their AI infrastructure.
While Advanced Micro Devices is largely perceived as Nvidia's chief rival in the AI chip market, a new threat is emerging: Alphabet (NASDAQ: GOOGL) (NASDAQ: GOOG). The internet giant is making waves in the semiconductor industry thanks to rising interest in its custom hardware, known as tensor processing units (TPUs).
Let's take a look at what Alphabet's entrance into the chip market means for Nvidia as investments in AI infrastructure continue to unfold. Should Nvidia investors be worried? Read on to find out.
Nvidia’s GPUs are versatile pieces of hardware. These chips are designed to work in clusters in parallel with the company’s CUDA software architecture. Taken together, Nvidia’s ecosystem can be used to train large language models (LLMs) or help build more robust applications across AI robotics, autonomous driving, and quantum computing.
TPUs, by contrast, are far more specialized. Rather than being general-purpose hardware, Alphabet's TPUs are custom application-specific integrated circuits (ASICs), designed to accelerate a narrow set of tasks. This makes TPUs highly efficient in bespoke workloads such as training and running deep learning models.
Throughout the AI revolution, one of Alphabet's biggest winners has been its cloud division, Google Cloud. In recent months, Google Cloud has won notable deals with OpenAI as well as a $10 billion contract with Meta Platforms.
What investors might not fully appreciate is that this custom hardware gives Google an interesting selling point: TPUs are now part of the pitch for its broader cloud ecosystem.
Notably, Apple used TPUs to train its Apple Intelligence models. Meanwhile, Anthropic announced plans to expand its usage of Google Cloud in a deal featuring up to 1 million TPUs. Lastly, rumors are swirling that Meta is considering complementing its existing reliance on Google Cloud by deploying its own TPU clusters.
On the surface, accelerating TPU demand from big tech might sound alarming for Nvidia investors. However, there are some finer details that smart investors shouldn’t overlook.
While Anthropic’s relationship with Google Cloud is significant, the company also has strong ties with Nvidia. Just weeks ago, Anthropic agreed to purchase $30 billion of compute capacity from Microsoft Azure, which runs heavily on Nvidia’s GPUs.
On top of that, OpenAI recently struck a $38 billion deal with AWS to access Nvidia’s new GB200 and GB300 chips.
Furthermore, while Nvidia does not reveal the specifics around its customer concentration, many analysts on Wall Street suspect that the hyperscalers — including Alphabet — are among its largest chip buyers.
These are important details to understand. While TPUs could meaningfully reshape the AI chip market, many of their users, including Google itself, complement this custom hardware with Nvidia's general-purpose GPUs. Against this backdrop, TPUs don't appear to be replacing GPUs so much as supplementing them.
Management consulting firm McKinsey & Company forecasts that AI infrastructure will be a nearly $7 trillion market by 2030, with roughly $5 trillion of this spending allocated toward AI workloads.
Given the hyperscalers are accelerating their capital expenditures (capex), I feel confident that demand for Nvidia’s GPUs and accompanying data center services will remain robust for the foreseeable future.
It's possible, even likely, that rival accelerators from AMD, combined with custom ASICs such as TPUs, will eventually erode Nvidia's pricing power in the chip market.
Nevertheless, the introduction of TPUs is not a checkmate move by Google. If anything, complementing existing Nvidia infrastructure with custom chip designs reinforces how large of an opportunity AI infrastructure is becoming. In other words, AI chips are not a winner-take-all market.
For these reasons, I don't think Nvidia investors need to panic. The company remains the king of the chip realm and appears well positioned to thrive in the AI infrastructure era.
Adam Spatacco has positions in Alphabet, Amazon, Apple, Meta Platforms, Microsoft, and Nvidia. The Motley Fool has positions in and recommends Advanced Micro Devices, Alphabet, Amazon, Apple, Meta Platforms, Microsoft, Nvidia, and Oracle. The Motley Fool recommends the following options: long January 2026 $395 calls on Microsoft and short January 2026 $405 calls on Microsoft. The Motley Fool has a disclosure policy.