Key Points
- The artificial intelligence (AI) infrastructure build-out remains one of the biggest growth drivers the stock market has ever seen.
- However, the AI market is shifting in a way that could lead to new leaders emerging.
- The first phase of AI was all about training foundational large language models (LLMs), while inference and AI agents are dominating the second phase.
Nvidia (NASDAQ: NVDA) was the big winner in phase 1 of the AI build-out, as its graphics processing units (GPUs) became the engines for training AI models. The company created a wide moat through its CUDA software, which it had earlier seeded in universities and research labs conducting early AI work. As a result, most foundational AI code was written on Nvidia software and optimized for its chips.

However, inference is less technically demanding than LLM training, and developers today tend to work higher up the software stack on open-source AI frameworks like OpenAI's Triton, which helps level the playing field. Nvidia is likely to remain the king of AI training, and it has better positioned itself for inference through its acquisition of Groq's assets and its language processing units (LPUs). Still, other companies now look to be big winners with huge growth runways, including Broadcom (NASDAQ: AVGO) and Advanced Micro Devices (NASDAQ: AMD).
Broadcom: A winner in custom chips
One of the big current AI infrastructure trends is hyperscalers (owners of huge data centers) looking to diversify the AI accelerators they use, including developing their own. One of the companies they are increasingly turning to for help is Broadcom, which is a leader in ASICs (application-specific integrated circuits).
ASICs are custom chips that are hardwired for a specific purpose. They lack the flexibility of general-purpose GPUs, but they tend to perform well for their intended tasks while being more energy-efficient. This is particularly important when it comes to inference, since power usage is an ongoing and expensive cost.
Broadcom helped Alphabet develop its highly successful tensor processing units (TPUs). These chips, which Alphabet is now also letting a few select customers buy directly from Broadcom, are a huge revenue driver for Broadcom. Meanwhile, other large AI players, including OpenAI and Meta Platforms, are also turning to Broadcom to help them develop their own custom AI ASICs.
Broadcom has said it has a clear line of sight for over $100 billion in AI ASIC revenue alone in its fiscal 2027. At the same time, the company is a leader in the fast-growing data center networking space, which is becoming even more important as chip cluster sizes grow. Between these two opportunities, it is set to see huge growth.
AMD: An inference and agentic AI winner
Like Broadcom, AMD has an opportunity in inference. The company's ROCm software platform has improved immensely over the past two years, and its modular chiplet design, which can pack in more memory, is well suited for the task. Inference tends to be more memory-bound than compute-bound, and AMD's next-generation chip is expected to have 1.5 times the memory capacity of Nvidia's upcoming Rubin chips.
AMD has struck two large deals, one with OpenAI and one with Meta Platforms, for six gigawatts' worth of GPUs each, with each deal reportedly worth around $100 billion. The company is also believed to be working on a large GPU deal with Anthropic.
In addition to its GPU opportunity, the company is set to ride another powerful trend with agentic AI. It is the leader in data center central processing units (CPUs), and as AI agents rise, the GPU-to-CPU ratio in AI servers is expected to shift from 8-to-1 toward 1-to-1. That's because CPUs will need to handle the sequential reasoning and tool use that AI agents require. AMD pegs this market at $120 billion over the next few years.
Between its GPU and CPU opportunities, AMD looks poised to be a big winner in the next phase of AI.
Geoffrey Seiler has positions in Advanced Micro Devices, Broadcom, and Meta Platforms. The Motley Fool has positions in and recommends Advanced Micro Devices, Broadcom, Meta Platforms, and Nvidia. The Motley Fool has a disclosure policy.