Jensen Huang Just Delivered Incredible News for Nvidia Stock Investors

Motley Fool
2025.06.06 08:23

Nvidia's CEO Jensen Huang announced strong future demand for the company's GPUs, particularly for AI applications, during a recent conference call. Nvidia's data center revenue surged 73% year-over-year, reaching $39.1 billion, and is expected to benefit from a projected $1 trillion annual AI infrastructure spending by 2028. The stock is currently trading at a P/E ratio of 44.3, which is 26% lower than its 10-year average, indicating it may be undervalued. Investors are optimistic about Nvidia's growth potential in the AI market.

Nvidia (NVDA -1.44%) is the world's leading supplier of graphics processing units (GPUs) for data centers, the key hardware for developing artificial intelligence (AI). Since the beginning of 2023, Nvidia's market value has grown by an eye-popping $3 trillion thanks to soaring demand for those chips, but the company is just getting warmed up.

Every new generation of AI model so far has required significantly more computing capacity than the last, which is a major tailwind for Nvidia's hardware sales. During a recent conference call where management discussed the company's results for the fiscal 2026 first quarter (ended April 27), CEO Jensen Huang made a series of comments about future demand that should be music to investors' ears.

Nvidia stock looks attractive right now relative to its history, so here's why it could be a great buy on the back of Huang's latest remarks.

Image source: Nvidia.

Demand for computing capacity could soar a thousandfold

Nvidia's H100 GPU, which was built on the company's Hopper architecture, was the top-selling data center chip for AI development during 2023 and for most of 2024. It was designed for both training and inference workloads; training is when developers feed mountains of data into AI models to make them "smarter," and inference is the process by which AI models turn that data into responses for the end user.

In 2023 and 2024, AI chatbot applications were great at generating one-shot responses, meaning they prioritized speed when compiling information and feeding it to the end user. Those applications were revolutionary at the time, but the underlying large language models (LLMs) occasionally made mistakes or provided incomplete answers.

In 2025, next-generation "reasoning" models are solving that problem by autonomously cleaning up errors in the background before rendering responses. To put it another way, they spend time thinking to ensure the information they provide is as accurate as possible. This comes with a downside -- reasoning models take longer to generate answers, and they require significantly more computing capacity than their predecessors.

Nvidia designed a new architecture called Blackwell to power those inference workloads, and it produces up to 40 times more performance than the Hopper architecture. But it might not be enough, because Huang says some reasoning models consume a staggering 1,000 times more tokens (words, punctuation, and symbols) than the old one-shot LLMs.

Huang says the Blackwell-based GB200 GPU NVLink 72 is the best system on the market for reasoning inference workloads right now, but Nvidia is also gearing up to ship its new Blackwell Ultra GB300 GPUs this year, which will offer even more performance. Hardware needs to keep getting better; otherwise, reasoning models will take too long to generate responses, and people simply won't use them anymore.

An annual opportunity worth $1 trillion

Nvidia's data center business generated $39.1 billion in revenue during the fiscal 2026 first quarter, a 73% increase from the year-ago period. It now accounts for 89% of the company's total revenue, so it's the main point of focus for investors.
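For readers who want to see what those quoted figures imply, here is a quick back-of-the-envelope sketch. The article states only data center revenue, its year-over-year growth rate, and its share of total revenue; the year-ago and total figures below are derived from those numbers, not quoted by the article.

```python
# Figures quoted in the article
data_center_rev = 39.1    # $ billions, fiscal 2026 Q1
yoy_growth = 0.73         # 73% year-over-year growth
share_of_total = 0.89     # data center's share of total revenue

# Derived (not quoted) figures
year_ago_rev = data_center_rev / (1 + yoy_growth)   # implied year-ago quarter
total_rev = data_center_rev / share_of_total        # implied total quarterly revenue

print(f"Implied year-ago data center revenue: ${year_ago_rev:.1f}B")
print(f"Implied total quarterly revenue:      ${total_rev:.1f}B")
```

That works out to roughly $22.6 billion in data center revenue a year earlier and about $43.9 billion in total quarterly revenue, consistent with the 73% growth and 89% share the article cites.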

At Nvidia's GTC conference in March, Huang told the audience that AI infrastructure spending will keep growing and could surpass $1 trillion annually by 2028, thanks to the incredible demand for inference computing capacity from reasoning models. Then, in his conference call with investors for the fiscal 2026 first quarter, he said Nvidia is on track to fill "most" of that demand, so the company's data center revenue probably still has room to soar.

Nvidia's hardware remains leaps and bounds ahead of the competition. Plus, it isn't just about chips -- the company covers the entire stack, selling networking equipment and providing a software platform called CUDA, which developers use to optimize its GPUs for specific tasks. Once data center operators are locked into the Nvidia ecosystem, it becomes very inconvenient (and expensive) to switch.

Nvidia stock looks like a bargain relative to its history

Based on Nvidia's $3.19 in trailing-12-month earnings per share (EPS), its stock is trading at a price-to-earnings (P/E) ratio of 44.3. That's a 26% discount to its 10-year average of 59.8, suggesting it's undervalued right now.

Plus, Wall Street's consensus estimate (provided by Yahoo! Finance) suggests Nvidia will generate $4.28 in EPS for the whole of fiscal 2026, placing its stock at a forward P/E ratio of just 32.1.

NVDA PE Ratio data by YCharts

In other words, Nvidia stock would have to soar by 38% over the next year or so just to maintain its current P/E ratio, or by 86% for its P/E ratio to trade in line with its 10-year average, assuming Wall Street's EPS forecast proves to be accurate.
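The valuation arithmetic above can be sketched in a few lines. All inputs are the figures quoted in the article (trailing EPS, trailing and forward P/E, and the 10-year average P/E); the key step is that, holding the EPS forecast fixed, the price must rise by the ratio of the target P/E to today's forward P/E.

```python
# Figures quoted in the article
trailing_eps = 3.19       # trailing-12-month EPS
trailing_pe = 44.3        # current P/E ratio
forward_pe = 32.1         # forward P/E on fiscal 2026 consensus EPS
ten_year_avg_pe = 59.8    # 10-year average P/E

# Implied share price from the trailing figures
price = trailing_eps * trailing_pe  # roughly $141

# With the EPS forecast fixed, the price gain needed to hit a target P/E
# is (target P/E / forward P/E) - 1
gain_to_hold_current_pe = trailing_pe / forward_pe - 1
gain_to_reach_avg_pe = ten_year_avg_pe / forward_pe - 1

print(f"Implied share price: ${price:.2f}")
print(f"Gain to maintain current P/E:   {gain_to_hold_current_pe:.0%}")
print(f"Gain to reach 10-yr average P/E: {gain_to_reach_avg_pe:.0%}")
```

Running this reproduces the article's figures: about a 38% gain to keep the P/E at 44.3, and about 86% to match the 10-year average of 59.8.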

However, investors should stay focused on the longer term, because if Jensen Huang is right about where inference demand and data center spending are headed, Nvidia's stock could be orders of magnitude higher than where it is today.