Nvidia’s shares neared a market capitalization of $1 trillion in extended trading on Wednesday after the company reported a surprisingly strong forward-looking outlook, with chief executive Jensen Huang saying the company was in for a “record year.”
Sales rose due to increased demand for Nvidia’s graphics processing units (GPUs), which power artificial intelligence applications like those of Google, Microsoft and OpenAI.
Demand for AI chips in data centers led Nvidia to forecast sales of $11 billion for the current quarter, far above analyst estimates of $7.15 billion.
“The turning point was generative AI,” Huang said in an interview with CNBC. “We know CPU scaling has slowed down, we know accelerated computing is the way to go, and then the killer app came along.”
Nvidia believes the industry is undergoing a distinct shift in the way computers are built, one that could result in even more growth: data center components could eventually become a $1 trillion market, Huang says.
Historically, the most important part of a computer or server has been the central processor, or CPU. That market was dominated by Intel, with AMD as its main rival.
With the advent of computing-intensive AI applications, the GPU has taken center stage, and the most advanced systems pair as many as eight GPUs with a single CPU. Nvidia currently dominates the market for AI GPUs.
“The data center of the past, which was largely CPUs for file retrieval, is going to be generative data in the future,” Huang said. “Instead of retrieving data, you’re going to retrieve some data, but you’ve got to generate most of the data using AI.”
“So instead of millions of CPUs, you’ll have a lot fewer CPUs, but they will be connected to millions of GPUs,” he continued.
For example, Nvidia’s own DGX systems, which are essentially an AI training computer in a single box, use eight of Nvidia’s high-end H100 GPUs and just two CPUs.
Google’s A3 supercomputer pairs eight H100 GPUs with a single high-end Xeon processor made by Intel.
That’s one reason Nvidia’s data center business grew 14% during the first calendar quarter, versus flat growth for AMD’s data center unit and a 39% decline in Intel’s data center and AI business unit.
Nvidia’s GPUs also tend to be more expensive than many central processors. Intel’s most recent generation of Xeon CPUs can cost as much as $17,000 at list price, while a single Nvidia H100 can sell for $40,000 on the secondary market.
Nvidia will face increased competition as the AI chip market heats up. AMD has a competitive GPU business, especially in gaming, and Intel has its own line of GPUs as well. Startups are creating new kinds of chips built specifically for AI, and mobile-focused companies like Qualcomm and Apple keep pushing the technology so that one day it may run in your pocket rather than on a giant server farm. Google and Amazon are designing their own AI chips.
But Nvidia’s high-end GPUs remain the chip of choice for today’s companies building applications like ChatGPT, which are expensive to train by crunching terabytes of data and expensive to run later in a process called “inference,” in which the model generates text or images, or makes predictions.
Analysts say that Nvidia remains at the forefront of AI chips due to its proprietary software that makes it easy to use all of the GPU’s hardware features for AI applications.
Huang said Wednesday that the company’s software would not be easy to replicate.
“You have to design all the software and all the libraries and all the algorithms, integrate them into and optimize the frameworks, and optimize them for the architecture — not just one chip but the architecture of an entire data center,” he said on a call with analysts.