- Chip giant Nvidia has long positioned its graphics processing units (GPUs) as ideal for training and deploying artificial intelligence (AI) applications
- The current popularity of generative AI applications, such as ChatGPT, has driven increasing demand for the vendor’s products
- Its sales forecast for the current quarter lit a fire under its share price, which jumped by 28%
If you’re looking for further evidence of how the growing interest in generative AI (GenAI) applications is affecting the tech sector, look no further than Nvidia’s fiscal first-quarter results and share price hike in the past 24 hours.
The Santa Clara, California-based chip giant has seen a spike in demand, mainly from the large cloud platform companies, for its graphics processing units (GPUs), which are used to train and deploy AI applications. The GPUs can handle these workloads thanks to the company’s CUDA platform and API, which opens the hardware up to general-purpose computing tasks beyond graphics, as sketched in the example below.
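To give a flavour of what that general-purpose programming model looks like, here is a minimal, illustrative CUDA sketch (not taken from Nvidia’s report): a kernel that adds two vectors element-wise, with each GPU thread handling one element.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Illustrative only: each GPU thread computes one element of c = a + b,
// which is the basic data-parallel model CUDA exposes on top of the GPU.
__global__ void vectorAdd(const float *a, const float *b, float *c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        c[i] = a[i] + b[i];
    }
}

int main() {
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(float);

    float *a, *b, *c;
    // Unified (managed) memory is accessible from both CPU and GPU.
    cudaMallocManaged(&a, bytes);
    cudaMallocManaged(&b, bytes);
    cudaMallocManaged(&c, bytes);

    for (int i = 0; i < n; ++i) {
        a[i] = 1.0f;
        b[i] = 2.0f;
    }

    // Launch enough 256-thread blocks to cover all n elements.
    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    vectorAdd<<<blocks, threads>>>(a, b, c, n);
    cudaDeviceSynchronize();

    printf("c[0] = %f\n", c[0]);  // expect 3.0

    cudaFree(a);
    cudaFree(b);
    cudaFree(c);
    return 0;
}
```

The same pattern scales up to the matrix multiplications at the heart of AI training and inference, which is why the large cloud providers buying Nvidia’s GPUs build their AI stacks on top of CUDA.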
That spike in demand led to higher-than-expected fiscal first-quarter sales and a dramatic boost in product orders.
For the three months to 30 April 2023, Nvidia reported revenues of $7.19bn, much higher than the $6.52bn expected by Wall Street financial analysts. Its operating profit of $2.14bn was also higher than expected.
But it’s Nvidia’s forecast for the current quarter, which ends in late July, that really shows how GenAI-driven demand has lit a fire under its business: the company expects revenues of around $11bn, about 50% higher than the analysts who track the company had predicted.
“The computer industry is going through two simultaneous transitions – accelerated computing and generative AI,” noted Jensen Huang, Nvidia’s founder and CEO, in the company’s earnings report. “A trillion dollars of installed global datacentre infrastructure will transition from general purpose to accelerated computing as companies race to apply generative AI into every product, service and business process. Our entire datacentre family of products — H100, Grace CPU, Grace Hopper Superchip, NVLink, Quantum 400 InfiniBand and BlueField-3 DPU — is in production. We are significantly increasing our supply to meet surging demand for them,” he added.
The news had investors racing to snap up Nvidia’s stock, which gained 25% in value to hit $381.85 in early trading on the Nasdaq exchange on Thursday, taking the company’s market valuation to just over $940bn. By contrast, Intel’s stock fell by 4.4% to $27.73, leaving its market valuation at $117bn, while AMD, which has a range of AI chips due out soon, benefited from Nvidia’s success: its stock gained 8% in value to hit $116.97, giving it a market valuation of almost $190bn.
But despite the looming challenge from AMD, analysts expect Nvidia to benefit from the shift towards AI-enabled enterprise processes for some time.
“Nvidia is swimming in a tsunami of orders for its products, which has meant that FQ2 [second fiscal quarter] guidance was way ahead of expectations, and the company also hinted that even more is to come in the second half of the fiscal year,” noted long-time tech industry analyst Richard Windsor on his Radio Free Mobile blog. Windsor also pointed out that Nvidia is “ramping up sourcing of manufacturing capacity for the second half [of the fiscal year] in a clear sign that it expects demand to go up again.”
And there’s good reason for that, noted Windsor.
“Nvidia is currently by far the market leader in the supply of silicon chips for AI training as it is more than five years ahead of its peers and as such has locked down the developer community. AI developers love using the CUDA platform, which is only available for Nvidia silicon, which is exactly how Nvidia’s business model works,” added the analyst.
As a result, it has a position that is the envy of its chip rivals. “Nvidia can lay claim to being the picks and shovels of the AI gold rush and its competitors are really struggling to make any dent in its dominance, meaning that the short- to medium-term outlook is excellent.”
But that position won’t last forever, noted Windsor, as “the hype cycle of super-intelligent AI will hit the buffers sooner or later” because GenAI applications “remain as stupid as ever despite having incredible data retention and regurgitation skills.” However, “while generative AI dominates public discourse… Nvidia will do very well,” added the analyst.
- Ray Le Maistre, Editorial Director, TelecomTV