Chip designer Nvidia NVDA-Q forecast fiscal fourth-quarter revenue above Wall Street targets
on Tuesday as supply-chain issues ease, but a cloudy outlook for China sales weighed on shares that have more than tripled in price this year.
Nvidia, which outsources manufacturing to chip makers like TSMC, has said it expects supply for its AI chips to improve each quarter, with the company making prepayments and placing noncancellable orders to ensure suppliers prioritize its chips.
Demand for artificial-intelligence servers has grown rapidly, with research firm TrendForce estimating that shipments will rise about 40% this year, thanks to their use in powering products like OpenAI’s ChatGPT.
But the red-hot market for AI chips is playing out against the backdrop of vastly expanded U.S. export controls on what Nvidia can sell to China. While the company said on Tuesday that it expects demand from other countries to more than make up for those lost sales in the short term, it conceded that “our competitive position has been harmed, and our competitive position and future results may be further harmed over the long-term” if rules are tightened further.
Shares slipped about 1% in after-hours trading following the quarterly results. The shares had risen more than 240% this year.
“Investors appear to be concerned over the negative impact of widening U.S. curbs on sales of its high-end chips to China,” said Jesse Cohen, senior analyst at Investing.com.
Nvidia also faces risks in Israel, where its networking business is headquartered and whose military is embroiled in a conflict in Gaza. Sales from that unit, whose gear is used in AI supercomputers, rose 155% from a year earlier as the company sold more equipment for those systems.
Nvidia said a significant number of its Israel-based employees have been called up for active military duty, and that their absence could hurt its future operations if the war continues.
Despite the expanded China chip export curbs, analysts expect Nvidia’s order books to be full until at least August next year, as demand for its AI chips, especially in the United States, continues to outstrip supply.
The company expects current-quarter revenue of $20 billion, plus or minus 2%. Analysts polled by LSEG expect revenue of $17.86 billion.
Third-quarter revenue rose 206% to $18.12 billion, beating estimates of $16.18 billion.
Data centre revenue jumped 41% from the prior quarter to $14.51 billion, while gaming revenue rose 15% to $2.86 billion over the same period.
Excluding items, the company earned $4.02 per share, beating estimates of $3.37 a share.
But the company also gave some warnings, saying that a quarter of its data centre sales come from China and that other regions, such as the Middle East, are now affected by new U.S. export controls.
“We expect that our sales to these destinations will decline significantly in the fourth quarter of fiscal 2024, though we believe the decline will be more than offset by strong growth in other regions,” Chief Financial Officer Colette Kress said in prepared remarks.
The chip designer has already developed three new products for the Chinese market in response to the expanded U.S. export controls, a source and analysts said. Jacob Bourne, an analyst at Insider Intelligence, said those China-focused chips could tie up vital research resources at Nvidia on products that, like its earlier China-specific chips, could ultimately end up banned.
“Nvidia’s move to develop specialized chips for the Chinese market, while a strategic response to export restrictions, faces challenges,” Bourne said.
U.S. officials unveiled a new batch of restrictions in October and said they will continue to update them as needed.
Last week, the company also introduced a new AI chip called the H200, which will offer superior performance to Nvidia’s current top H100 processor.
The H200 includes additional high-bandwidth memory, one of the most expensive parts of the chip, which determines how much data it can crunch quickly.
Rival Advanced Micro Devices had earlier touted the quantity of high-bandwidth memory on one of its competing AI chips.
Major tech companies including Alphabet’s Google, Amazon.com and most recently Microsoft have announced AI chips produced by in-house design teams in addition to purchasing Nvidia’s hardware for their own data centres.
Building custom chips can cost hundreds of millions of dollars and take years, but gives the major cloud companies the ability to include features tied specifically to their AI needs.
Microsoft unveiled a duo of custom-designed computing chips earlier this month, one of which can run large language models.