Nvidia CEO Jensen Huang said that reasoning models require 100 times more computing resources than traditional models. The comments come amid intensifying competition in inference computing, a market that analysts note rivals are increasingly targeting. Demand for high-performance chips is expected to grow as AI models become more complex.
Huang praised DeepSeek’s open-source reasoning model R1, saying it has become a benchmark for developers, but emphasized that future reasoning models will demand even more compute. Cloud providers continue to buy Nvidia’s high-performance chips, though analysts warn that competition is eroding the company’s market share.
The growing demand for inference computing has been driven by the maturation of AI applications. Chip startups such as Tenstorrent and Etched have gained prominence in recent months, backed by significant funding, and investors worry that custom AI chips from cloud giants like Google and Amazon could further erode Nvidia’s lead in this area.
Despite beating revenue expectations, Nvidia’s stock performance was subdued, with some analysts predicting that its share of the inference computing market could fall to 50%. The company’s earnings call underscored the challenge of maintaining dominance in AI compute as competition intensifies.
Source: https://www.businessinsider.com/nvidia-ceo-jensen-huang-says-reasoning-models-require-more-compute-2025-2