Nvidia said its AI chip revenue opportunity could exceed $1 trillion by 2027, highlighting growing demand for real-time inference computing across industries.
CEO Jensen Huang unveiled new processors and AI systems at the GTC conference, signaling a stronger push into AI inference, a rapidly expanding segment.
Inference computing, which powers real-time responses, is emerging as a key battleground where Nvidia faces rising competition from CPUs and custom chips.
Nvidia has long dominated AI training, but companies are shifting toward deploying models at scale, driving demand for efficient inference solutions.
Huang introduced a two-step inference approach, combining Nvidia’s Vera Rubin chips with licensed Groq technology to accelerate processing and improve performance.
The company expects inference demand to surge as firms like OpenAI and Meta expand services to millions of users globally.
Nvidia also highlighted growing traction for its CPUs, with Huang projecting a multi-billion-dollar business as adoption increases across AI deployments.
Despite investor concerns, analysts say Nvidia’s roadmap reinforces its leadership as AI moves from experimentation to large-scale commercialization.
