During a media Q&A session at CES 2026, NVIDIA CEO Jensen Huang once again shared his grand vision for the future of artificial intelligence. In response to media questions about Moore's Law, energy efficiency, supply in the Chinese market, and the next-generation architecture, Huang stated that we are currently at the beginning of a new industrial revolution, and data centers are transforming into "AI factories".
A one-year update schedule: from Blackwell to Vera Rubin
In response to media inquiries about how NVIDIA maintains its phenomenal growth rate, Jensen Huang provided a clear technology roadmap. He pointed out that NVIDIA is currently moving at full speed, releasing a new architecture every year. From Hopper and Blackwell to the latest Vera Rubin architecture, the performance improvements of each generation are not linear but exponential leaps.
"We're putting Moore's Law on steroids," Huang quipped. He explained that by co-designing across the entire technology stack, from CPUs and GPUs to network chips and switches, NVIDIA can achieve in a single year performance gains that would otherwise take several. Blackwell over Hopper, and the upcoming Vera Rubin over Blackwell, each deliver roughly a 10x improvement in inference cost and energy efficiency.
Energy efficiency equals revenue: The new economics of AI factories
In response to concerns about the energy consumption of AI, Jensen Huang put forward a counterintuitive but commercially sound view: extreme energy efficiency is the key to customer profitability.
He argued that modern data centers are power-limited: with a fixed power budget, a tenfold improvement in chip energy efficiency (performance per watt) lets customers produce ten times as many tokens at the same power draw. For "AI factories" that treat AI tokens as their product, that translates directly into a tenfold increase in revenue.
Therefore, NVIDIA's pursuit of ultimate performance is not simply for speed, but also to reduce the cost per token, which is the economic driving force behind the popularization of generative AI.
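The power-limited economics Huang describes can be sketched with some quick arithmetic. The figures below (facility power budget, tokens per joule, token price) are purely illustrative assumptions, not NVIDIA data; the point is only that at a fixed power budget, revenue scales linearly with energy efficiency.

```python
# Illustrative sketch of the "energy efficiency equals revenue" argument.
# All numbers are hypothetical assumptions chosen for demonstration.

def tokens_per_second(power_budget_watts: float, tokens_per_joule: float) -> float:
    """Token throughput for a power-limited AI factory:
    watts (joules/second) times tokens/joule gives tokens/second."""
    return power_budget_watts * tokens_per_joule

POWER_BUDGET = 100e6            # assumed 100 MW facility, fixed by the grid
PRICE_PER_MILLION_TOKENS = 2.0  # assumed $2 per million tokens sold

for generation, tpj in [("baseline", 1.0), ("10x more efficient", 10.0)]:
    tps = tokens_per_second(POWER_BUDGET, tpj)
    revenue_per_hour = tps * 3600 / 1e6 * PRICE_PER_MILLION_TOKENS
    print(f"{generation}: {tps:.2e} tokens/s, ${revenue_per_hour:,.0f}/hour")
```

Under these assumptions the tenfold jump in tokens per joule yields exactly ten times the hourly revenue at the same power draw, which is the economic logic behind treating efficiency, not raw speed, as the headline metric.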
Supply Chain and Geopolitics: H200's China Strategy and Memory Demand
When asked about export licenses for the Chinese market and the H200 chip, Jensen Huang confirmed that they are working closely with the US government to ensure compliance with export control regulations while meeting the market's demand for computing power. He stated that despite geopolitical challenges, customer demand remains strong, and the H200 will be shipped in accordance with regulations.
Regarding HBM (high-bandwidth memory), which is indispensable for AI chips, Huang acknowledged that NVIDIA has become one of the world's largest memory buyers. He emphasized NVIDIA's close cooperation with suppliers such as SK Hynix, Micron, and Samsung, noting that HBM production capacity and technological advances are the lifeline supporting mass production of Blackwell and, eventually, the Vera Rubin architecture.
AI in the consumer and physical world: from gaming to self-driving cars
Beyond the enterprise side, Jensen Huang also fielded questions about consumer graphics cards and physical AI. He emphasized that AI technology is not confined to servers: the same architectural optimizations (such as DLSS) also benefit GeForce RTX gamers.
In the field of autonomous vehicles, he reiterated that safety is paramount, and that the Drive Thor platform and redundant system design ensure it is never sacrificed in the pursuit of autonomy.
He also addressed the gap between the "Level 2++" label used in market discussions and the formal taxonomy defined by SAE International. In Huang's view, even though current technology has outgrown the traditional classifications, it still falls into the category where drivers must watch the road ahead and comply with local regulations, so he does not expect a gap in how safety is perceived.
Analysis: shifting the market's focus from "chip computing power" to "system-level output efficiency"
In the interview, Jensen Huang repeatedly emphasized "full-stack optimization," implying that in the future AI race, simply possessing powerful chips is not enough; one must have the ability to perfectly integrate computing, networking, and software to survive in this new industrial revolution. NVIDIA is attempting to build an insurmountable moat for its competitors through an extreme product update cycle.