At CES 2026, Arm shared its technology outlook for the AI era, with a focus on computing platforms. The company argues that 2025 marked a critical turning point in AI development, and that by 2026 AI will no longer be confined to laboratory exploration but will be deployed across everyday life, with "Physical AI" and "Edge AI" acting as the twin engines of the next wave of industrial transformation.
Physical AI and Edge AI: From the Cloud to the Real World
Arm points out that these two trends are converging at an accelerating pace: physical AI lets cars and robots understand their real-world surroundings and operate in them safely, while edge AI moves computational intelligence from the cloud back onto the user's device, improving responsiveness and protecting privacy.
For example, robots can now learn through digital twins, XR devices have become simulated training grounds, and wearables can anticipate user needs. Behind these applications is the high-performance, low-power compute of the Arm platform, which closes the full "perception-inference-execution" loop.
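To make that loop concrete, here is a minimal, illustrative sketch in Python. Every name in it (Perception, read_sensors, infer, act) is a hypothetical stand-in rather than any Arm or vendor API; a real robot would wire these stages to camera drivers, an on-device model, and motor controllers.

```python
from dataclasses import dataclass
import random
import time

# Hypothetical types and functions; real systems would connect these
# to camera/IMU drivers, an on-device inference runtime, and actuators.

@dataclass
class Perception:
    obstacle_distance_m: float  # distance to the nearest obstacle

def read_sensors() -> Perception:
    """Perception: sample the environment (stubbed with random data)."""
    return Perception(obstacle_distance_m=random.uniform(0.1, 5.0))

def infer(p: Perception) -> str:
    """Inference: turn sensor data into a decision, entirely on-device."""
    return "stop" if p.obstacle_distance_m < 0.5 else "advance"

def act(command: str) -> None:
    """Execution: drive actuators based on the decision."""
    print(f"actuator command: {command}")

def control_loop(steps: int = 5, hz: float = 10.0) -> None:
    """Run the perception-inference-execution loop at a fixed rate."""
    for _ in range(steps):
        act(infer(read_sensors()))
        time.sleep(1.0 / hz)

if __name__ == "__main__":
    control_loop()
```

The point of the structure is that all three stages run on the device itself, so the loop's latency is bounded by local compute rather than a network round trip.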
Autonomous driving: From assistance to full automation
In the automotive sector, Arm emphasizes that the industry is rapidly moving from "Software-Defined Vehicles" (SDV) to "AI-Defined Platforms".
For example, Rivian's in-house autonomous driving platform runs on a custom Arm-based chip, and Tesla's next-generation AI5 chip is likewise built on Arm, with performance 40 times that of its predecessor, underscoring how central energy efficiency is to physical AI.
Meanwhile, the NVIDIA DRIVE Thor platform supplies the core compute for WeRide's Level 4 robotaxis, as well as for autonomous-driving companies such as Nuro and Wayve.
Robots and AI PCs: The Perfect Balance Between Performance and Power Consumption
The robotics exhibition area was a major highlight of this year's CES: from Deep Robotics' wheeled robots and Roborock's cleaning robots to Agility Robotics' humanoid robots, the machines on display demonstrated autonomous navigation and complex manipulation. Most of them run on Arm-based compute modules such as NVIDIA Jetson Thor.
In personal computing, Windows on Arm has entered the mainstream, with more than 100 models expected to ship this year. The Arm architecture lets laptops and tablets (such as the Xiaomi Mi Pad 7 Ultra) sustain all-day battery life while delivering high performance, and it supports on-device AI tasks such as translation and image generation without relying on a cloud connection.
It is worth noting that the NVIDIA DGX Spark AI workstation is built around the GB10 Grace Blackwell Superchip, which integrates a 20-core Arm CPU, allowing developers to run models with up to 200 billion parameters directly on the desktop.
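A back-of-the-envelope calculation shows why models of that size fit in a desktop box. The sketch below assumes 128 GB of unified memory (DGX Spark's stated capacity) plus 4-bit weight quantization and roughly 20% runtime overhead; the last two are assumptions about a typical deployment, not official specifications.

```python
# Rough memory-footprint estimate for desktop large-model inference.
# Assumptions (not official specs): 4-bit quantized weights and ~20%
# overhead for KV cache and activations.

def model_footprint_gb(params_billions: float, bits_per_weight: int = 4,
                       overhead: float = 0.20) -> float:
    """Approximate memory needed to serve a model, in gigabytes."""
    weight_bytes = params_billions * 1e9 * bits_per_weight / 8
    return weight_bytes * (1 + overhead) / 1e9

UNIFIED_MEMORY_GB = 128  # DGX Spark's stated unified memory

for size in (70, 120, 200):
    need = model_footprint_gb(size)
    fits = "fits" if need <= UNIFIED_MEMORY_GB else "does not fit"
    print(f"{size}B params -> ~{need:.0f} GB ({fits} in {UNIFIED_MEMORY_GB} GB)")
```

Under these assumptions a 200-billion-parameter model needs about 120 GB, which just fits in 128 GB of unified memory; at higher precision it would not.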
Wearable devices and smart homes: Your personal assistant who understands you better
The advantages of edge AI show most clearly in wearables. The new Ray-Ban Meta smart glasses, paired with a neural wristband, can run spatial AI within an extremely tight power budget, while the Oura Ring 4 smart ring analyzes health data around the clock.
In the smart home sector, rising privacy awareness and the spread of the Matter standard are pulling the computing hub back onto local devices. Google Nest devices and smart TVs from brands including Samsung and LG use Arm-based processors as the home control hub, handling voice commands and automation routines on the device itself and reducing the risk of data being uploaded to the cloud.
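As an illustration of local-first automation, here is a small Python sketch of the kind of rule such a hub might evaluate entirely on-device. The device classes and methods are hypothetical stand-ins, not the Matter API or any vendor SDK.

```python
from datetime import datetime, time as dtime

# Hypothetical stand-ins for locally controlled devices; a real hub
# would talk to them over Matter/Thread rather than these stubs.

class MotionSensor:
    def triggered(self) -> bool:
        return True  # stub: pretend motion was detected

class Light:
    def turn_on(self) -> None:
        print("hallway light: on")

def after_sunset(now: datetime) -> bool:
    """Crude check standing in for a real sunrise/sunset calculation."""
    return now.time() >= dtime(18, 0)

def evaluate_rules(sensor: MotionSensor, light: Light) -> None:
    """Run the automation locally: no sensor data leaves the device."""
    if sensor.triggered() and after_sunset(datetime.now()):
        light.turn_on()

if __name__ == "__main__":
    evaluate_rules(MotionSensor(), Light())
```

Because the rule fires on the hub itself, the automation keeps working during an internet outage and the motion data never needs to reach a cloud server.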
Arm concludes that future intelligence will be seamlessly embedded throughout the technology landscape, with the Arm architecture serving as the cornerstone of this "ubiquitous intelligent future".