AMD, a U.S. multinational tech firm, has unveiled plans to roll out its next-generation Helios AI rack at CES 2026 in Las Vegas. The company's CEO, Lisa Su, gave the first look at the Helios system during her keynote, offering new details on its design and construction.
Su showcased a large Helios rack unit on stage, touting its performance in a direct shot at Nvidia's Vera Rubin NVL72, which was also unveiled at CES 2026. Nvidia has set the standard for rack-scale systems, and the NVL72 is its latest offering.
According to Su, Helios will compete head-to-head with Nvidia's NVL systems, pitting 72 of AMD's MI455X chips against the 72 Rubin GPUs in Nvidia's latest NVL72.
AMD also shared more information about its upcoming MI500 series GPUs, which it claims will deliver up to 1,000 times the AI performance of its MI300X GPUs. Su emphasized that this kind of performance increase will be necessary over the coming years, adding that within the next five years some five billion people will actively use AI daily.
AMD also unveiled its new line of Ryzen AI 400 series PC chips, while Su showcased the MI455X processors, noting that they are an integral part of the data centers powering AI programs.
AMD also showcased its Ryzen AI Pro 400 series chips, which will take on Intel's new Core Ultra 3 processors, built on Intel's new 18A process technology.
Su further invited Generative Bionics CEO Daniele Pucci on stage to reveal the company's humanoid robot, GENE.01, for the first time. The robot, powered by AMD GPUs and CPUs, is designed to operate in industrial environments.
Su observed that tech firms will need to increase global computing capacity by at least 100 times over the next few years. That growth is expected to benefit both Nvidia and AMD, whose market caps have climbed to $4.5 trillion and $359 billion, respectively.
AMD also unveiled its latest Ryzen AI Max+ chips for light workstations, mini-PCs, and laptops, and showed off its Ryzen Halo developer platform, a mini-PC that lets developers build AI models locally rather than relying on cloud-based solutions.
The Halo developer platform competes with Nvidia's DGX Spark mini-PC, which is priced at nearly $4,000. AMD has not yet disclosed pricing details for Halo.
Meanwhile, Nvidia also unveiled its Rubin platform, which combines Rubin GPUs and a Vera CPU into a single Vera Rubin processor. Nvidia describes the Rubin platform as ideal for agentic AI and advanced reasoning models.
In addition to the Rubin GPUs and Vera CPUs, the Rubin platform includes the Nvidia BlueField-4 DPU, Nvidia NVLink 6 Switch, Nvidia Spectrum-6 Ethernet Switch, and Nvidia ConnectX-9 SuperNIC.
Combining several NVL72s yields Nvidia's DGX SuperPOD AI supercomputer. Hyperscalers, including Microsoft, Amazon, Meta, and Google, are spending billions of dollars to acquire these large systems.
Nvidia also noted that the Rubin platform is more efficient than its predecessors, which should allow roughly a fourfold reduction in the number of GPUs needed to train the same mixture-of-experts (MoE) models. Using fewer GPUs per training run means the freed-up hardware can be assigned to other tasks. Rubin will also cut inference token costs by up to 10 times.
Meanwhile, Nvidia continues to tout its AI storage offering, the Nvidia Inference Context Memory Storage. The company says this AI-driven storage is designed to store and share data generated by trillion-parameter, multi-step AI reasoning models.