Tag: Instinct MI350

AMD's data center GPUs are updated annually to compete with NVIDIA: the Instinct MI400 series will debut in 2026, and the MI500 is expected to take over in 2027.

AMD Reveals Long-Term Development Strategy: Targeting the Trillion-Dollar Computing Market and Aiming for 50% Market Share in Server CPU Revenue.

Earlier at its Financial Analyst Day 2025, AMD unveiled its long-term strategy and product technology roadmap. AMD Chair and CEO Lisa Su emphasized that the company is entering a new era of growth driven by its technology roadmap and AI momentum, and set ambitious long-term financial goals: a compound annual growth rate (CAGR) of over 35% in revenue and non-GAAP earnings per share (EPS) of over $20 within the next 3 to 5 years.

Data Center: MI450 to Debut in 2026, MI500 to Take Over in 2027. In the highly anticipated AI data center sector, AMD confirmed its product line update schedule:

• Instinct MI350 series: Described as the fastest-growing product in AMD's history, it has already been deployed at scale by customers such as Oracle Cloud Infrastructure (OCI).
• Instinct MI450 series (Helios systems): Expected to launch in Q3 2026, delivering rack-scale performance with leading memory capacity and extended bandwidth.
• Instinct MI500 series: Expected to launch in 2027, further extending the AI performance roadmap.

On the server CPU side, the new-generation "Venice" CPU will offer superior performance and energy efficiency to support AI and general infrastructure needs. Leveraging the strengths of its EPYC processors, AMD expects to exceed 50% market share in server CPU revenue; in data center AI, the goal is a revenue CAGR of over 80%. ...
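To put the 35% CAGR target in perspective, a quick sketch of the revenue multiplier it implies over the stated 3-to-5-year horizon (illustrative arithmetic only, not an AMD projection):

```python
def cagr_multiplier(rate: float, years: int) -> float:
    """Total growth multiplier implied by a constant annual growth rate."""
    return (1 + rate) ** years

# AMD's stated target: >35% revenue CAGR over the next 3 to 5 years.
low = cagr_multiplier(0.35, 3)   # roughly 2.46x revenue after 3 years
high = cagr_multiplier(0.35, 5)  # roughly 4.48x revenue after 5 years
print(f"3-year multiplier: {low:.2f}x, 5-year multiplier: {high:.2f}x")
```

In other words, sustaining 35% annual growth would mean roughly doubling and a half in three years, and more than quadrupling in five.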

AMD showcases comprehensive AI solutions at AI Solutions Day, including 5th Gen EPYC processors and the Instinct MI350 GPU.

AMD held its "2025 AMD AI SOLUTIONS DAY" in Taipei today (September 4th), themed "Starting with Computing Power, Leading the Infinite Possibilities of AI." Partnering with over 30 OEMs/ODMs, ISVs, and embedded partners, AMD showcased its complete product portfolio, spanning 5th Gen EPYC server processors, Instinct MI350 GPUs, Pensando AI NICs, and Ryzen AI PCs. Through open architecture and cross-domain collaboration, AMD aims to accelerate the adoption of AI across industries, extending its AI computing footprint from data centers to the edge and personal devices.

▲AMD held "2025 AMD AI SOLUTIONS DAY" in Taipei.

Lin Chien-cheng, Senior Vice President of Commercial Business at AMD Taiwan, noted that AI development is rapidly entering a new phase: from the training and inference of large-scale models to the rise of AI agents, placing higher demands on compute efficiency and platform flexibility. AMD draws on its full product line, including EPYC CPUs, Instinct GPUs, Pensando DPUs, Ryzen AI CPUs, Radeon AI GPUs, and Versal SoCs, to build a complete end-to-end AI computing solution. Layered on top, the latest ROCm 7 open-source software stack further optimizes for generative AI and high-performance computing workloads.

AMD officially launched the Instinct MI350 series, built on the CDNA 4 architecture, directly challenging NVIDIA in AI inference.

At the Advancing AI 2025 event in San Jose, California, AMD officially announced its new generation of high-performance computing accelerators, the Instinct MI350 series, featuring the CDNA 4 architecture and up to 288GB of HBM3e high-bandwidth memory, with a claimed 35x improvement in AI inference performance over the Instinct MI300 series. AMD also reiterated that the Instinct MI400 series accelerators will launch in 2026 as part of the next-generation rack-scale platform codenamed "Helios".

▲The Instinct MI350 series, featuring the CDNA 4 architecture, competes with NVIDIA in accelerated AI inference.

The Instinct MI350 series is divided into the Instinct MI355X and Instinct MI350X designs. Both adopt the CDNA 4 architecture and are equipped with 288GB of HBM3e high-bandwidth memory supplied by Micron and Samsung, supporting a data transfer rate of 8TB per second. They differ in peak throughput: the Instinct MI355X reaches 79 TFLOPS in FP64 and 20 PFLOPS in FP4, while the Instinct MI350X reaches 72 TFLOPS and 18.4 PFLOPS respectively.

▲The Instinct MI350 series is differentiated into the Instinct MI355X and Instinct MI350X designs.

▲The Instinct MI350 series uses an OAM carrier board, with a single carrier board able to host up to 8 accelerators. Additionally, the Instinct...
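Using only the peak-throughput figures quoted above, the MI355X's edge over the MI350X can be expressed as simple ratios (illustrative arithmetic, not a benchmark):

```python
# Peak-throughput figures quoted in the article (marketing peaks, not benchmarks).
mi355x = {"fp64_tflops": 79, "fp4_pflops": 20}
mi350x = {"fp64_tflops": 72, "fp4_pflops": 18.4}

fp64_gain = mi355x["fp64_tflops"] / mi350x["fp64_tflops"]  # ~1.10x
fp4_gain = mi355x["fp4_pflops"] / mi350x["fp4_pflops"]     # ~1.09x
print(f"MI355X over MI350X: FP64 +{fp64_gain - 1:.1%}, FP4 +{fp4_gain - 1:.1%}")
```

So on paper the MI355X offers roughly a 9-10% peak-compute advantage at both precisions, with memory capacity and bandwidth identical between the two parts.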

AMD officially launched its fifth-generation EPYC server processor, codenamed Turin, and simultaneously launched an AI accelerator to compete with NVIDIA H200.

Following the announcement earlier this year of the fifth-generation EPYC server processor codenamed Turin (after the major industrial city in northern Italy), AMD officially confirmed that the processor has begun shipping and has been adopted by ODMs and cloud service providers such as Cisco, Dell, HPE, Lenovo, and Supermicro. At the same time, AMD launched the Instinct MI325X accelerator based on the CDNA 3 architecture, advancing high-performance, optimized artificial intelligence solutions.

Named the 9005 series, the fifth-generation EPYC processor is built on the Zen 5 architecture, remains compatible with the existing SP5 socket platform, and offers a wide range of core counts from 8 to 192. It emphasizes a balance between performance and energy efficiency, with the flagship 192-core part offering up to 2.7 times the performance of competing products. Compared with the earlier Zen 4 parts, the Zen 5-based processors deliver up to a 17% improvement in instructions per cycle (IPC) for enterprise and cloud workloads, and up to a 37% IPC improvement for artificial intelligence and high-performance computing.

For example, the newly launched 192-core EPYC 9965 offers up to 4x faster video transcoding in business applications than Intel's Xeon Platinum 8592+, up to 3.9x faster time-to-insight in scientific and high-performance computing applications, and up to 1.6x better per-core performance in virtualized infrastructure. In end-to-end AI workloads such as a TPCx-AI-derived benchmark, the EPYC 9965 delivers up to 3.7x better performance, and in small-to-medium enterprise generative AI models such as Meta Llama 3.1-8B it offers 1.9x the data throughput of comparable competing parts.

The newly added 64-core EPYC 9575F is designed for AI deployments that demand extreme host-CPU performance alongside GPU acceleration. It boosts up to 5GHz, versus 3.8GHz for competing products, a gap AMD characterizes as a 28% speed advantage. This lets the host CPU keep GPUs fed under demanding AI workloads, enabling a 1,000-node AI computing cluster to drive over 700,000 inference tokens per second and complete multiple tasks faster.

Other features of the 9005 series include derivatives based on the dense Zen 5c architecture; support for up to 12 channels of DDR5 memory at up to DDR5-6400 MT/s; a full 512-bit data path for the AVX-512 instruction set; trusted I/O ports for confidential computing; and FIPS certification in progress across the series to ensure system security.

The Instinct MI325X accelerator, launched simultaneously, uses HBM3E high-bandwidth memory with up to 256GB of capacity and a transfer rate of 6.0TB/s, which AMD says amounts to 1.8x the memory capacity and 1.3x the data transfer bandwidth of NVIDIA's H200 accelerator. Furthermore, it features Mistral...
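The MI325X-versus-H200 memory claims can be sanity-checked with the published numbers. The H200 figures below (141 GB HBM3e, 4.8 TB/s) come from NVIDIA's public datasheet rather than this article, so treat them as an outside assumption:

```python
# Article's MI325X specs vs NVIDIA H200 datasheet specs (outside assumption).
mi325x_gb, mi325x_tbs = 256, 6.0
h200_gb, h200_tbs = 141, 4.8

capacity_ratio = mi325x_gb / h200_gb      # ~1.82, matching AMD's "1.8x" claim
bandwidth_ratio = mi325x_tbs / h200_tbs   # 1.25, which AMD rounds up to "1.3x"
print(f"capacity: {capacity_ratio:.2f}x, bandwidth: {bandwidth_ratio:.2f}x")
```

The capacity claim checks out almost exactly; the bandwidth claim is a generous rounding of 1.25x.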
