AMD Unveils Next-Gen AI Chips to Challenge Nvidia’s Dominance
Updated June 12, 2025
Advanced Micro Devices (AMD) has revealed details about its upcoming Instinct MI400 series AI chips, slated for release next year. These next-generation AI chips are designed to compete directly with Nvidia in the rapidly expanding artificial intelligence market.

The MI400 chips can be assembled into a full server rack, known as Helios, enabling thousands of chips to function as a unified "rack-scale" system, according to AMD. CEO Lisa Su highlighted the unified system architecture at a launch event in San Jose, California.
OpenAI CEO Sam Altman, who appeared alongside Su, confirmed that his company would utilize the AMD chips. Altman expressed initial skepticism about the chip specifications but acknowledged their potential impact.
AMD’s rack-scale design presents the chips as a single system, crucial for AI customers like cloud providers and large language model developers who require hyperscale AI clusters spanning entire data centers.
Su compared Helios to Nvidia’s Vera Rubin racks, also expected next year, emphasizing that Helios functions as a massive compute engine.

This technology positions AMD’s chips to rival Nvidia’s Blackwell chips, which already integrate 72 graphics-processing units. AMD aims to undercut Nvidia with aggressive pricing and lower operational costs, according to Andrew Dieckmann, AMD’s general manager for data center GPUs.
While Nvidia currently dominates the data center GPU market, AMD is focusing on open software frameworks and hardware improvements to gain ground. Su noted that AMD's MI355X outperforms Nvidia's Blackwell chips, even with Nvidia's proprietary CUDA software.
AMD has invested in 25 AI companies in the past year, including the acquisition of ZT Systems, to enhance its full-stack solutions for complex AI systems, Su said.
The current flagship AMD AI chip, the Instinct MI355X, began shipping last month and will be available for rent from cloud providers in the third quarter. AMD believes its new chips offer superior inference capabilities compared to Nvidia's, due to higher high-speed memory capacity.
Meta representatives have stated they are using AMD CPUs and GPUs for inference with their Llama models and plan to purchase AMD's next-generation servers. Microsoft also utilizes AMD chips for its Copilot AI features.
AMD claims its MI355X delivers 40% more tokens per dollar than Nvidia’s chips due to lower power consumption. The company anticipates the AI chip market exceeding $500 billion by 2028.
What’s next
Looking ahead, both AMD and Nvidia are committed to annual AI chip releases, highlighting the intense competition and the critical importance of cutting-edge AI technology for major tech companies.
