AMD Just Made A Massive Deal With OpenAI – And Caught Nvidia By Surprise
Advanced Micro Devices (AMD) has earned its reputation for innovation in tech and gaming, creating some of the best graphics cards in 2025. But the company has now taken that innovation to another level, announcing a deal with OpenAI, the company behind ChatGPT. Under the agreement, AMD will supply the AI giant with up to six gigawatts' worth of its Instinct GPUs, beginning with one gigawatt of deployments in the second half of 2026. It's an enormous amount of processing power, enough to run some of the world's most advanced AI models. AMD is also giving OpenAI warrants for up to a roughly 10% stake in the company, a move that caught Nvidia by surprise.
Nvidia, AMD's top competitor in GPUs, reached its own deal with OpenAI just weeks earlier. That partnership would see Nvidia supply 10 gigawatts of its own systems, along with up to $100 billion in investment in OpenAI's infrastructure. During an appearance on CNBC's "Squawk Box" (via YouTube), Nvidia CEO Jensen Huang somewhat downplayed the deal's importance when asked if he had any prior knowledge of it, saying "Our deal is very different than theirs," and later pointing out that Nvidia still controls most of the AI chip market.
Despite Nvidia's public position on the topic, there's no denying that this new deal could change the face of the industry moving forward. It definitely changes things for AMD, whose stock jumped a whopping 24% the same day the announcement was made, the company's biggest single-day gain since 2002.
AMD and OpenAI's collaborative journey
AMD's association with OpenAI began in 2023 with the introduction of the Instinct MI300X accelerators. Designed to handle large AI workloads, the cutting-edge MI300X stood out for its memory capacity, packing 192 GB of HBM3 onto each GPU. That headroom lets companies run even the most complex AI models on a regular basis without slowing down their systems in the process. AMD's goal for its collaboration with OpenAI is to let cloud providers and other partners do their work more efficiently than ever before.
Fast forward to 2024, and Microsoft started using AMD's new GPUs in its Azure ND MI300X virtual machines. These machines power the Azure OpenAI Service, including GPT-3.5 and GPT-4, both of which include a formerly paywalled feature that is now free. The setup allowed Hugging Face, an open-source platform and AI community, to achieve strong performance without having to rewrite any of its existing software. In essence, this was a proof of concept for AMD, whose technology was doing exactly what it was built to do.
The combination of the MI300X and AMD's ROCm software proved to be just what companies like Microsoft were looking for. Because each AMD GPU is so capable, Microsoft's AI models needed fewer of them to run, ultimately saving the company money. This was an enterprise-level solution that would pave the way for AMD's future, and for OpenAI's future as well.
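For readers curious what "without having to rewrite any existing software" means in practice, here is a minimal sketch, assuming a ROCm build of PyTorch running on an MI300X host. It is illustrative only, not taken from AMD's, Microsoft's, or Hugging Face's actual code.

```python
# Minimal sketch: the same torch.cuda code path used on Nvidia hardware
# runs unchanged on an AMD Instinct GPU, because PyTorch's ROCm build
# exposes the device through the familiar "cuda" API.
import torch

print(torch.version.hip)              # set on ROCm builds of PyTorch, None on CUDA builds
print(torch.cuda.is_available())      # True when a supported GPU (AMD or Nvidia) is visible
print(torch.cuda.get_device_name(0))  # reports the accelerator by name

device = torch.device("cuda")         # identical device string on both vendors' hardware
x = torch.randn(4096, 4096, device=device)
y = x @ x                             # the matrix multiply executes on the GPU
print(y.shape)
```

The point is that ROCm's PyTorch backend answers to the same "cuda" device name used on Nvidia hardware, which is why existing model code can move over largely untouched.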