In the race to offer new and improved generative AI services, tech companies need powerful new GPUs for the growing number of servers that run these workloads. NVIDIA has been, without a doubt, the leader in supplying these kinds of chips to companies like Microsoft, and it's been making a ton of money as a result.
Today, one of NVIDIA's biggest rivals, AMD, announced the latest version of its own generative AI chips with the introduction of the AMD Instinct MI300 Series. Microsoft was among the companies that said it would use AMD's new chips for its services. Specifically, Microsoft said it would use the AMD Instinct MI300X in its Azure ND MI300x v5 virtual machine series.
AMD stated that the new Instinct MI300X uses its CDNA 3 architecture. It compared the chip's performance to that of the NVIDIA H100 GPUs that many companies, including Microsoft, have been using for AI services:
Compared to the Nvidia H100 HGX, the AMD Instinct Platform can offer a throughput increase of up to 1.6x when running inference on LLMs like BLOOM 176B and is the only option on the market capable of running inference for a 70B parameter model, like Llama2, on a single MI300X accelerator; simplifying enterprise-class LLM deployments and enabling outstanding TCO.
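To give a sense of what "inference on a single accelerator" means in practice, here is a minimal sketch of loading a 70B-parameter Llama 2 model for generation with the Hugging Face transformers library. The model ID, half-precision choice, and ROCm/PyTorch setup are illustrative assumptions, not details from AMD's announcement; the point is simply that roughly 140 GB of fp16 weights can fit in the MI300X's large on-device memory without splitting the model across GPUs.

```python
# Illustrative sketch only: single-device inference for a 70B model.
# Assumes a ROCm-enabled PyTorch build (MI300X exposes itself through
# PyTorch's "cuda" backend) and enough device memory for fp16 weights.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Hypothetical choice of checkpoint; this repo is gated on Hugging Face.
model_id = "meta-llama/Llama-2-70b-chat-hf"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # ~2 bytes/param -> ~140 GB of weights
    device_map="auto",          # places the whole model on one device if it fits
)

prompt = "Explain what an AI accelerator does in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

On accelerators with less memory, the same model would have to be sharded across several devices or quantized, which is exactly the deployment complexity AMD's claim is about avoiding.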
AMD added that Meta and Oracle will be among the other companies using the Instinct MI300X chips for their own generative AI services.
This is just the latest example of the push to meet the massive demand for AI GPUs. Indeed, Microsoft announced a few weeks ago at Ignite 2023 that it would make its own in-house AI chip, the Azure Maia AI Accelerator. It will be used to help power Microsoft Copilot and the Azure OpenAI Service sometime in early 2024.