Microsoft this week rolled out the first production run of its homegrown AI chips in one of its data centers, with plans to deploy more in the coming months.
The chip, called the Maia 200, is designed to be what Microsoft calls an “electronic AI inference center,” meaning it’s optimized for the compute-intensive task of running artificial intelligence models in production. The company has released some impressive performance specs for Maia, saying it outperforms Amazon’s latest Trainium chips and Google’s latest Tensor Processing Units (TPUs).
All the cloud giants are turning to their own AI chip designs, in part because of the difficulty and cost of getting the latest and greatest from Nvidia – a supply crunch that shows no signs of abating.
But even with its own cutting-edge, high-performance chip in hand, Microsoft CEO Satya Nadella said the company will still buy chips made by others.
“We have a great partnership with Nvidia, with AMD. They innovate. We innovate,” he explained. “I think a lot of people just talk about who’s ahead. Just remember, you’ve got to be ahead forever.”
He added: “Just because we can vertically integrate doesn’t mean we only vertically integrate.” In other words, Microsoft won’t build every layer of its systems in-house to the exclusion of other vendors’ products.
That said, the Maia 200 will be used by Microsoft’s own so-called Superintelligence team, the artificial intelligence experts who build the software giant’s frontier models. This is according to Mustafa Suleyman, the former Google DeepMind co-founder who now leads the team. Microsoft is working on its own models to perhaps one day reduce its reliance on OpenAI, Anthropic, and other model makers.
The Maia 200 chip will also support OpenAI models running on Microsoft’s Azure cloud platform, the company says. But by all accounts, securing access to the most advanced AI hardware is still a challenge for everyone, paying customers and internal teams alike.
So in a post on X, Suleyman was clearly delighted to share the news that his team would get first access. “It’s a big day,” he wrote when the chip was released. “Our Superintelligence team will be the first to use the Maia 200 as we develop our frontier AI models.”
