Microsoft unveils the Maia 200, its 2nd-generation AI accelerator built on TSMC's 3nm process, deploying today in its Azure US Central data center region
Microsoft is announcing a successor to its first in-house AI chip today, the Maia 200, which is starting to roll out to Microsoft's data centers now.
The Verge · Tom Warren
Related Coverage
- Microsoft unveils Maia 200 AI chip, claiming performance edge over Amazon and Google GeekWire · Todd Bishop
- Microsoft rolls out next generation of its AI chips, takes aim at Nvidia's software Reuters · Stephen Nellis
- Maia 200: The AI accelerator built for inference Microsoft · Scott Guthrie
- Microsoft says its newest AI chip Maia 200 is 3 times more powerful than Google's TPU and Amazon's Trainium processor LiveScience · Roland Moore-Colyer
- Microsoft announces powerful new chip for AI inference TechCrunch · Lucas Ropek
- Microsoft Unveils Latest AI Chip to Reduce Reliance on Nvidia Bloomberg · Matt Day
- Microsoft reveals second generation of its AI chip in effort to bolster cloud business CNBC · Jordan Novet
- Microsoft unveils Maia 200, its ‘powerhouse’ accelerator looking to unlock the power of large-scale AI TechRadar · Mike Moore
- Microsoft's Maia 200 targets cheaper AI inference The Deep View · Nat Rubio-Licht
- Microsoft unveils Maia 200 accelerator, claiming better performance per dollar than Amazon and Google ITPro · Rory Bathgate
- Microsoft introduces AI accelerator for US Azure customers ComputerWeekly.com · Cliff Saran
- Microsoft launches Maia 200 as custom AI silicon accelerates Constellation Research · Larry Dignan
- Microsoft Unveils Second‑Generation AI Chip to Strengthen Cloud Capabilities Blockonomi · Maxwell Mutuma
- Microsoft takes aim at Google, Amazon, and Nvidia with new AI chip Yahoo Finance · Daniel Howley
- Microsoft Maia 200: The AI Inference Game Changer TwoBitDaVinci on YouTube · TwoBitDaVinci
- Microsoft rolls out Maia 200 AI chip to reduce reliance on Nvidia Crypto Briefing · Estefano Gomez
- Microsoft just announced the Maia 200 AI chip, the first purpose-built chip powering Microsoft's AI future. Stay tuned to hear more about how we're advancing AI infrastructure at scale. Microsoft Azure
- Great momentum in advancing Azure AI infrastructure. — We built Maia 200 to expand our heterogeneous AI infrastructure … Arun Ulag
- Meet Maia 200, Microsoft's next‑generation AI accelerator that helps you run AI workloads 3× faster and get ~30% better performance‑per‑dollar compared to today's systems. … Jessica Hawk
- Microsoft Escalates AI Arms Race With Next Gen Chips Benzinga · Anusuya Lahiri
- Microsoft Announces its Maia 200 AI Accelerator for Datacenters Thurrott · Paul Thurrott
- Microsoft Unveils A New AI Inference Accelerator Chip, Maia 200 Forbes · Ron Schmelzer
- Microsoft's Maia 200 AI chip claims performance lead over Amazon and Google The Decoder · Matthias Bastian
- Microsoft announces Maia 200, its 2nd-gen AI accelerator for cost-efficient inference Neowin · Pradeep Viswanathan
- The infrastructure behind AI matters most when it disappears from view. — That's what makes progress like Microsoft's Maia 200, so important. … Yusuf Mehdi
- Microsoft's new AI silicon is here — is this what will stop OpenAI's $14 billion bonfire? Windows Central · Sean Endicott
- Microsoft Introduces Maia 200, Its Most Powerful AI Chip Yet eWeek · Aminu Abdullahi
- Microsoft Unveils Maia 200 In-House Inference Chip Data Center Knowledge · Shane Snider
- Microsoft's New Maia 200 Chip Steps Up to Make AI Responses Cheaper and Faster TechEBlog · Jackson Chung
- Maia 200 Architecture Overview Microsoft Tech Community · Sdighe
- Microsoft's Maia 200 AI chip launch leaves Nvidia stock little changed Investing.com
- Microsoft Unveils Maia 200, A New Inference Accelerator To Cut AI Token Costs Pulse 2.0 · Amit Chowdhry
- Microsoft unveils Maia 200 AI chip to challenge Nvidia dominance Reuters
- Microsoft Unveils Maia 200 AI Accelerators To Boost Cloud AI Independence HotHardware · Zak Killian
- Microsoft Takes On AWS, Google And Nvidia With Maia 200 AI Chip Launch CRN · Dylan Martin
- Microsoft rolls out next generation of its AI chips, takes aim at Nvidia's software Channel NewsAsia
Discussion
-
@satyanadella
Satya Nadella
on x
Our newest AI accelerator Maia 200 is now online in Azure. Designed for industry-leading inference efficiency, it delivers 30% better performance per dollar than current systems. And with 10+ PFLOPS FP4 throughput, ~5 PFLOPS FP8, and 216GB HBM3e with 7TB/s of memory bandwidth [vi…
-
@mustafasuleyman
Mustafa Suleyman
on x
Our Maia 200 inference chip, announced today, is the most performant first-party silicon of any hyperscaler: 3× the FP4 performance of Amazon's Trainium v3, and FP8 performance above Google's TPUv7. [image]
-
@azure
@azure
on x
Introducing Maia 200: our next-generation AI accelerator delivering 30% better performance per dollar. Purpose-built for inference at the silicon level.
-
@mustafasuleyman
Mustafa Suleyman
on x
It's a big day. Our Superintelligence team will be the first to use Maia 200 as we develop our frontier AI models.
-
@scottgu
Scott Guthrie
on x
Maia 200 is an AI inference powerhouse: our most performant first-party silicon of any hyperscaler, delivering 30% better performance per dollar than the latest hardware in our fleet. Built for efficient large-scale inference and integrated into Azure.
-
@patrickmoorhead
Patrick Moorhead
on x
“FP4 throughput is now Blackwell-class territory. Microsoft quotes 10+ petaFLOPS FP4 per chip, which puts it in the same conversation as NVIDIA B200 generation inference compute.” $MSFT
-
@ryanshrout
Ryan Shrout
on x
Microsoft just announced the deployment of Maia 200, its next generation custom silicon for AI. Inference economics are still clearly a silicon feature, not just a software problem. @Signal_65 has been working with Microsoft on Maia 200, and we will have more to share in the
-
@stocksavvyshay
Shay Boloor
on x
$MSFT unveiled its Maia 200 AI inference chip built on $TSM 3nm process with deployments starting this week in its U.S. Central data center region. This marks another step toward Microsoft owning more of the AI inference stack end-to-end. [image]
-
@tomwarren.co.uk
Tom Warren
on bluesky
Microsoft is announcing its own Maia 200 AI chip today. It goes head-to-head in performance against Google and Amazon's AI chips, and Microsoft is using Maia 200 to host GPT-5.2 and others for Microsoft Foundry and Microsoft 365 Copilot. Details here 👇 www.theverge.com/news/867…
-
@benbajarin
Ben Bajarin
on x
Strategically optimized for tokens per dollar per watt on specific workloads. Inference-focused, as the trend will be in custom silicon variants going forward. Interestingly, they point out that Maia 200 is already powering GPT-5.2. https://blogs.microsoft.com/ ...
-
@benitoz
Ben Pouladian
on x
Microsoft's Maia 200 is real: 10 PF FP4, 216GB HBM3e, 7 TB/s. But NVIDIA's Vera Rubin (H2 2026): 50 PF FP4, 288GB HBM4, 13 TB/s. Hyperscalers build for today's inference costs. NVIDIA builds for tomorrow's ceiling. $NVDA $msft [image]
-
@jamesaltonsanders.com
James Sanders
on bluesky
The press photos of this chip appear to be a mockup, not a genuine sample of the Maia 200. — It could be very close to the real thing, but it's not typical to use physical mockups for press. — Renders? Yes. Real chips? Yes. — Mockups? Not often. [embedded post]
-
r/microsoft
r/microsoft
on reddit
Maia 200: The AI accelerator built for inference - The Official Microsoft Blog
-
@theaustinlyons
Austin Lyons
on x
“Right systems for right workloads.” Maia 200 is a neat example of thoughtful design decisions for inference at scale. I chatted with the Maia team, and they talked about working backward from customer workloads to arrive at an inference chip that deliberately isn't a GPU.
-
@unusual_whales
@unusual_whales
on x
BREAKING: Microsoft, $MSFT unveils its second generation AI chip, the MAIA 200 AI inference chip, built on TSMC's 3nm process, per Bloomberg.
-
@kobeissiletter
@kobeissiletter
on x
BREAKING: Microsoft, $MSFT, announces the launch of its Maia 200 AI chip to “reduce reliance on Nvidia.” The chip is being produced by Taiwan Semiconductor Manufacturing Co., $TSM, and is being launched in Microsoft data centers in Iowa.