Microsoft’s new Maia 200 inference accelerator enters this overheated market with a chip that aims to cut the price ...
The Maia 200 AI chip is described as an inference powerhouse, meaning it is built for the stage where trained AI models apply their knowledge to ...
Microsoft unveils Maia 200 AI inference chip using TSMC 3nm, claiming higher FP4/FP8 performance and 30% better $/perf vs ...
Today, we’re proud to introduce Maia 200, a breakthrough inference accelerator engineered to dramatically improve the ...
Microsoft today announced the Maia 200, its second-generation AI accelerator for datacenters, optimized for ...
Microsoft is announcing a successor to its first in-house AI chip today: the Maia 200. Microsoft says the accelerator, built on TSMC’s 3nm process, “delivers 3 times the FP4 performance of ...
Microsoft has announced Maia 200, a second-generation 3nm AI accelerator designed for inference, boasting significant ...
Microsoft Azure's AI inference accelerator Maia 200 aims to outperform Google TPU v7 and AWS Inferentia with 10 petaflops of FP4 compute power.
Microsoft has launched its Maia 200 AI chip, aiming to rival Nvidia, Amazon, and Google. This new chip, built on TSMC's ...
Microsoft has announced that Azure’s Central US datacentre region is the first to receive its new artificial intelligence (AI) inference accelerator, the Maia 200.
With over 100 billion transistors, Maia 200 offers "powerhouse" AI inferencing possibilities, Microsoft says.
Microsoft says the new chip is competitive with in-house silicon from Google and Amazon, but stops short of comparing it to ...
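The recurring figures in these excerpts are FP4 throughput and price-performance. As a rough note on the arithmetic: "30% better $/perf" implies about 1/1.3 ≈ 0.77x the cost per unit of inference throughput, and FP4 doubles arithmetic density over FP8 simply because each operand is half as wide. The sketch below is illustrative only, not Microsoft code and not the Maia 200's actual number handling; it fake-quantizes weights to the 16-value FP4 (E2M1) grid from the OCP microscaling spec to show why 4-bit inference is viable: well-scaled weights land close to representable values.

```python
# Illustrative sketch (assumption: generic FP4 E2M1, not Maia-specific).
# FP4 can represent only 16 values; weights are snapped to the nearest
# representable value after scaling. Halving the bits versus FP8 roughly
# doubles arithmetic throughput and halves memory traffic, which is where
# headline figures like "3x FP4 performance" come from on dedicated hardware.
import numpy as np

# The 16 values representable in FP4 E2M1 (per the OCP microscaling spec).
FP4_GRID = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 3.0, 4.0, 6.0])
FP4_GRID = np.concatenate([-FP4_GRID[::-1], FP4_GRID])

def quantize_fp4(w: np.ndarray) -> np.ndarray:
    """Fake-quantize a tensor to the FP4 grid using one per-tensor scale."""
    scale = np.abs(w).max() / 6.0          # map the largest weight to +/-6
    idx = np.abs(w[..., None] / scale - FP4_GRID).argmin(axis=-1)
    return FP4_GRID[idx] * scale           # dequantized values for comparison

rng = np.random.default_rng(0)
w = rng.normal(size=(256, 256)).astype(np.float32)
w_q = quantize_fp4(w)
print(f"mean abs quantization error: {np.abs(w - w_q).mean():.4f}")
```

In practice, accelerators pair a grid like this with per-block scale factors (e.g., MXFP4) rather than a single per-tensor scale, which keeps the error small across a tensor's full dynamic range.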