Today, we’re proud to introduce Maia 200, a breakthrough inference accelerator engineered to dramatically improve the ...
Calling it the highest-performing custom cloud accelerator chip, the company says Maia is optimized for AI inference across multiple models.
Microsoft’s new Maia 200 inference accelerator enters this overheated market with a chip that aims to cut the price ...
As artificial intelligence shifts from experimental demos to everyday products, the real pressure point is no longer training ...
Microsoft says the new chip is competitive with in-house solutions from Google and Amazon, but stops short of comparing it to ...
Azure's Scott Guthrie claims the inference-optimized chip is 30% cheaper than any other AI silicon on the market today. Microsoft on ...
Scott Guthrie, the company's Executive Vice President, said that Maia 200 has "30% better performance per dollar" than current-generation technology. This makes it the company's most economical AI ...
Microsoft unveils Maia 200, a custom AI chip designed to power Copilot and Azure, challenging Amazon and Google in the ...