Microsoft unveiled a pair of custom-designed computing chips on Wednesday, joining other major tech companies that are bringing core technologies in-house to offset the high cost of delivering artificial intelligence (AI) services.
The company said it does not plan to sell the chips but will instead use them to power its subscription software offerings and its Azure cloud computing service.
At the Ignite developer conference in Seattle, Microsoft introduced a new chip named Maia, geared to accelerate AI computing tasks. It serves as the foundation for its $30-a-month (€28) “Copilot” service for business software users and caters to developers seeking to create personalized AI services.
The Maia chip is designed to run large language models, which underpin Microsoft’s Azure OpenAI service and stem from Microsoft’s partnership with OpenAI, the creator of ChatGPT.
Because AI services are significantly more expensive to deliver than conventional services such as search engines, Microsoft aims to centralise its AI efforts around a set of foundational AI models, for which the Maia chip is optimised.
Scott Guthrie, Microsoft’s executive vice president of cloud and AI, remarked, “We think this gives us a way that we can provide better solutions to our customers that are faster and lower cost and higher quality.”
Microsoft plans to offer Azure customers cloud services based on the latest flagship chips from Nvidia and Advanced Micro Devices next year. However, it clarified that the Maia chip is not a direct competitor to Nvidia, focusing instead on enhancing AI services in the cloud until personal devices are powerful enough to handle such tasks.
The second chip, Cobalt, is positioned as an internal cost-saving measure and a response to Amazon Web Services, Microsoft’s primary cloud competitor. It is a central processing unit (CPU) built with technology from Arm Holdings. While Cobalt is currently used to run Teams, Microsoft’s business messaging tool, the company also intends to sell direct access to the chip to compete with Amazon’s “Graviton” series of in-house chips.
“We are designing our Cobalt solution to ensure that we are very competitive both in terms of performance as well as price-to-performance (compared with Amazon’s chips),” Guthrie emphasized.
Although Microsoft provided limited technical details for comparison, the Maia and Cobalt chips are built on 5-nanometre manufacturing technology from Taiwan Semiconductor Manufacturing Co. According to Rani Borkar, corporate vice president for Azure hardware systems and infrastructure, the chips will use standard Ethernet network cabling, a departure from the pricier custom Nvidia networking technology Microsoft used in the supercomputers it built for OpenAI.
“You will see us going a lot more the standardisation route,” Borkar stated.