Microsoft has officially launched two in-house artificial intelligence chips. At the Ignite conference on Wednesday, U.S. local time, the company said the chip called Maia 100 is the first in the Azure Maia AI accelerator series. Alongside the Maia 100, the technology giant also launched its first custom Arm-based CPU for general-purpose cloud computing, the Azure Cobalt.
With these two chips, Microsoft catches up with its competitors Google and Amazon, which have likewise developed custom chips to run their own cloud platforms. Microsoft said its in-house chips will be used for cloud-based training and inference of artificial intelligence models. Training is the process by which a company builds an AI model; inference is the process of running the trained model in practical use.
“Software is our core advantage, but frankly, we are a systems company,” Rani Borkar, Vice President of Azure Hardware Systems and Infrastructure at Microsoft, said in a statement. “At Microsoft, we co-design and optimize hardware and software so that one plus one is greater than two. We can see the entire stack, and the chip is just one of its components.”
Microsoft said the services powered by its custom chips will deliver “powerful performance and efficiency benefits.”
In other words, Microsoft software should run better on Microsoft-designed chips, because the company can tune its software and hardware together for better performance.
This is the same logic behind Nvidia offering its own AI software alongside its AI chips, or Apple designing its own chips for the iPhone and Mac: a company that controls both hardware and software can deliver a better user experience.
Microsoft has stated that it is collaborating with ChatGPT developer OpenAI to test its Maia 100 chip and will use this experience to build future chips.
OpenAI CEO Sam Altman said in a statement: “When Microsoft first shared the design of its Maia chip, we were very excited, and we worked together to refine and test it with our models. Azure’s end-to-end AI architecture, now optimized down to the chip with Maia, paves the way for training more capable models and making those models cheaper for our customers.”
Microsoft said that in addition to designing the chips, it also built the server motherboards that carry them and the server architecture they sit in.
As for the Azure Cobalt chip, Microsoft said its performance is 40% better than that of the current generation of Arm-based Azure chips.
Scott Guthrie, Executive Vice President of Microsoft Cloud and Artificial Intelligence, said: “At the scale we operate, it is crucial to optimize and integrate every layer of the infrastructure stack to maximize performance, diversify our supply chain, and give customers a choice of infrastructure.”
Meanwhile, the launch of Microsoft’s own chips does not necessarily mean abandoning its partnerships with Nvidia or AMD. Microsoft still offers cloud computing capacity running on Nvidia’s H100 chip and has added access to the H200 chip. Microsoft said it will also begin offering access to AMD’s MI300 chip next year.