At the Microsoft Ignite 2023 event, Microsoft announced the addition of two custom-designed chips to its Azure cloud service platform: the Azure Maia AI accelerator, optimized for AI workloads and generative AI applications, and the Azure Cobalt CPU, an Arm-based design targeted at general cloud computing needs. Both are built on TSMC's 5nm process.

Microsoft is collaborating with OpenAI, which will provide optimization feedback on how the Azure Maia AI accelerator runs its workloads, helping shorten AI model training times and accelerate the deployment of AI applications. The Arm-based Azure Cobalt CPU, meanwhile, is expected to further reduce power consumption while significantly improving server performance.
By introducing custom-designed chips into its Azure cloud services, Microsoft expects those services to operate more efficiently and to consume significantly less power overall. The company also highlighted a dedicated rack design for the Azure Maia AI accelerator, paired with a dedicated liquid cooling system to dissipate the heat generated during operation more efficiently.

To further boost the performance of its cloud services, Microsoft also announced the NC H100 v5 virtual machine series, built around the NVIDIA H100 Tensor Core GPU and now available in preview, allowing users to more easily obtain computing performance and application flexibility through the Azure cloud service platform.
In addition, Microsoft stated that it will add the NVIDIA H200 Tensor Core GPU as a new option, delivering higher computing performance and shorter response times while supporting the computing needs of large-scale natural language models, so that more businesses can access greater computing resources through the Azure service platform.
Microsoft also announced a partnership with AMD to bring the Instinct MI300X GPU, built on the CDNA 3 acceleration architecture and equipped with up to 192GB of HBM3 memory, to the Azure cloud service platform. Offered through virtual machines, it is intended to meet demands for faster AI computation and larger-scale AI model training and inference, giving users access to greater computing power through the cloud at an affordable price.