Microsoft unveiled two custom-designed chips and integrated systems for its Azure hyperscale data centers. The new silicon is the latest step in Microsoft's effort to develop core infrastructure in-house, an effort that spans silicon, accelerator boards, custom server boards, liquid-cooled data center racks, fiber cabling, renewable power facilities, and even subsea cable systems.
Highlights of the infrastructure announcements from the Microsoft Ignite event in Seattle:
- Microsoft Azure Maia, an AI accelerator chip designed to run cloud-based training and inferencing for AI workloads such as OpenAI models, Bing, GitHub Copilot, and ChatGPT. Maia 100, the first generation in the series, packs 105 billion transistors and is built on a 5 nm process.
- Microsoft Azure Cobalt, a cloud-native chip based on the Arm architecture and optimized for the performance, power efficiency, and cost-effectiveness of general-purpose workloads. Cobalt 100, the first chip in the series, is a 64-bit, 128-core processor. It delivers up to a 40 percent performance improvement over the current generation of Azure Arm chips and powers services such as Microsoft Teams and Azure SQL.
- Microsoft Azure Boost, a system that speeds up storage and networking by offloading those processes from the host servers onto purpose-built hardware and software. The company cites remote storage acceleration of up to 12.5 GB/s throughput and 650K IOPS. Azure Boost also introduces Microsoft’s proprietary programmable network interface, MANA (Microsoft Azure Network Adapter), which lets Azure VM customers reach up to 200 Gbps of networking throughput on select VM sizes.
- AMD MI300X accelerated virtual machines (VMs) are now available on Azure. The ND MI300 VMs are designed to accelerate AI workloads for high-end AI model training and generative inferencing, and will feature AMD’s latest GPU, the AMD Instinct MI300X.
- The NVIDIA NC H100 v5 Virtual Machine series, built for NVIDIA H100 Tensor Core GPUs, is in preview, offering greater performance, reliability, and efficiency for mid-range AI training and generative AI inferencing.
- The NVIDIA ND H200 v5 Virtual Machine series is in development: an AI-optimized VM featuring the upcoming NVIDIA H200 Tensor Core GPU.
- Microsoft is manufacturing its own hollow-core fiber, following its December 2022 acquisition of UK-based Lumenisity. Lumenisity was formed in 2017 as a spinoff from the world-renowned Optoelectronics Research Centre (ORC) at the University of Southampton.
- Microsoft now operates 60+ Azure regions and 300+ data centers worldwide.
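As a rough sanity check on the Azure Boost storage figures above (up to 12.5 GB/s throughput and 650K IOPS), the implied average I/O size can be computed if one assumes both peaks are hit simultaneously. This is a back-of-envelope sketch; the helper name and the simultaneity assumption are illustrative, not part of Microsoft's announcement.

```python
def avg_io_size_bytes(throughput_gb_per_s: float, iops: float) -> float:
    """Average I/O size in bytes, assuming (hypothetically) that peak
    throughput and peak IOPS are reached at the same time."""
    return throughput_gb_per_s * 1e9 / iops

# Azure Boost remote-storage figures cited in the announcement.
io_size = avg_io_size_bytes(12.5, 650_000)
print(f"Implied average I/O size: {io_size / 1024:.1f} KiB")  # roughly 18.8 KiB
```

In practice a workload hits the throughput ceiling with large sequential I/O and the IOPS ceiling with small random I/O, so the two limits are rarely reached together; the calculation only shows the crossover I/O size between the two regimes.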
Source: https://news.microsoft.com/source/features/ai/in-house-chips-silicon-to-service-to-meet-ai-demand/