Microsoft launches two data center infrastructure chips to speed AI applications

Microsoft unveiled two new infrastructure chips aimed at enhancing AI operations and data security during its Ignite conference. These custom chips reduce reliance on Intel and Nvidia, optimizing data center performance and energy efficiency.


Microsoft has designed two additional infrastructure chips for its data centres that will help speed artificial intelligence operations and increase data security, it said on Tuesday at its Ignite conference. Microsoft has devoted significant resources to developing home-grown silicon for general-purpose applications and artificial intelligence. Like rivals Amazon.com and Google, Microsoft's engineers say there is a performance and price benefit to designing chips customized for the company's needs. Designing custom chips can also reduce Microsoft's reliance on processors made by Intel and Nvidia. The two new chips are designed to be installed deep within the company's data centre infrastructure.

One chip is designed to increase security and the other is for data processing. The company designs an array of data centre processors because it aims to "optimize every layer of infrastructure" and ensure that Microsoft's data centres crunch information at the speed AI requires, said Rani Borkar, corporate vice president, Azure Hardware Systems and Infrastructure. Engineers will install the new security chip, called the Azure Integrated HSM, in every new server destined for a data centre beginning next year. The chip aims to keep crucial encryption and other security data inside the security module.

The data processing unit, or DPU, consolidates what would otherwise be multiple components of a server into a single chip focused on cloud storage data. The company said the chip can run those specific tasks at three times less power and four times the performance of its current hardware. Microsoft also announced a new version of a cooling system for data centre servers that relies on liquid to reduce the temperature of nearby components.

The cooling unit can be used to support large-scale AI systems.