
[News] Microsoft Reportedly Stockpiling AI Chips to Boost Hardware Capability


2024-04-19 | Semiconductors | Editor

According to sources cited by the American news outlet Business Insider, Microsoft plans to double its GPU inventory to 1.8 million units, sourced primarily from NVIDIA. A larger stock of chips on hand would enable Microsoft to launch AI products that are faster, more efficient, and more cost-effective.

The report does not detail specific future applications for these chips, but acquiring them in such quantity means Microsoft can deploy them extensively across its own products, including cloud services and consumer electronics.

The sources cited by the same report further revealed that Microsoft plans to invest USD 100 billion in GPUs and data centers by 2027 to strengthen its existing infrastructure.

Microsoft’s significant stockpiling of AI chips underscores the company’s efforts to maintain a competitive edge in the AI field, where having robust computing power is crucial for innovation.

On the other hand, NVIDIA recently stated that the AI computer it is building in collaboration with Microsoft will run on Microsoft’s Azure cloud platform and will utilize tens of thousands of NVIDIA GPUs, including its H100 and A100 chips.

NVIDIA declined to disclose the contract value of this collaboration. However, industry sources cited by the report estimate that each A100 chip is priced between USD 10,000 and 12,000, while the H100 is priced significantly higher.

Additionally, Microsoft is also in the process of designing its own next-generation AI chip. Not only is Microsoft striving to reduce its reliance on NVIDIA, but other companies, including OpenAI, Tesla, Google, Amazon, and Meta, are also investing in developing their own AI accelerator chips. These companies are expected to compete with NVIDIA’s flagship H100 AI accelerator.


(Photo credit: NVIDIA)

Please note that this article cites information from Business Insider and NVIDIA.
