DMR News


Microsoft Wants to Mainly Use Its Own AI Data Center Chips in the Future

By Hilary Ong

Oct 5, 2025


Microsoft wants to rely mainly on its own chips in its data centers in the future, a move that could reduce its dependence on major suppliers like Nvidia and AMD. Kevin Scott, Microsoft’s chief technology officer, laid out the company’s AI chip strategy during a fireside chat at Italian Tech Week. Scott said that while the company is not “religious” about where its chips come from, it has for years used Nvidia and AMD silicon because they offered the “best price performance.” However, when asked whether the longer-term plan is to run mainly Microsoft chips in the firm’s own data centers, Scott responded, “Absolutely,” adding that the company is already using “lots of Microsoft” silicon today.

The Strategy Behind Custom Chips

Microsoft has been designing its own custom data center chips to address a massive shortage of computing capacity. In 2023, the company launched the Azure Maia AI Accelerator, designed for AI workloads, as well as the Cobalt CPU, a chip tailored for general-purpose computing. It is reportedly working on its next generation of semiconductor products. Last week, the company also unveiled new cooling technology that uses “microfluids” to tackle overheating chips. Scott said the focus is on designing an entire system, including networking and cooling, to have the “freedom to make the decisions that you need to make in order to really optimize your compute to the workload.” Like its rivals Google and Amazon, Microsoft is designing its own chips not only to reduce its reliance on Nvidia and AMD but also to make its hardware more efficient for its specific requirements.

A Global Race for AI Infrastructure

Scott flagged that there is still a massive shortage of computing capacity across the industry. “A massive crunch [in compute] is probably an understatement,” he said. “I think we have been in a mode where it’s been almost impossible to build capacity fast enough since ChatGPT … launched.” Microsoft and its rivals, including Meta and Alphabet, have committed to more than $300 billion in capital expenditures this year, with a large portion focused on AI investments. Despite deploying an incredible amount of capacity over the past year, Scott warned that demand is still outpacing supply.

Author’s Opinion

Microsoft’s ambition to move toward its own in-house chips signals a new and crucial phase in the AI race. This is a strategic move for a company that has, for years, relied on others for its most critical components. By vertically integrating and designing its own chips, Microsoft can tailor the hardware specifically for its AI models, potentially achieving efficiency gains that are impossible with off-the-shelf components. This is a bold bet that, if successful, could not only reduce its reliance on key suppliers but also give it a significant competitive advantage over its rivals. The future of AI is not just in the software, but in the silicon that powers it, and Microsoft seems determined to own that future.


Featured image credit: Wikimedia Commons


Hilary Ong

Hello, from one tech geek to another. Not your beloved TechCrunch writer, but a writer with an avid interest in the fast-paced tech scene and all the latest tech mojo. I bring a unique take on tech, with a honed applied-psychology perspective that makes tech news digestible. In other words, I deliver tech news that is easy to read.
