
OpenAI Signs Multi-Year Compute Deal With Cerebras Valued Above $10 Billion

By Jolyen

Jan 15, 2026


OpenAI has entered a multi-year agreement with AI chipmaker Cerebras that will provide hundreds of megawatts of compute capacity to support faster AI responses, marking one of the largest infrastructure deals tied to real-time AI inference.

Details Of The Compute Agreement

OpenAI announced on Wednesday that it had reached a multi-year agreement with Cerebras. Under the deal, Cerebras will deliver 750 megawatts of compute capacity beginning in 2026 and continuing through 2028, according to the chipmaker.

Focus On Faster AI Responses

Both companies said the agreement is aimed at improving response times for OpenAI’s products. In a blog post, OpenAI said the Cerebras systems will accelerate responses that currently take longer to process.

Andrew Feldman, co-founder and chief executive of Cerebras, said in a statement that real-time inference represents a major shift in how AI systems operate, comparing its impact on AI to the way broadband changed internet usage.

Cerebras Position In The AI Market

Cerebras has operated for more than a decade, but its profile rose sharply after the launch of ChatGPT in 2022 and the broader surge in demand for AI infrastructure. The company says its systems, which are built around chips designed specifically for AI workloads, outperform traditional GPU-based systems, including those offered by Nvidia.

The chipmaker filed for an initial public offering in 2024 but has delayed the process multiple times. During that period, Cerebras has continued to raise capital. On Tuesday, reports said the company was in discussions to raise an additional $1 billion at a valuation of $22 billion.

Existing Ties Between The Companies

OpenAI chief executive Sam Altman is already an investor in Cerebras, and OpenAI previously considered acquiring the company, according to people familiar with earlier discussions.

In OpenAI’s announcement, Sachin Katti said the company’s compute strategy centers on matching specific systems to specific workloads. He said Cerebras adds a low latency inference option that supports faster responses and more natural interactions as OpenAI scales real time AI services.


Featured image credits: Wikimedia Commons


Jolyen

As a news editor, I bring stories to life through clear, impactful, and authentic writing. I believe every brand has something worth sharing. My job is to make sure it’s heard. With an eye for detail and a heart for storytelling, I shape messages that truly connect.
