d-Matrix Raises $275 Million to Power the Age of AI Inference
2025-11-12 Time: 19:01 Source: PR Newswire

Series C led by global consortium values company at $2 billion, accelerates product and customer expansion as demand grows for faster, more efficient data center inference

SANTA CLARA, Calif., Nov. 12, 2025 /PRNewswire/ -- d-Matrix, the pioneer in generative AI inference compute for data centers, has closed $275 million in Series C funding, valuing the company at $2 billion and bringing the total raised to date to $450 million. The new capital will advance the company's roadmap, accelerate global expansion, and support multiple large-scale deployments of the world's highest-performing, most efficient data center inference platform for hyperscale, enterprise, and sovereign customers.


The oversubscribed round attracted leading investment firms across Europe, North America, Asia, and the Middle East. The funding is co-led by a global consortium including BullhoundCapital, Triatomic Capital, and Temasek. The round also includes new participation from the Qatar Investment Authority (QIA) and EDBI, alongside follow-on participation from M12, Microsoft's Venture Fund, as well as Nautilus Venture Partners, Industry Ventures, and Mirae Asset.

d-Matrix's full-stack inference platform combines breakthrough compute-memory integration, high-speed networking, and inference-optimized software to deliver 10× faster performance, 3× lower cost, and 3–5× better energy efficiency than GPU-based systems. Solutions powered by d-Matrix's Corsair™ inference accelerators, JetStream™ NICs and Aviator™ software can produce up to 30K tokens per second at 2ms per token on a Llama 70B model. The platform's compute-dense design allows customers to run up to 100B-parameter models incredibly fast in a single rack.

This step-change in performance and efficiency directly addresses growing AI sustainability challenges. By enabling one data center to handle the workload of ten, d-Matrix offers a clear path to reducing global data center energy consumption while enabling enterprises to deliver cost-efficient, profitable AI services without compromise.

"From day one, d-Matrix has been uniquely focused on inference. When we started d-Matrix six years ago, training was seen as AI's biggest challenge, but we knew that a new set of challenges would be coming soon," said Sid Sheth, CEO and co-founder of d-Matrix. "We predicted that when trained models needed to run continuously at scale, the infrastructure wouldn't be ready. We've spent the last six years building the solution: a fundamentally new architecture that enables AI to operate everywhere, all the time. This funding validates that vision as the industry enters the Age of AI Inference."

Investor confidence reflects d-Matrix's differentiated technology, rapid customer growth, and expanding network of global partners — including the recently announced d-Matrix SquadRack™ open standards-based reference architecture with Arista, Broadcom, and Supermicro. A strong product roadmap featuring 3D memory-stacking innovations and a customer-centric go-to-market strategy further establishes d-Matrix as a cornerstone of the new AI infrastructure stack.

Investor Voices

"As the AI industry's focus shifts from training to large-scale inference, the winners will be those who anticipated this transition early and built for it," said Per Roman, Founder of BullhoundCapital. "d-Matrix stands out not only for its technical depth but for its clear strategic vision. The team understood before anyone else that inference would define the economics of AI — and they're executing brilliantly on that insight."

"AI inference is becoming the dominant cost in production AI systems, and d-Matrix has cracked the code on delivering both performance and sustainable economics at scale," said Jeff Huber, General Partner at Triatomic Capital. "Their digital in-memory compute architecture is purpose-built for low-latency, high-throughput inference workloads that matter most. With Sid, Sudeep, and their world-class team, plus an exceptional ecosystem of partners, d-Matrix is redefining what's economically possible in AI infrastructure."

"The explosion in AI inference demand shows us that efficiency and scalability can be key contributors to revenue capture and profitability for hyperscalers and AI factories," said Michael Stewart, Managing Partner at M12, Microsoft's Venture Fund. "d-Matrix is the first AI chip startup to address contemporary unit economics in LLM inference for the fastest-growing range of model sizes, with differentiated elements in its in-memory product architecture that will sustain TCO benefits alongside leading latency and throughput."

Morgan Stanley served as the exclusive placement agent, and Wilson Sonsini Goodrich & Rosati served as legal counsel to d-Matrix.

Key Facts

  • Founded: 2019 | HQ: Santa Clara, CA
  • Global Offices: Toronto (Canada); Sydney (Australia); Bangalore (India); Belgrade (Serbia)
  • Founders: Sid Sheth (CEO), Sudeep Bhoja (CTO)
  • Core Products: Corsair inference accelerators, JetStream networking accelerators, Aviator software stack
  • Employees: 250+ worldwide
  • Series C Funding: $275 million | Total Funding: $450 million | Valuation: $2B

About d-Matrix

d-Matrix is pioneering accelerated computing for AI inference, breaking through the limits of latency, cost, and energy. Its Corsair accelerators, JetStream networking, and Aviator software deliver fast, sustainable AI inference at data center scale.

The terms d-Matrix, JetStream, Corsair and Aviator are trademarks and/or registered trademarks of d-Matrix, Inc. in the U.S. and other countries. All rights reserved.

View original content: https://www.prnewswire.com/apac/news-releases/d-matrix-raises-275-million-to-power-the-age-of-ai-inference-302612637.html

SOURCE d-Matrix