
d-Matrix Secures $275 Million in Series C Funding to Advance AI Inference Technology

TLDR: d-Matrix, a pioneer in generative AI inference compute for data centers, has successfully closed a $275 million Series C funding round, valuing the company at $2 billion. This capital injection, bringing the total funding to $450 million, will fuel the company’s global expansion, product roadmap, and large-scale deployments of its high-performance, energy-efficient AI inference platform.

SANTA CLARA, Calif. – November 12, 2025 – d-Matrix, a leading innovator in generative AI inference compute solutions for data centers, today announced the completion of its Series C funding round, raising an impressive $275 million. This oversubscribed round elevates the company’s valuation to $2 billion and brings its total funding to $450 million.

The substantial new capital is earmarked to accelerate d-Matrix’s product development roadmap, expand its global footprint, and support the deployment of its advanced data center inference platform for a diverse clientele, including hyperscale, enterprise, and sovereign customers. The funding round saw significant participation from prominent investment firms across Europe, North America, Asia, and the Middle East.

The Series C round was co-led by a global consortium comprising BullhoundCapital, Triatomic Capital, and Temasek. New investors in this round include the Qatar Investment Authority (QIA) and EDBI, with continued support from existing investors such as M12 (Microsoft’s Venture Fund), Nautilus Venture Partners, Industry Ventures, and Mirae Asset.

d-Matrix’s core offering is a full-stack inference platform that integrates groundbreaking compute-memory technology, high-speed networking, and specialized inference-optimized software. This innovative approach is designed to deliver a significant performance leap, boasting 10 times faster performance, 3 times lower cost, and 3–5 times better energy efficiency compared to traditional GPU-based systems.

Key products in the portfolio include Corsair™ inference accelerators, JetStream™ NICs (networking accelerators), and the Aviator™ software stack. These solutions enable the processing of up to 30,000 tokens per second at just 2 milliseconds per token on a Llama 70B model. The compute-dense architecture allows models with up to 100 billion parameters to run within a single rack, directly addressing the escalating challenges of AI sustainability by enabling a single data center to handle a workload that would typically require ten.

Sid Sheth, CEO and co-founder of d-Matrix, emphasized the company’s long-standing vision: “From day one, d-Matrix has been uniquely focused on inference. When we started d-Matrix six years ago, training was seen as AI’s biggest challenge, but we knew that a new set of challenges would be coming soon. We predicted that when trained models needed to run continuously at scale, the infrastructure wouldn’t be ready. We’ve spent the last six years building the solution: a fundamentally new architecture that enables AI to operate everywhere, all the time. This funding validates that vision as the industry enters the Age of AI Inference.”

Investor confidence is underpinned by d-Matrix’s distinctive technology, rapid customer acquisition, and a growing network of global partnerships. This includes the recently unveiled d-Matrix SquadRack™ – an open standards-based reference architecture developed in collaboration with industry leaders Arista, Broadcom, and Supermicro. The company’s robust product roadmap, featuring advancements like 3D memory-stacking innovations, and a customer-centric market strategy further solidify its position as a crucial component of the evolving AI infrastructure.

Per Roman, Founder of BullhoundCapital, commented, “As the AI industry’s focus shifts from training to large-scale inference, the winners will be those who anticipated this transition early and built for it. d-Matrix stands out not only for its technical depth but for its clear strategic vision. The team understood before anyone else that inference would define the economics of AI — and they’re executing brilliantly on that insight.”

Jeff Huber, General Partner at Triatomic Capital, added, “AI inference is becoming the dominant cost in production AI systems, and d-Matrix has cracked the code on delivering both performance and sustainable economics at scale. Their digital in-memory compute architecture is purpose-built for low-latency, high-throughput inference workloads that matter most. With Sid, Sudeep, and their world-class team, plus an exceptional ecosystem of partners, d-Matrix is redefining what’s economically possible in AI infrastructure.”

Michael Stewart, Managing Partner at M12, Microsoft’s Venture Fund, highlighted, “The explosion in AI inference demand shows us that efficiency and scalability can be key contributors to revenue capture and profitability for hyperscalers and AI factories. d-Matrix is the first AI chip startup to address contemporary unit economics in LLM inference for models of a range of sizes that are growing the fastest, with differentiated elements in the in-memory product architecture that will sustain the TCO benefits with leading latency and throughput.”

Morgan Stanley acted as the exclusive placement agent for the funding round, with Wilson Sonsini Goodrich & Rosati providing legal counsel to d-Matrix.

Key Facts about d-Matrix:

Founded: 2019

Headquarters: Santa Clara, CA

Global Offices: Toronto (Canada), Sydney (Australia), Bangalore (India), Belgrade (Serbia)

Founders: Sid Sheth (CEO), Sudeep Bhoja (CTO)

Core Products: Corsair™ inference accelerators, JetStream™ networking accelerators, Aviator™ software stack

Employees: 250+ worldwide

Series C Funding: $275 million

Total Funding to Date: $450 million

Valuation: $2 billion

Nikhil Patel
https://blogs.edgentiq.com
Nikhil Patel is a tech analyst and AI news reporter who brings a practitioner's perspective to every article. With prior experience working at an AI startup, he decodes the business mechanics behind product innovations, funding trends, and partnerships in the GenAI space. Nikhil's insights are sharp, forward-looking, and trusted by insiders and newcomers alike. You can reach him at: [email protected]
