d-Matrix Raises $110 Million to Take on NVIDIA in AI Inference Chip Market

Microsoft’s venture group M12 is among the investors betting on d-Matrix’s innovative in-memory compute chips for AI inference workloads.

Silicon Valley startup d-Matrix has raised $110 million in a Series B funding round led by Temasek and Playground Global, with participation from Microsoft’s venture capital arm M12. The new capital will help d-Matrix commercialize its flagship Corsair chip for artificial intelligence inference workloads and take on NVIDIA’s dominance in the AI chip market.

What Exactly Does d-Matrix Do?

d-Matrix designs specialty chips built around what it calls digital in-memory compute (DIMC). These chips are optimized to run inference for large AI models such as large language models. Inference is the process of taking a trained model and using it to generate predictions or outputs; it happens after the computationally intensive training phase.

Inference still requires immense compute power. d-Matrix’s core innovation with Corsair and its other DIMC chips is keeping the model’s weights in memory on the chip itself, rather than shuttling data back and forth from separate memory banks. That makes inference much faster and more power-efficient.
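A rough back-of-the-envelope calculation shows why keeping weights close to the compute matters. For autoregressive LLM inference, every weight must typically be streamed once per generated token, so per-token latency is often bounded by memory bandwidth rather than arithmetic. The sketch below uses purely illustrative numbers (not d-Matrix or NVIDIA specifications) to compare an off-chip-memory bound against a hypothetical aggregate on-chip SRAM bandwidth:

```python
# Toy roofline-style estimate of why LLM inference is often memory-bound.
# All numbers below are illustrative assumptions, not vendor specs.

def time_per_token_s(params_bytes, bandwidth_bytes_per_s):
    """Lower bound on per-token latency when every weight must be
    streamed once per generated token (typical of autoregressive LLMs)."""
    return params_bytes / bandwidth_bytes_per_s

weights = 7e9 * 2       # a 7B-parameter model at 2 bytes per weight (fp16)
offchip_bw = 2e12       # ~2 TB/s, roughly HBM-class off-chip bandwidth
onchip_bw = 100e12      # hypothetical aggregate on-chip SRAM bandwidth

print(f"off-chip bound: {time_per_token_s(weights, offchip_bw) * 1e3:.1f} ms/token")
print(f"on-chip bound:  {time_per_token_s(weights, onchip_bw) * 1e3:.2f} ms/token")
```

Under these assumed numbers, the off-chip bound works out to about 7 ms per token versus roughly 0.14 ms when the weights never leave the chip, which is the intuition behind in-memory compute for inference.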

Why Is the AI Inference Chip Market Heating Up Now?

We’ve seen an explosion of cutting-edge generative AI models like DALL-E 2, GPT-3, and ChatGPT over the past couple of years. However, running these models is extremely expensive because of their massive size and compute requirements; experts estimate training costs alone can run into the millions of dollars.

This poses a challenge for companies that want to deploy these models commercially. Inference chips like d-Matrix’s Corsair could significantly reduce operating costs and enable wider adoption of large language models. According to d-Matrix, DIMC can lower total cost of ownership by more than 60% compared to GPUs.

How Does d-Matrix Fit Into the Competitive AI Chip Industry?

The AI chip sector is dominated by NVIDIA, whose highly profitable GPUs are used for both AI training and inference. Many promising startups have struggled to compete with NVIDIA’s market position and resources.

However, d-Matrix has carved out a distinct niche with its focus on efficient inference. By sidestepping competition with NVIDIA’s training chips, it has a better shot at gaining traction. The $110 million round, and Microsoft’s backing in particular, validates d-Matrix’s differentiated approach.

The funding also comes amid a global chip shortage that has constrained the supply of graphics cards. d-Matrix’s chiplet-based designs could help supplement the available pool of AI compute, and its chips are designed to work with industry-standard hardware and interfaces.

What’s Next for d-Matrix?

The company will use the new capital to build out manufacturing and launch Corsair commercially in 2024. d-Matrix recently unveiled an early-access program for Corsair, and it has already produced earlier generations of its DIMC chips, including Nighthawk, Jayhawk-I, and Jayhawk-II.

With fresh funding and growing market demand, d-Matrix is poised to make a real impact with specialized AI inference hardware. Its success could also spur more innovation in the AI chip sector beyond NVIDIA. Advances like d-Matrix’s will be crucial for making large language models economically viable and for driving the next generation of AI applications.