🔒 FT – Chip challengers try to break Nvidia's grip on AI market

Companies like Cerebras, d-Matrix, and Groq are vying to challenge Nvidia's dominance in the AI chip market. They focus on specialized, cost-effective chips for AI inference, aiming to capitalize on the surging demand for AI technology. Despite significant investments, breaking Nvidia's hold remains challenging for these startups.


By Michael Acton and George Hammond*

Companies such as Cerebras, d-Matrix and Groq are focusing on cheaper, more specialised products

Nvidia's rivals are mobilising in an effort to break the company's stranglehold on the AI chip market, raising hundreds of millions of dollars and rolling out new products as they look to share the spoils of a boom in artificial intelligence technology.

Cerebras, d-Matrix and Groq are among a group of smaller companies aiming to take a slice of the multibillion-dollar AI chip market from Nvidia, which has so far dominated the first wave of investment with its graphics processing units, or GPUs.

They are riding a wave of expectation that demand for artificial intelligence "inference" – the compute power needed for models such as OpenAI's ChatGPT and Google's Gemini to generate responses to queries – will grow exponentially as chatbots and other generative AI applications become more popular.

Nvidia's Hopper GPUs, which are well suited to the highly resource-intensive task of training top AI models, have become one of the world's hottest commodities.


Cerebras, d-Matrix and Groq are focusing instead on cheaper, more specialised chips designed for running AI models.

On Tuesday Cerebras announced its new "Cerebras Inference" platform, based on its CS-3 chip, which is the size of a dinner plate. Citing tests run by benchmarking analysis provider Artificial Analysis, Cerebras claims the platform runs AI inference 20 times faster than Nvidia's current generation of Hopper chips, at a fraction of the price.

"The way you beat the 800lb gorilla is by bringing a vastly better product to market," Cerebras chief executive Andrew Feldman told the Financial Times. "In my experience, better products usually win, and we've taken meaningful customers from [Nvidia]."

The CS-3 chip does away with the separate high-bandwidth memory chips that Nvidia relies on. Instead it offers an alternative architecture with memory built directly into the chip wafer.

Limitations on memory bandwidth, Feldman said, are a fundamental constraint on the inference speed of an AI chip. The combination of logic and memory into a single large chip delivers results that are "orders of magnitude faster", he said.
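To see why bandwidth is the bottleneck, consider a rough back-of-envelope sketch (the figures below are illustrative assumptions, not Nvidia or Cerebras specifications): when a model generates text one token at a time, its weights must be streamed from memory for every token, so memory bandwidth divided by model size caps the tokens per second a single chip can produce.

```python
# Rough sketch of the memory-bandwidth argument for inference speed.
# All numbers are illustrative assumptions, not vendor figures.

def decode_speed_ceiling(params_billion: float,
                         bytes_per_param: float,
                         bandwidth_gb_per_s: float) -> float:
    """Upper bound on single-stream tokens/second: each generated token
    requires reading the full set of model weights from memory once."""
    model_bytes = params_billion * 1e9 * bytes_per_param
    return (bandwidth_gb_per_s * 1e9) / model_bytes

# Hypothetical 70bn-parameter model stored at 2 bytes per weight:
# served from off-chip memory at an assumed 3,000 GB/s ...
print(decode_speed_ceiling(70, 2, 3_000))    # ~21 tokens/s ceiling
# ... versus on-chip memory with an assumed 100x more bandwidth.
print(decode_speed_ceiling(70, 2, 300_000))  # ~2,140 tokens/s ceiling
```

Under these assumed numbers, raising memory bandwidth raises the speed ceiling proportionally, which is the logic behind building memory into the wafer itself.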

d-Matrix, founded by Sid Sheth in 2019, is also kicking off a new funding round less than a year after it raised $110mn in a Series B funding round led by Singapore's state-owned fund Temasek. The company is aiming to raise $200mn or more later this year or early next, according to Sheth. d-Matrix is early in the fundraising process and said the ultimate figure raised could change.

d-Matrix is planning a full-scale launch of its own chip platform, Corsair, at the end of this year. Sheth said the company was pairing its products with open software such as Triton, which competes with Nvidia's Cuda, a widely used software platform that offers the tools for developers to build AI applications and optimises the performance of its chips.

Nvidia's biggest customers are backing the use of open software such as Triton. "App developers don't like to be held to one particular tool," Sheth said, and "people are getting wise that Nvidia has a stranglehold with Cuda on the training side".
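For context, Triton, originally developed at OpenAI, lets developers write accelerator kernels in Python that a compiler, rather than hand-written Cuda, maps onto a particular chip. The sketch below is the standard Triton vector-add tutorial pattern, shown only to illustrate what such hardware-agnostic code looks like; it is not d-Matrix's or Nvidia's own software.

```python
import torch
import triton
import triton.language as tl

@triton.jit
def add_kernel(x_ptr, y_ptr, out_ptr, n_elements, BLOCK_SIZE: tl.constexpr):
    # Each program instance processes one BLOCK_SIZE-wide slice of the vectors.
    pid = tl.program_id(axis=0)
    offsets = pid * BLOCK_SIZE + tl.arange(0, BLOCK_SIZE)
    mask = offsets < n_elements              # guard the final partial block
    x = tl.load(x_ptr + offsets, mask=mask)
    y = tl.load(y_ptr + offsets, mask=mask)
    tl.store(out_ptr + offsets, x + y, mask=mask)

def add(x: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
    out = torch.empty_like(x)
    n = out.numel()
    grid = (triton.cdiv(n, 1024),)           # one program per 1,024 elements
    add_kernel[grid](x, y, out, n, BLOCK_SIZE=1024)
    return out
```

Because the kernel is plain Python compiled by Triton, the same source can in principle target different back ends, which is the lock-in argument the chip challengers are making.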


Groq, another AI inference competitor led by a founding member of Google's tensor processing unit team, raised $640mn this month from investors led by BlackRock Private Equity Partners, at a valuation of $2.8bn.

One venture capitalist cautioned that despite the hype around the sector, semiconductor start-ups had had a challenging time breaking into the market.

Chipmaker Graphcore was bought by SoftBank last month for just above $600mn, less than the roughly $700mn that the company had raised in venture capital since it was founded in 2016, according to people familiar with the deal.

Groq and Cerebras were also founded in 2016. "There has been a near-insatiable desire from public investors to find and back the next Nvidia," said Peter Hébert, co-founder and managing partner at venture firm Lux Capital. "This isn't just about chasing the latest trend. The momentum is also benefiting several VC-funded chip start-ups that have been toiling away for nearly a decade."


© 2024 The Financial Times Ltd.
