DeepSeek: A Game Changer for Smaller AI Firms
DeepSeek has shaken the U.S.-led AI landscape with the introduction of its latest model, whose release coincided with a substantial drop in the market capitalization of chip giant Nvidia. While larger companies in the sector grapple with the implications, smaller AI firms see an opportunity to grow alongside the new Chinese startup.
Numerous AI-related businesses have expressed to industry observers that DeepSeek's rise presents a "massive" opportunity rather than a challenge. According to Andrew Feldman, the CEO of AI chip startup Cerebras Systems, developers are eager to transition from the expensive and proprietary models of OpenAI to more accessible open-source options like DeepSeek R1.
DeepSeek's R1 model has driven a surge in demand for Cerebras' cloud-based services, suggesting that its offerings are well aligned with current market needs. Feldman emphasized that R1 demonstrates the AI market will not be controlled by a single entity, and that hardware and software barriers do not hold up against open-source models.
The term "open-source" refers to software whose source code is freely available online, allowing anyone to modify and redistribute it. Unlike competitors such as OpenAI, DeepSeek promotes the use of open-source models.
DeepSeek claims that its R1 reasoning model competes with leading American technology, achieving efficiency even without the most advanced graphics processing units. Industry experts, however, continue to debate the validity of this claim.
Feldman noted that just as falling prices spurred growth in the PC and internet markets, decreasing costs will expand the AI sector in the same way.
Advancement in Inference Chips
Industry insiders and chip startups suggest that DeepSeek is likely to accelerate the adoption of innovative chip technologies by enhancing the AI cycle from training to inference. Inference involves using AI to make predictions based on new data, as opposed to developing the model itself.
Phelix Lee, an equity analyst at Morningstar focusing on semiconductors, explained that training AI is about creating algorithms, while inference is about applying them in real-world situations.
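The training-versus-inference distinction Lee describes can be sketched in a few lines of code. The toy model below is purely an illustration of the two phases, not a representation of DeepSeek's or anyone else's architecture: training iteratively adjusts a parameter to fit known examples, while inference applies the frozen, already-trained parameter to new data.

```python
# Toy illustration of training vs. inference with a one-parameter
# linear model y = w * x, fit by gradient descent on squared error.

def train(data, lr=0.01, epochs=200):
    """Training phase: adjust the parameter w to fit known (x, y) examples."""
    w = 0.0
    for _ in range(epochs):
        for x, y in data:
            grad = 2 * (w * x - y) * x  # gradient of (w*x - y)^2 w.r.t. w
            w -= lr * grad
    return w

def infer(w, x):
    """Inference phase: apply the trained parameter to unseen input."""
    return w * x

# Training: learn from examples drawn from y = 2x.
examples = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]
w = train(examples)

# Inference: predict for an input the model never saw during training.
prediction = infer(w, 10.0)
print(round(w, 2), round(prediction, 1))  # w converges near 2.0
```

Note that inference is far cheaper per call than training, which is why the article's sources expect demand to shift toward hardware specialized for that phase.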
Despite Nvidia's dominance in graphics processing units (GPUs) for AI training, many companies see room to grow within the inference segment, where they can deliver high efficiency at lower cost.
Several AI chip startups have indicated to industry observers that they are seeing greater demand for inference chips as clients adopt and expand upon DeepSeek's open-source model. Sid Sheth, CEO of AI chip startup d-Matrix, remarked that DeepSeek has shown that smaller open models can be as effective, if not more so, than larger proprietary models, all at a significantly lower cost.
With the availability of capable smaller models, the industry is entering what Sheth calls the "age of inference." He reported an increase in interest from global customers eager to enhance their inference capabilities.
Robert Wachen, co-founder and COO of AI chipmaker Etched, shared that several businesses have contacted them since DeepSeek released its reasoning models, leading many to shift their budgetary focus from training clusters to inference clusters.
Wachen emphasized that DeepSeek-R1 has demonstrated that inference compute is now a leading approach for all major model vendors, pointing to a growing need for computing power as these models scale to widespread use.
The Implications of Jevons Paradox
Many analysts and industry experts concur that DeepSeek's advancements are beneficial for both AI inference and the broader AI chip sector. A report by Bain & Company stated that DeepSeek's efficacy stems from engineering innovations that lower inference costs and improve training efficiency.
The report referenced the Jevons paradox, the observation that technological advances which lower the cost of using a technology tend to increase total demand for it.
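The arithmetic behind the Jevons paradox is simple to illustrate. The figures below are hypothetical and not drawn from the Bain report: if per-query costs fall tenfold but usage grows faster than costs shrink, total spending on AI compute still rises.

```python
# Hypothetical illustration of the Jevons paradox for AI inference.
# Assumption: a 10x drop in cost per query triggers a 30x rise in usage.

old_cost_per_query = 1.00   # dollars per query (hypothetical)
new_cost_per_query = 0.10   # 10x cheaper after efficiency gains

old_queries = 1_000_000
new_queries = 30_000_000    # demand grows faster than cost falls

old_spend = old_cost_per_query * old_queries
new_spend = new_cost_per_query * new_queries

# Total spend rises even though each individual query is much cheaper.
print(old_spend, new_spend)
```

This is the dynamic the Bain analysis points to: efficiency gains can expand, rather than shrink, the overall market for AI chips.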
Financial services firm Wedbush likewise anticipates broader use of AI across enterprise and retail settings, driving demand higher.
In a recent segment on CNBC's "Fast Money," Sunny Madra, COO at Groq, another company focused on AI inference chips, remarked that as the need for AI grows, smaller firms will have more pathways for expansion. He pointed out that since Nvidia alone cannot supply enough chips to meet global demand, the gap enables smaller companies to enter the market more aggressively.