Original|Odaily Planet Daily(@OdailyChina)
Author|Wenser(@wenser2010)
Last night, Cerebras (CBRS), dubbed the 'next Nvidia,' officially began trading. Shortly after opening at its $185 issue price, the stock surged to $350 and peaked intraday at $385, a gain of over 108%. It has since retraced to around $311 but still holds a gain of over 68%. Cerebras CEO Andrew Feldman had previously told CNBC: "Our chip is the size of a dinner plate and is 20 times faster than Nvidia's chips."
What gives this chipmaker, which raised $5.5 billion, the confidence to claim it is 'faster than Nvidia's chips'? How did it secure a $20 billion order from OpenAI amid fierce competition? And will its stock keep climbing in the short term? Odaily Planet Daily offers its own answers to these questions below.
Cerebras' Basis for Challenging Nvidia: Opening a New World of AI with Wafer-Scale Chips
As the AI compute gap continues to widen, robust market demand has propelled Nvidia to become the world's most valuable listed company.
Nvidia's stock price recently hit new highs, with its market capitalization at one point exceeding $5.5 trillion. At that scale, it is an economic entity second only to the GDPs of the US and China, far surpassing major economies like Germany and Japan; 'richer than many countries' is no exaggeration.
However, unlike the decades-old 'veteran champion' Nvidia, Cerebras (CBRS) is a newcomer in chip manufacturing.
In 2016, Andrew Feldman, Gary Lauterbach, Sean Lie, Michael James, JP Fricker, and other semiconductor industry veterans co-founded Cerebras Systems, headquartered in Sunnyvale, California. Unlike Nvidia's focus on building general-purpose GPUs to maximize market demand, Cerebras' core innovation is the Wafer Scale Engine (WSE), currently the world's largest AI chip.
The founding team of Cerebras (2022)
Its core products include:
- WSE-3: Area approximately 46,225 mm² (equivalent to a dinner plate size), containing 4 trillion transistors, 900,000 AI-optimized cores, delivering 125 petaflops of computing power. Compared to traditional GPUs, it turns an entire wafer into a single giant processor, avoiding bottlenecks from multi-GPU interconnects, with on-chip SRAM as high as 44GB and extremely high memory bandwidth.
- CS-3 System: an AI supercomputer built around the WSE-3 that supports both training and inference. Beyond selling hardware, Cerebras also offers cloud services (Cerebras Inference), dedicated data centers, and on-premises deployment support.
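The headline spec figures above can be sanity-checked with some quick arithmetic. A minimal sketch, using the WSE-3 numbers from the text; the ~814 mm² GPU die area is an assumption standing in for a reticle-limit-class data-center GPU die, not a figure from the article:

```python
# Back-of-the-envelope check of the scale claims in this article.
# WSE-3 figures are from the text; GPU_DIE_AREA_MM2 is an assumption.
WSE3_AREA_MM2 = 46_225
WSE3_TRANSISTORS = 4e12
GPU_DIE_AREA_MM2 = 814  # assumed large data-center GPU die

# How many GPU-sized dies would fit in one WSE-3's area?
area_ratio = WSE3_AREA_MM2 / GPU_DIE_AREA_MM2

# Transistor density implied by the stated specs.
density_per_mm2 = WSE3_TRANSISTORS / WSE3_AREA_MM2

print(f"area ratio vs one GPU die: {area_ratio:.0f}x")
print(f"transistor density: {density_per_mm2 / 1e6:.0f}M per mm^2")
```

The ~57x area ratio that falls out of this is consistent with the "58 times larger than competitors" figure Feldman cites later in the article.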
In terms of business model, Cerebras primarily provides ultra-low-latency inference to customers such as OpenAI, Meta, Perplexity, Mistral, GSK, and the Mayo Clinic. In 2025, Cerebras generated $510 million in annual revenue (up 76% year over year), is already profitable, and has a large order backlog (including a multi-year, hundreds-of-megawatts compute contract with OpenAI).
Illustration of Cerebras WSE-3 Chip
On May 14th, IPO day, Cerebras CEO Andrew Feldman responded positively on CNBC's 'Squawk Box' regarding the company's operational status, technological moat, and future market direction:
- First, Feldman stated that the IPO was "the right way to fund our growth," the company is mature, and public markets can support huge growth opportunities. He emphasized this was the result of a decade of effort, expressed great pride, and said the market "understood our story and responded positively."
- Second, he repeatedly stressed that Cerebras is the only company in 70 years to have successfully built a 'giant chip'; all other attempts have failed, so "the technical moat is wide and deep." It was here that he noted Cerebras' chips are 58 times larger than competing chips such as Nvidia's and run 15-20 times faster, significantly accelerating AI inference and training.
- Finally, addressing market concerns about the sustainability of AI spending, Feldman stated the demand is "massive and growing." The company's chips qualitatively change the AI experience (faster response, real-time agents, etc.). He mentioned important collaborations with OpenAI, AWS, etc., and expressed optimism about the overall AI hardware environment.
On a side note, similar to Musk's earlier bet with Anthropic on 'space data centers' (recommended reading: 'Musk and Anthropic Are Going to Space for Power'), Feldman also boldly predicted, "Within 15 years, data centers in space are highly likely to become a reality," showcasing his confidence in the long-term construction and rapid expansion of AI infrastructure.
Thus, as a 'speed demon' of the AI chip field, Cerebras has broken through by focusing on extreme performance for very large models, emerging as a serious challenger to Nvidia in large-model inference and very-large-scale training.
In this regard, OpenAI's $20 billion order provides ample confidence for its development, and the collaboration between the two goes far beyond the simple relationship of 'chip manufacturer' and 'chip buyer.'
The Complex Relationship Between Cerebras and OpenAI: Customer, Creditor, and Potential Major Shareholder
The ties between Cerebras and OpenAI go back a long way. Beyond company-level collaboration, OpenAI CEO Sam Altman, co-founder Greg Brockman, and others were early angel investors in Cerebras and hold small stakes. This is perhaps a key reason for the deep, multi-faceted relationship between the two companies today.
In December 2025, OpenAI provided Cerebras with a $1 billion Working Capital Loan, establishing a creditor-debtor relationship between them.
In January of this year, the '750MW inference computing power procurement agreement' between Cerebras and OpenAI was officially announced, along with an option to expand it to 2GW; the deal was confirmed again in April. According to media reports, OpenAI plans to spend over $20 billion in the next three years on servers powered by Cerebras chips and will acquire equity in the company as part of the deal. OpenAI thus became Cerebras' largest customer, bar none.
Image Source: @Xingpt
Subsequent S-1 filings and IPO application documents from Cerebras indicate that OpenAI is expected to obtain approximately 33.44 million Cerebras warrants at an extremely low exercise price of $0.00001 per share. Some warrants have vesting conditions, including compute power delivery dates and milestone requirements such as Cerebras' market cap exceeding $40 billion.
If all warrants are exercised and their conditions met, OpenAI could acquire roughly 10%-11% of the company (the exact percentage depends on the post-IPO share count). At the IPO valuation of around $56 billion, that stake would be worth approximately $5-6 billion; at the current market cap (close to $95 billion after the first day of trading), it is worth over $10.3 billion. Although the warrants have not yet been fully exercised, calling OpenAI a 'potential major shareholder of Cerebras' is no exaggeration.
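The warrant arithmetic above is easy to reproduce. A minimal sketch using only figures from the article; since the exact post-IPO share count is not given, the equity percentage is the article's reported 10-11% range rather than a computed value:

```python
# Approximate value of OpenAI's Cerebras warrant position at
# different valuations, per the figures reported in this article.
WARRANTS = 33_440_000
EXERCISE_PRICE = 0.00001  # USD per share, per the S-1 figure cited

def warrant_value(market_cap_usd: float, equity_pct: float) -> float:
    """Stake value: share of market cap minus the (tiny) exercise cost."""
    exercise_cost = WARRANTS * EXERCISE_PRICE  # about $334 in total
    return market_cap_usd * equity_pct - exercise_cost

ipo_cap = 56e9       # IPO valuation
current_cap = 95e9   # market cap after day one

for pct in (0.10, 0.11):
    print(f"{pct:.0%} of ${ipo_cap/1e9:.0f}B  -> ${warrant_value(ipo_cap, pct)/1e9:.2f}B")
    print(f"{pct:.0%} of ${current_cap/1e9:.0f}B -> ${warrant_value(current_cap, pct)/1e9:.2f}B")
```

At 10-11% this gives roughly $5.6-6.2 billion at the IPO valuation and $9.5-10.5 billion at the current market cap, consistent with the ranges quoted above; note how the near-zero exercise price makes the position almost pure equity.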
Image Source: @Xingpt
Whether Cerebras Can Become the Next Nvidia Remains Unknown, but Its Stock Price May Continue to Rise Short-Term
Returning to the third question posed at the outset: can Cerebras become the next Nvidia?
From an industry landscape perspective, the answer is undoubtedly no. There are four main reasons:
- First, a huge ecosystem gap: As the absolute hegemon in chip manufacturing, Nvidia's CUDA software stack is the undisputed industry standard, with countless developers, technical frameworks, and toolchains built upon it. While Cerebras has its own software stack, it is far from matching CUDA's maturity and compatibility, making the switching cost extremely high for many developers and enterprises.
- Second, scale difference and diversified development paths: In 2025, Nvidia's revenue ran into the hundreds of billions of dollars, with its GPUs covering training, inference, graphics, automotive, data centers, and virtually every other scenario. Jensen Huang even boldly proclaimed at CES 2026, "The AI chip and infrastructure market size could reach $1 trillion by 2027," with Nvidia the largest beneficiary. In contrast, Cerebras' 2025 revenue was only $510 million, and its customer base is concentrated in a few giants like OpenAI, making it less resilient to risk.
- Third, differences in manufacturing and cost control: a wafer-scale AI chip brings not only higher speed but also far greater manufacturing difficulty and cost. Each Cerebras chip consumes an entire wafer, so output at TSMC is low, yield is hard to manage, and unit cost is high (a single CS-3 system costs far more than a single GPU). Nvidia, by contrast, cuts dozens of GPUs from a single wafer, giving it stronger economies of scale and higher economic returns.
- Fourth, different competitive pressures in the chip industry: Unlike Nvidia's advantageous position, Cerebras faces direct competition from multiple industry players like Groq, AMD, Google TPU, and AWS Trainium. Although its current development momentum is strong, limited by time, funding, and resources, its current positioning is more like a "high-end niche player" rather than a "market dominator."
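The wafer-economics point in the third reason can be made concrete with a toy cost model. All numbers below are illustrative assumptions for the sketch, not real TSMC, Nvidia, or Cerebras figures:

```python
# Toy model of wafer economics: many small dies vs. one wafer-scale chip.
# Every number here is an illustrative assumption, not a real figure.
WAFER_COST = 20_000.0   # assumed cost of one processed 300mm wafer, USD

# Conventional approach: cut the wafer into GPU-sized dies and
# discard the defective ones.
DIES_PER_WAFER = 60     # assumed
DIE_YIELD = 0.70        # assumed fraction of dies that work

good_dies = DIES_PER_WAFER * DIE_YIELD
cost_per_gpu_die = WAFER_COST / good_dies

# Wafer-scale approach: one chip per wafer. Defects cannot be cut
# away, so they are routed around with redundant cores; the whole
# wafer's cost lands on a single chip.
cost_per_wse = WAFER_COST

print(f"cost per good GPU die:     ${cost_per_gpu_die:,.0f}")
print(f"cost per wafer-scale chip: ${cost_per_wse:,.0f}")
print(f"silicon cost ratio:        {cost_per_wse / cost_per_gpu_die:.0f}x")
```

Even in this simplified model, one wafer-scale chip carries dozens of times the silicon cost of a single GPU die, which is why the per-system price gap the article describes follows directly from the manufacturing approach.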
Based on the above, Cerebras cannot grow into an industry giant on Nvidia's scale in the short term, nor upend the existing competitive landscape. That said, its per-share price has already surpassed Nvidia's, and thanks to the AI boom, the widening compute gap, and the fact that OpenAI and Anthropic have yet to go public this year, Cerebras' stock price and market cap may still have room to run.
Over the next 2-3 years, if Cerebras converts its orders from OpenAI, AWS, and others into actual revenue on schedule, its stock may push higher still; if order delivery falls short of market expectations or AI inference demand shifts, its stock will face significant downward pressure.
In summary, within 1-3 years, Cerebras cannot replace Nvidia, but it can occupy a certain share in the AI infrastructure niche market, becoming the 'king of AI chip speed.' As for the longer-term industry competition landscape, more time is needed for verification.
Recommended Reading
A Decade's Bet on Cerebras: How 'Wafer-Scale AI Chips' Reached Nasdaq
Cerebras AI Chip Breaks Nvidia Monopoly and Stands Out: In-Depth 10,000-Word Analysis of Cerebras Technology Design