Listed and Halted, Surge Over 108% in a Single Day, Is Cerebras Really the 'Next Nvidia'?

Odaily星球日报 · Published 2026-05-15 · Last updated 2026-05-15

Abstract

Cerebras Systems (CBRS), labeled the "next Nvidia," debuted on the NASDAQ on May 14th, 2026. Its stock price surged as much as 108% above its $185 IPO price, briefly touching $385 before settling around $311, still up over 68%. CEO Andrew Feldman claimed the company's wafer-scale AI chips are "58 times larger and 15-20 times faster" than competitors' offerings such as Nvidia's. The company's core innovation is the Wafer Scale Engine (WSE), a massive, dinner-plate-sized chip designed to avoid the bottlenecks of interconnecting multiple GPUs. Its latest system, the CS-3, offers high-performance computing for AI training and inference. While still a niche player with $510 million in 2025 revenue, Cerebras has secured major contracts, most notably a multi-year computing deal with OpenAI worth over $20 billion. The partnership runs deep: OpenAI is a major customer, a creditor via a $1 billion loan, and holds warrants that could make it a 10-11% shareholder in Cerebras. Despite the hype, the article argues Cerebras is unlikely to dethrone Nvidia soon. Nvidia's ecosystem (CUDA), vast scale, manufacturing efficiency, and diversified product line present a formidable moat. Cerebras faces high costs, production challenges with its giant chips, and competition from AMD, Google, and others. However, strong demand for AI inference and its key partnerships could support its stock price in the short to medium term. In conclusion, Cerebras is positioned as a high-speed specialist in the AI hardware market, not a broad-market replacement.

Original|Odaily Planet Daily(@OdailyChina)

Author|Wenser(@wenser 2010)

Last night, Cerebras (CBRS), dubbed the 'next Nvidia,' officially began trading. Shortly after opening at the $185 issue price, it surged to $350, peaking at $385 intraday with a gain of over 108%. Although the stock has now retraced to around $311, it still maintains a gain of over 68%. Previously, Cerebras CEO Andrew Feldman stated in an interview with CNBC: "Our chip is the size of a dinner plate and is 20 times faster than Nvidia's chips."

What gives this chipmaker, which raised $5.5 billion, the confidence to make such bold claims about being 'faster than Nvidia's chips'? How did it secure a $20 billion order from OpenAI amidst fierce competition? Will its stock price continue its upward trend in the short term? Odaily Planet Daily will provide its own answers to these questions in this article.

Cerebras' Basis for Challenging Nvidia: Opening a New World of AI with Wafer-Scale Chips

As the gap in AI computing power grows increasingly vast, robust market demand has propelled Nvidia to become the world's highest-valued listed company.

Recently, Nvidia's stock price hit new highs, with its market capitalization at one point exceeding $5.5 trillion. By that measure it trails only the GDPs of the US and China, far surpassing major economies like Germany and Japan; the label 'richer than many countries' is hardly an exaggeration.

However, unlike the decades-old 'veteran champion' Nvidia, Cerebras (CBRS) is a newcomer in chip manufacturing.

In 2016, Andrew Feldman, Gary Lauterbach, Sean Lie, Michael James, JP Fricker, and other semiconductor industry veterans co-founded Cerebras Systems, headquartered in Sunnyvale, California. Unlike Nvidia's focus on building general-purpose GPUs to maximize market demand, Cerebras' core innovation is the Wafer Scale Engine (WSE), currently the world's largest AI chip.

The founding team of Cerebras (2022)

Its core products include:

  • WSE-3: Area approximately 46,225 mm² (equivalent to a dinner plate size), containing 4 trillion transistors, 900,000 AI-optimized cores, delivering 125 petaflops of computing power. Compared to traditional GPUs, it turns an entire wafer into a single giant processor, avoiding bottlenecks from multi-GPU interconnects, with on-chip SRAM as high as 44GB and extremely high memory bandwidth.
  • CS-3 System: an AI supercomputer based on the WSE-3, supporting both training and inference. Beyond selling systems, Cerebras also provides cloud services (Cerebras Inference), dedicated data centers, and on-premises deployment support.

In terms of business model, Cerebras primarily provides ultra-low-latency inference for OpenAI, Meta, Perplexity, Mistral, GSK, Mayo Clinic, etc. In 2025, Cerebras generated annual revenue of $510 million (a 76% year-over-year increase), is already profitable, and has massive order backing (including a multi-year, hundreds-of-megawatt computing contract with OpenAI).

Illustration of Cerebras WSE-3 Chip

On May 14th, IPO day, Cerebras CEO Andrew Feldman appeared on CNBC's 'Squawk Box' and addressed the company's operational status, technological moat, and future market direction:

  • First, Feldman stated that the IPO was "the right way to fund our growth," the company is mature, and public markets can support huge growth opportunities. He emphasized this was the result of a decade of effort, expressed great pride, and said the market "understood our story and responded positively."
  • Second, he repeatedly stressed that Cerebras is the only company to have successfully built a 'giant chip' in 70 years, with all other attempts having failed, thus "the technical moat is wide and deep." It was here that he mentioned Cerebras' chips are 58 times larger than competitors like Nvidia and run 15-20 times faster, significantly accelerating AI inference and training.
  • Finally, addressing market concerns about the sustainability of AI spending, Feldman stated the demand is "massive and growing." The company's chips qualitatively change the AI experience (faster response, real-time agents, etc.). He mentioned important collaborations with OpenAI, AWS, etc., and expressed optimism about the overall AI hardware environment.
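The '58 times larger' figure can be sanity-checked with a quick calculation. The comparison die below is an assumption (the article does not name a specific Nvidia chip); a conventional flagship data-center GPU die is on the order of 800 mm²:

```python
# Sanity check of the "58x larger" claim. The WSE-3 area is stated in the
# article; the ~800 mm^2 GPU die is an assumed H100-class comparison point.
WSE3_AREA_MM2 = 46_225        # stated in the article
ASSUMED_GPU_DIE_MM2 = 800     # assumption, not from the article

ratio = WSE3_AREA_MM2 / ASSUMED_GPU_DIE_MM2
print(f"WSE-3 is ~{ratio:.0f}x the assumed GPU die area")
```

Under that assumption the ratio comes out to roughly 58, consistent with Feldman's figure.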

On a side note, similar to Musk's earlier bet with Anthropic on 'space data centers' (recommended reading: 'Musk and Anthropic Are Going to Space for Power'), Feldman also boldly predicted, "Within 15 years, data centers in space are highly likely to become a reality," showcasing his confidence in the long-term construction and rapid expansion of AI infrastructure.

Thus, as a 'speed demon' in the AI chip field, Cerebras has successfully broken through by focusing on extreme performance for super-large-scale models, emerging as a strong challenger to Nvidia in areas like large model inference and super-large-scale training applications.

In this regard, OpenAI's $20 billion order provides ample confidence for its development, and the collaboration between the two goes far beyond the simple relationship of 'chip manufacturer' and 'chip buyer.'

The Complex Relationship Between Cerebras and OpenAI: Customer, Creditor, and Potential Major Shareholder

The ties between Cerebras and OpenAI go back a long way. Beyond company-level collaboration, OpenAI founder Sam Altman, co-founder Greg Brockman, and others were early angel investors in Cerebras, holding small stakes. This is perhaps a key reason for the deep, multi-faceted relationship between the two companies today.

In December 2025, OpenAI provided Cerebras with a $1 billion Working Capital Loan, establishing a creditor-debtor relationship between them.

In January of this year, the '750MW inference computing power procurement agreement' between Cerebras and OpenAI was officially announced, with an option to expand it to 2GW emphasized afterward; the deal was confirmed again in April. According to media reports, OpenAI plans to invest over $20 billion over the next three years to purchase servers powered by Cerebras chips and will acquire equity in the company as part of the deal. OpenAI thus became by far Cerebras' largest customer.

Image Source: @Xingpt

Subsequent S-1 filings and IPO application documents from Cerebras indicate that OpenAI is expected to obtain approximately 33.44 million Cerebras warrants at an extremely low exercise price of $0.00001 per share. Some warrants have vesting conditions, including compute power delivery dates and milestone requirements such as Cerebras' market cap exceeding $40 billion.

If all are exercised and conditions met, OpenAI could acquire about 10%-11% equity (specific percentage depends on post-IPO total shares). Based on the IPO valuation of around $56 billion, this equity would be worth approximately $5-6 billion; based on the current market cap (close to $95 billion after the first day of trading), this equity is now worth over $10.3 billion. Although not fully exercised yet, calling OpenAI a 'potential major shareholder of Cerebras' is already beyond doubt.
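The warrant figures above can be worked through as back-of-the-envelope arithmetic. All inputs are numbers quoted in the article; the stake percentages are the article's approximate 10-11% range:

```python
# Back-of-the-envelope valuation of OpenAI's potential Cerebras stake,
# using the warrant count, exercise price, and valuations quoted above.
warrants = 33_440_000
exercise_price = 0.00001            # dollars per share
stake_low, stake_high = 0.10, 0.11  # ~10-11% potential equity

ipo_valuation = 56e9                # ~$56B IPO valuation
day1_valuation = 95e9               # ~$95B after the first trading day

# Exercising all warrants at $0.00001/share costs almost nothing.
cost_to_exercise = warrants * exercise_price
print(f"Cost to exercise all warrants: ${cost_to_exercise:,.2f}")

for valuation in (ipo_valuation, day1_valuation):
    lo, hi = stake_low * valuation, stake_high * valuation
    print(f"At ${valuation/1e9:.0f}B valuation: stake worth "
          f"${lo/1e9:.1f}B-${hi/1e9:.1f}B")
```

The total exercise cost is about $334, which is why the stake's value tracks the company's market cap almost one-for-one: roughly $5.6-6.2 billion at the IPO valuation and over $10 billion at the first-day close.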

Image Source: @Xingpt

Whether Cerebras Can Become the Next Nvidia Remains Unknown, but Its Stock Price May Continue to Rise Short-Term

Returning to the third question posed at the outset: can Cerebras become the next Nvidia?

From an industry landscape perspective, the answer is undoubtedly no. There are four main reasons:

  • First, a huge ecosystem gap: As the absolute hegemon in chip manufacturing, Nvidia's CUDA software stack is the undisputed industry standard, with countless developers, technical frameworks, and toolchains built upon it. While Cerebras has its own software stack, it is far from matching CUDA's maturity and compatibility, making the switching cost extremely high for many developers and enterprises.
  • Second, scale difference and diversified development paths: In 2025, Nvidia's revenue reached tens of billions of dollars, with its GPUs covering training, inference, graphics, automotive, data centers, and all scenarios. Jensen Huang even boldly proclaimed at CES 2026, "The AI chip and infrastructure market size could reach $1 trillion by 2027," with Nvidia being the largest beneficiary. In contrast, Cerebras' 2025 revenue was only $510 million, and its customer base is relatively concentrated with giants like OpenAI, making it less resilient to risks.
  • Third, differences in chip manufacturing and cost control: The super-large AI chip brings not only faster speeds but also higher manufacturing difficulty and costs. Cerebras' wafer-scale chip requires an entire wafer per chip, with TSMC facing low output, yield challenges, and high unit cost (a single CS-3 system costs far more than a single GPU). For Nvidia, dozens of GPUs can be cut from a single wafer, offering stronger economies of scale and higher economic returns.
  • Fourth, different competitive pressures in the chip industry: Unlike Nvidia's advantageous position, Cerebras faces direct competition from multiple industry players like Groq, AMD, Google TPU, and AWS Trainium. Although its current development momentum is strong, limited by time, funding, and resources, its current positioning is more like a "high-end niche player" rather than a "market dominator."
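The wafer-economics point in the third reason above can be sketched numerically. The wafer cost, die count, and yield figures below are illustrative assumptions, not data from the article:

```python
# Rough sketch of wafer economics: one wafer-scale chip per wafer versus
# many GPU dies cut from the same wafer. All numbers are assumptions.
WAFER_COST = 20_000        # assumed cost of one processed 300mm wafer ($)
GPU_DIES_PER_WAFER = 60    # assumed candidate GPU dies per wafer
GPU_YIELD = 0.80           # assumed fraction of good GPU dies

# For GPUs, defective dies are simply discarded, spreading wafer cost
# over the good dies.
good_gpus = int(GPU_DIES_PER_WAFER * GPU_YIELD)
cost_per_gpu_die = WAFER_COST / good_gpus

# A wafer-scale chip consumes the entire wafer, so defects must be
# tolerated on-chip (e.g. by routing around bad cores) rather than cut away.
cost_per_wse = WAFER_COST

print(f"Cost per good GPU die:     ${cost_per_gpu_die:,.0f}")
print(f"Cost per wafer-scale chip: ${cost_per_wse:,.0f}")
```

Under these assumptions a single wafer-scale chip carries the full wafer cost that would otherwise be spread across dozens of good GPU dies, which is why Cerebras must win on performance per system rather than cost per die.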

Based on the above, Cerebras cannot grow into an industry giant like Nvidia in the short term, nor can it disrupt the existing industry competition landscape. However, in terms of stock price comparison, its single-share price has already surpassed Nvidia's. Furthermore, thanks to the booming AI frenzy and the increasing computing power gap, with OpenAI and Anthropic yet to go public this year, Cerebras' stock price and market cap may still have some room for upward movement.

In the next 2-3 years, if it can successfully convert orders from OpenAI, AWS, etc., into actual revenue as scheduled, Cerebras' stock price may further explore higher levels; however, if order performance falls short of market expectations or AI model inference demand changes, its stock price will face significant downward pressure.

In summary, within 1-3 years, Cerebras cannot replace Nvidia, but it can occupy a certain share in the AI infrastructure niche market, becoming the 'king of AI chip speed.' As for the longer-term industry competition landscape, more time is needed for verification.

Recommended Reading

A Decade's Bet on Cerebras: How 'Wafer-Scale AI Chips' Reached Nasdaq

Cerebras AI Chip Breaks Nvidia Monopoly and Stands Out: In-Depth 10,000-Word Analysis of Cerebras Technology Design

Related Questions

Q: What is the core innovation and main product that Cerebras Systems is built upon?

A: Cerebras Systems' core innovation is the Wafer Scale Engine (WSE), which is the world's largest AI chip. Their main product based on this is the WSE-3, a single massive processor the size of a dinner plate containing 4 trillion transistors and 900,000 AI-optimized cores, offering 125 petaflops of computing power. They also offer the CS-3 system, an AI supercomputer built around the WSE-3.

Q: According to the article, what is the multifaceted relationship between Cerebras and OpenAI beyond being a simple supplier-customer?

A: The relationship between Cerebras and OpenAI is complex and multifaceted. Beyond being a major customer with a multi-billion dollar chip supply agreement, OpenAI was an early angel investor in Cerebras. OpenAI also provided Cerebras with a $1 billion working capital loan, establishing a creditor-debtor relationship. Furthermore, as part of their supply deal, OpenAI holds warrants that could allow it to acquire a significant stake (potentially 10-11%) in Cerebras, making it a potential major shareholder.

Q: Why does the article argue that Cerebras cannot become the next NVIDIA in the short term, listing at least two key reasons?

A: The article argues Cerebras cannot become the next NVIDIA in the short term for several reasons. First, there is a massive ecosystem gap: NVIDIA's CUDA software stack is the industry standard with unparalleled developer adoption, while Cerebras' software is less mature. Second, there is a huge scale and diversification difference: NVIDIA has revenues in the tens of billions covering a wide range of applications, whereas Cerebras' revenue is much smaller and more concentrated on a few large customers like OpenAI.

Q: What were the key performance highlights of Cerebras' (CBRS) IPO on its first day of trading?

A: On its first day of trading (IPO), Cerebras (CBRS) opened at an issue price of $185. Its stock price quickly surged, reaching a peak of $385, which represented an increase of over 108%. Although it later settled, it closed with a gain of over 68%, at approximately $311 per share.

Q: What did Cerebras CEO Andrew Feldman claim about the performance of their chip compared to competitors during his CNBC interview on IPO day?

A: During his CNBC interview on IPO day, Cerebras CEO Andrew Feldman claimed that the company's chip is 58 times larger than competitors' chips (like NVIDIA's) and runs 15 to 20 times faster, significantly accelerating AI inference and training tasks.

Related Articles

Winter for Crypto IPOs: Consensys and Ledger Withdraw Applications

The crypto IPO window is tightening significantly in 2026, marked by prominent companies delaying or pausing their public listing plans. Following a successful 2025 "harvest year" that saw Circle, Bullish, and Gemini go public amidst a bull market, the tide has turned. Consensys, developer of MetaMask, recently postponed its IPO until at least fall 2026. Hardware wallet leader Ledger also suspended its planned US listing due to unfavorable market conditions, with Kraken having previously delayed its own plans. This shift is driven by a cooling market in 2026, characterized by a significant Bitcoin price correction, declining trading volumes, and reduced investor risk appetite for crypto stocks. The poor post-IPO performance of 2025 listings like Circle and Bullish, which saw major share price declines, has heightened investor caution. This contrasts sharply with the current AI sector, where companies like SpaceX, OpenAI, and Anthropic are commanding massive valuations and investor enthusiasm based on narratives of stable, exponential growth. Crypto companies now face pressure to transition from hype-driven models to demonstrating reliable cash flows and robust compliance. While the paused IPO plans may lead to valuation resets and affect ecosystem liquidity, they also accelerate industry consolidation toward stronger, more compliant infrastructure players. A potential recovery in Bitcoin's price and clearer regulations could reopen the IPO window in the latter half of 2026.

marsbit

ChatGPT Can Manage Your Money for You. Would You Trust It with Your Bank Account?

OpenAI has launched a personal finance tool for ChatGPT, currently in preview for US-based ChatGPT Pro users. This feature allows users to connect their bank and investment accounts (via Plaid, supporting over 12,000 institutions) directly to ChatGPT. It analyzes transactions, generates visual dashboards, and offers conversational financial advice—such as budgeting or planning for major purchases—based on the user's actual data. This move follows OpenAI's acquisitions of fintech startups Roi and Hiro Finance, signaling a strategic push into vertical "super assistant" applications, similar to its earlier health-focused feature. However, the launch has sparked significant privacy concerns. Critics question the safety of granting such sensitive financial access to an AI, especially amid ongoing lawsuits alleging OpenAI shared user chat data with third parties like Meta and Google. OpenAI emphasizes that ChatGPT only reads data (no transaction capabilities), deletes it within 30 days if disconnected, and offers opt-out options for model training. Yet, trust remains a major hurdle. The trend reflects a broader industry shift: AI companies like Anthropic and Perplexity are also targeting high-value, data-rich domains like finance and health. While technically promising, the tool operates in a regulatory gray area—it provides personalized guidance but disclaims formal financial advice or liability. Ultimately, OpenAI's challenge is convincing users to trust an AI with their most private financial information.

marsbit
