Cerebras IPO: A $48.8 Billion Valuation—Is the 'Nvidia Challenger' a Bubble or a New King?

marsbit · Published on 2026-05-12 · Last updated on 2026-05-12

Abstract

Cerebras Systems, positioning itself as an NVIDIA challenger, is going public with a $48.8 billion valuation despite several underlying paradoxes revealed in its S-1 filing. While 2025 revenue grew 76% to $510M and GAAP net income was $237.8M, this profitability relies heavily on a one-time, non-cash accounting gain. Adjusting for this, the company's non-GAAP net loss actually widened to $75.7M. Furthermore, customer concentration remains extreme: 86% of 2025 revenue came from two Abu Dhabi-based entities, MBZUAI (62%) and G42 (24%). Its landmark deal with OpenAI, valued at over $20 billion, creates a complex, nested relationship where OpenAI is simultaneously a major customer, lender, warrant holder, and strategic partner with exclusivity clauses. Cerebras's technical edge in latency-sensitive AI inference is real, with its wafer-scale chip outperforming competitors in benchmarks. However, this advantage is confined to a specific niche, not the broader AI training market dominated by NVIDIA's CUDA ecosystem. With a 95x price-to-sales ratio, the valuation demands flawless execution of the OpenAI contract and massive future revenue growth. Key long-term risks include intense competition from giants like NVIDIA and AMD, a dual-class share structure granting insiders near-total voting control, and ongoing geopolitical uncertainties regarding export controls. The IPO is a pivotal capital markets event for AI infrastructure. As an investment, it represents a high-risk, high-reward bet.

Written by: Xiaohei, Deep Tide TechFlow

Priced on May 13th, trading begins on May 14th, Nasdaq ticker CBRS.

This is the largest IPO globally in 2026 to date. The underwriting syndicate includes Morgan Stanley, Citi, Barclays, and UBS. With such a lineup, the offering was oversubscribed by 20 times during the roadshow, pushing the offer price from an initial $115-125 all the way to $150-160. The IPO is expected to raise $4.8 billion, implying a valuation of $48.8 billion.

Just three months ago, Cerebras's secondary market valuation was around $23 billion. That means, in the final stretch before the IPO, the company's book value more than doubled.

The story's 'selling points' have been repeated endlessly: the Nvidia challenger, wafer-scale chips, inference speeds 21x faster than the B200, a multi-billion-dollar compute contract with OpenAI starting at $10 billion and scaling up to $20 billion. It's a perfect 'AI challenger' script—technology narrative, geopolitical narrative, star customer, massive order—every component aligned with the main theme of AI infrastructure in 2026.

But reading the S-1 filing page by page reveals something odd: all the public reports tell one story, while the prospectus tells another.

The Triple Paradox

Breaking down the prospectus item by item, Cerebras emerges as an investment target built on a 'triple paradox'.

First: real technological alpha, but accounting magic in the financials.

The prospectus discloses: 2025 revenue of $510 million, up 76% year-over-year, GAAP net profit of $237.8 million. Sounds very impressive—a rapidly growing, already profitable AI hardware company, almost a 'mythical' target in the current valuation environment. CoreWeave was still losing money when it IPOed in March of this year; Cerebras is delivering a 47% net margin.

But of this $237.8 million 'net profit', $363.3 million comes from a one-time, non-cash accounting adjustment: a paper gain from the extinguishment of a forward contract liability related to G42. Excluding this item and adding back $49.8 million in stock-based compensation, the real non-GAAP net loss for 2025 was $75.7 million, a 247% deterioration from the $21.8 million loss in 2024.
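The adjustment arithmetic can be checked directly. A minimal sketch using only the S-1 figures quoted above (the variable names are illustrative, not line items from the filing):

```python
# Reconciling Cerebras's 2025 GAAP net income to the non-GAAP result,
# using the figures quoted above (all amounts in $ millions).
gaap_net_income = 237.8
one_time_gain = 363.3      # non-cash gain on extinguishing the G42 forward contract
stock_comp_addback = 49.8  # stock-based compensation added back

non_gaap_result = gaap_net_income - one_time_gain + stock_comp_addback
print(f"Non-GAAP net result: {non_gaap_result:+.1f}M")  # a net loss of ~$75.7M

loss_2024 = 21.8
widening = (abs(non_gaap_result) - loss_2024) / loss_2024
print(f"Loss widened by {widening:.0%}")  # ~247% deterioration vs 2024
```

The reconciliation confirms the article's point: the $363.3M paper gain is larger than the entire reported profit, so stripping it out flips the bottom line from black to red.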

In other words, what the market sees is a 'profitable + 76% growth' IPO golden child, while the prospectus discloses a 'rapidly growing company with widening losses'. Neither version is entirely wrong; the difference lies in which one the market chooses to believe.

Second: seemingly moved on from G42, but in practice swapped in a circular nesting with OpenAI.

The story of Cerebras's first failed IPO attempt in 2024 isn't complicated: G42, a UAE-based client, contributed 85% of revenue in the first half of that year. CFIUS initiated a review, forcing the company to withdraw its application.

Returning to the fray a year and a half later, the customer list appears diversified, adding heavyweight names like OpenAI and AWS. But turning to the S-1 from May 2026, the 2025 customer structure looks like this:

  • MBZUAI (Mohamed bin Zayed University of Artificial Intelligence): 62%
  • G42: 24%
  • Combined: 86%

G42 simply ceded its 'weight' to MBZUAI, which is also located in the UAE and is a related party to G42. A single customer, MBZUAI, accounts for 77.9% of accounts receivable.

And OpenAI's so-called 'redemption story' is itself a nested structure. This contract is worth over $20 billion, with OpenAI committing to purchase 750 megawatts of compute power. But the same document also discloses several other things: OpenAI provided Cerebras with a $1 billion loan; OpenAI received 33 million nearly-free warrants for Cerebras stock; OpenAI's Master Relationship Agreement includes exclusivity clauses restricting Cerebras from selling to certain 'named competitors'.

That is, OpenAI is simultaneously Cerebras's customer, lender, soon-to-be shareholder, and, to some extent, strategic controller. An anonymous analyst commenting on a Medium analysis piece put it harshly: when the revenue is circular, the valuation is circular, and the IPO exists to let the parties creating that revenue cash out, this isn't a market; it's financial engineering.

The wording might be overly sharp, but at the factual level, this is difficult to refute.

Third: Superficially Nvidia's 'challenger', but essentially Nvidia's 'narrow-band niche filler'.

This point is most easily overlooked by the market.

Cerebras's technology is indeed solid. The WSE-3 has 4 trillion transistors, 900,000 AI cores, 44GB of on-chip SRAM, turning an entire wafer into a single chip, bypassing the inter-chip communication bottlenecks all GPU clusters must face. Independent benchmarks from Artificial Analysis show that running Llama 4 Maverick (400 billion parameters), the CS-3 outputs 2500+ tokens per second per user, while Nvidia's flagship DGX B200 outputs about 1000 tokens, and Groq and SambaNova output 549 and 794 respectively.
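The benchmark gap reduces to simple ratios. A small sketch using the Artificial Analysis figures cited above (tokens per second per user are as quoted in this article):

```python
# Single-user inference throughput on Llama 4 Maverick (tokens/sec per user),
# per the Artificial Analysis benchmarks cited above.
throughput = {
    "Cerebras CS-3": 2500,
    "Nvidia DGX B200": 1000,
    "SambaNova": 794,
    "Groq": 549,
}

baseline = throughput["Nvidia DGX B200"]
for system, tps in sorted(throughput.items(), key=lambda kv: -kv[1]):
    # Express each system relative to Nvidia's flagship
    print(f"{system:16s} {tps:>5d} tok/s  ({tps / baseline:.1f}x vs B200)")
```

By this measure the CS-3 delivers roughly 2.5x the B200's per-user throughput on this workload; note this is a latency-oriented, single-user metric, which is exactly the niche the next paragraph circumscribes.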

The numbers don't lie: in the specific scenario of inference, Cerebras holds a generational advantage over GPUs.

The keyword is 'inference'. Cerebras's own prospectus makes it clear: it excels at latency-sensitive inference workloads. For large model training and general-purpose computing, it neither has the capability nor the intention to challenge Nvidia. The CUDA ecosystem, accumulated over nearly 20 years since 2007—the toolchains, developer community, third-party libraries for model training—all of this remains within Nvidia's moat.

More crucially, the market isn't standing still. Nvidia's Vera Rubin architecture announced at GTC 2026 has 336 billion transistors, with performance claimed to be another 5x leap over Blackwell; AMD's MI400 has caught up to 320 billion transistors; Google TPU v6, Amazon Trainium 3, Microsoft Maia 2—hyperscale vendors are all making their own chips. Nvidia spent over $18 billion on R&D in FY2025, acquired AI inference startup Groq's assets for $20 billion last December, and invested another $4 billion in two photonics technology companies in March.

So a more accurate description is: Cerebras is not aiming to replace Nvidia; it is competing for a differentiated position within Nvidia's 'inference' narrow band. This is a real business, but a $48.8 billion valuation against $510 million in revenue implies a price-to-sales ratio of 95.

Andrew Feldman: Selling the Product for the Third Time

Beyond the numbers, we need to talk about the soul of this company.

Andrew Feldman is an underrated 'serial entrepreneur' in Silicon Valley. He is not a technical genius founder, nor an academic from an ivory tower. He graduated from Stanford Business School, was VP of Marketing at Riverstone Networks (which IPOed in 2001), and VP of Product at Force10 Networks (which was sold to Dell for $800 million in 2011).

In 2007, he co-founded SeaMicro with Gary Lauterbach, focusing on 'energy-efficient servers', clustering many low-power processors with small cores to compete against the mainstream high-power servers with large cores. The idea was very forward-thinking, but the market was too early. In 2012, AMD bought SeaMicro for $334 million. Feldman worked as a VP at AMD for two years before leaving.

Then he started Cerebras.

Looking at Feldman's path as a whole reveals something interesting: he is not a 'chip designer'; he is an 'alternative bettor on compute infrastructure'. SeaMicro was a bet on 'small cores beating big cores'—he was half wrong. AMD bought it back then to use its Freedom Fabric interconnect technology for its own server CPU platform, but that path didn't pan out, and the SeaMicro brand later quietly faded away. Cerebras is a bet on 'big chips beating small chips', exactly the opposite proposition of SeaMicro.

In a sense, Feldman is doing the same thing: identifying the overlooked, seemingly 'impossible' paths in computing architecture that the mainstream ignores, placing a big bet, and then using strong salesmanship to push it to market. With SeaMicro, he managed to hold onto Force10's sales team; what AMD valued was his sales network. With Cerebras, the most crucial thing he did right was securing G42, enabling a hardware company that still had 80% of its 2024 revenue from a single Middle Eastern client to eventually sign a $20 billion contract with OpenAI.

The footnote to this story is: Feldman is a product-selling CEO, not a technology-visionary CEO. He excels at selling a 'crazy-sounding' product to clients willing to pay a premium for differentiation. That is his alpha.

Understanding this is important because it directly determines the judgment of Cerebras's investment value.

So, Is CBRS Worth Investing In?

Layering the three paradoxes above, the answer is actually more complex than a simple 'buy' or 'don't buy'.

If the goal is to catch the first-day IPO pop—with 20x oversubscription, the hottest sector in AI hardware, and a lack of pure-play Nvidia alternative listed stocks—CBRS is highly likely to surge on day one. This is an event-driven short-term trade, requiring little deep judgment.

But if making a 'long-term hold' investment judgment, three things must be considered first:

First, is Cerebras worth a 95x P/S ratio?

CoreWeave IPOed in March this year with a P/S ratio around 15x. Nvidia's current P/S ratio is about 25x. Pricing a company with $510 million in 2025 revenue, 86% customer concentration, and still losing money at the real operational level at 95x P/S implies the market requires it to grow revenue to $3-4 billion in the next three to four years while achieving sustained profitability.
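The multiples comparison is straightforward arithmetic. A quick sketch of the figures above; the flat-valuation assumption is a simplification for illustration, not a forecast:

```python
# Price-to-sales multiples from the figures above (valuation / revenue).
cerebras_ps = 48.8e9 / 510e6   # ~95.7x at the IPO valuation
coreweave_ps = 15.0            # per the article, at its March IPO
nvidia_ps = 25.0               # per the article, currently

# Revenue needed to compress Cerebras to Nvidia's ~25x, assuming the
# $48.8B valuation stays flat (a simplifying assumption):
required_revenue = 48.8e9 / nvidia_ps
print(f"Cerebras P/S: {cerebras_ps:.1f}x")
print(f"Revenue implied at 25x: ${required_revenue / 1e9:.2f}B")
```

At a flat valuation, roughly $2B in revenue already brings the multiple down to Nvidia's level; the article's $3-4B target additionally prices in further valuation growth and the need for real profitability.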

Can this happen? It hinges on whether the $20 billion OpenAI contract lands as scheduled. According to the prospectus, about 15% of the remaining performance obligations will be recognized in 2026 and 2027, roughly $3.5 billion. At that pace, Cerebras's revenue could exceed $2 billion by 2027, potentially compressing the P/S ratio into a reasonable range. But any delay, any strategic shift at OpenAI, or the loss of another major customer would instantly make this valuation untenable.
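The backlog math implied by that paragraph can be sketched as a back-of-the-envelope estimate, under the article's own assumptions rather than company guidance:

```python
# Implied order book and forward multiple from the disclosures above.
rpo_2026_27 = 3.5e9                 # ~15% of remaining performance obligations
total_rpo = rpo_2026_27 / 0.15      # ~$23.3B implied total backlog
revenue_2027 = 2.0e9                # the article's ~$2B 2027 scenario
forward_ps = 48.8e9 / revenue_2027  # assumes the IPO valuation stays flat

print(f"Implied total backlog: ${total_rpo / 1e9:.1f}B")
print(f"2027 forward P/S:      {forward_ps:.1f}x")
```

If the 15% figure holds, the total backlog roughly matches the headline $20B+ OpenAI commitment, and the 2027 forward multiple lands near Nvidia's current ~25x; both conclusions depend entirely on on-schedule revenue recognition.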

Second, how wide is Cerebras's moat?

The architectural advantage of the WSE-3 is real, but how long will it last? Nvidia Vera Rubin, AMD MI400, and Google TPU v6 are all pushing forward. The generational replacement cycle in the chip industry is 18-24 months. If Cerebras slips by one cycle, its technical lead could be erased. While its R&D spending as a percentage of revenue is already high, the absolute amount remains orders of magnitude smaller compared to the giants.

The deeper question: Is the wafer-scale chip path a mainstream route that will be widely adopted, or a 'special forces' unit forever confined to niche scenarios? There's no definitive answer. The optimistic reply: When inference workloads grow from today's 30% to 70%+ of total AI compute, Cerebras's niche becomes the main battlefield. The pessimistic reply: As long as Nvidia boosts Rubin's inference performance, the niche will remain just a niche.

Third, Governance Structure and Geopolitical Risk

The prospectus discloses two easily overlooked but important things:

First, Cerebras employs a Class A/Class B dual-class share structure. Post-IPO, insiders will hold 99.2% of the voting rights. Even if the founding team's ownership of the float drops to 5% in the future, they still control the company. This means external minority shareholders have almost no say in corporate governance.

Second, the company discloses the existence of two 'material weaknesses in internal control over financial reporting'. As an emerging growth company, it can be exempt from SOX 404(b) auditor attestation for up to five years after the IPO. This is a red flag—not a massive one, but worth noting.

Geopolitically, CFIUS cleared the G42 voting rights issue this time, but export controls (permits for shipping CS-2, CS-3, CS-4 to the UAE) remain a long-term variable. The Trump administration's policy direction on AI chip exports to the Middle East is not yet fully stable. Any policy swing could reignite CBRS's tail risks.

Conclusion

For the CBRS IPO, as an event, it is the most noteworthy AI hardware capital event of 2026. It defines the valuation anchor for the AI infrastructure sector in the public markets. Its performance will influence the pricing of all related targets.

As a long-term holding, it is a typical 'high-potential, high-uncertainty' bet—wagering on the macro-narrative of 'inference is king' + the micro-execution of 'Cerebras leveraging OpenAI to achieve narrow-band dominance' + the valuation assumption that 'the market remains willing to pay a 95x P/S premium for AI hardware'. All three conditions must hold simultaneously for outsized returns; if any one breaks, the drawdown could be severe.

For institutional investors, the typical approach is not to chase on day one, but to wait for Q3 earnings reports, key customer progress updates, and valuation digestion. For retail investors, treating it as a small, tail-risk allocation within an AI hardware portfolio is acceptable; treating it as an all-in conviction stock requires re-reading the triple paradox above.

More worthy of attention than whether CBRS surges at tomorrow's open is the broader implication of this event: When a company with 86% of its revenue coming from two related UAE entities and still operating at a real loss can be priced by the market at $48.8 billion, this in itself tells everyone the extent of capital frenzy currently present in the AI infrastructure sector.

Related Questions

Q: What are the three paradoxes or contradictions highlighted in Cerebras's S-1 filing that contrast with its public market narrative?

A: The three paradoxes are: 1) Financial vs. Operational Performance: While GAAP reports a $237.8M profit, non-GAAP adjustments reveal a $75.7M net loss, showing underlying operational losses are worsening. 2) Customer Diversification vs. Concentration: Despite claims of diversification, 86% of 2025 revenue still came from two UAE-linked entities (MBZUAI and G42), and the OpenAI deal creates a complex, potentially circular financial relationship. 3) Market Positioning vs. Niche Reality: It is marketed as an 'Nvidia challenger' but its core strength is specifically in latency-sensitive inference workloads, not a broad replacement for Nvidia's dominant training and general-purpose CUDA ecosystem.

Q: According to the article, what is the core business strategy and strength of Cerebras CEO Andrew Feldman?

A: Andrew Feldman's core strength is as a 'product-selling CEO,' not a technology visionary. He specializes in identifying unconventional paths in computing architecture (e.g., 'small cores' with SeaMicro, 'big chips' with Cerebras) and using exceptional salesmanship to sell these 'crazy-sounding' products to clients willing to pay a premium for differentiation. His key achievement with Cerebras was securing the foundational G42 deal and subsequently leveraging it to land the massive, complex contract with OpenAI.

Q: Based on the financial data, what is Cerebras's realistic valuation multiple (P/S ratio), and what future performance is implied by it?

A: Cerebras's IPO values it at $48.8 billion against $510 million in 2025 revenue, resulting in a price-to-sales (P/S) ratio of approximately 95x. This high multiple implies the market expects the company to grow revenue to around $3-4 billion within the next 3-4 years while achieving sustained profitability. The realization of the $20+ billion OpenAI contract is central to meeting these growth expectations.

Q: What are the main long-term investment risks identified for Cerebras beyond its high valuation?

A: Key long-term risks include: 1) Technological Moat Durability: The 18-24 month chip cycle means rivals like Nvidia (Vera Rubin), AMD, and cloud hyperscalers could catch up to its inference advantage. 2) Governance & Control: A dual-class share structure gives insiders 99.2% of voting rights post-IPO, limiting shareholder influence, and the company has disclosed material weaknesses in financial controls. 3) Geopolitical & Execution Risk: Ongoing export control uncertainties regarding shipments to the UAE and the heavy reliance on the timely and successful execution of the multi-billion-dollar OpenAI contract create significant potential for disruption.

Q: How does the article characterize the significance of the Cerebras IPO for the broader AI infrastructure investment landscape?

A: The article characterizes the IPO as a defining event that sets a new 'valuation anchor' for the AI infrastructure sector in public markets. The fact that a company with 86% revenue concentration from the UAE and underlying operational losses can achieve a $48.8 billion valuation and a 95x P/S ratio demonstrates an extreme level of capital frenzy and speculative appetite within the AI hardware investment theme.

Related Reads

TechFlow Intelligence: Trump-Linked Companies Transfer $12 Million in Assets Before China Visit, 'The Big Short' Protagonist Warns of Stock Market Bubble Again

The article reports multiple developments across tech, crypto, and finance. In AI, Mozilla used AI for large-scale code review, Google confirmed hackers used AI to find zero-day exploits, and OpenAI deployed GPT-5.5 to find errors in math benchmarks. A court ruled Anthropic's scanning and destroying books for AI training as fair use, while its Claude platform launched on AWS. Google's new video model 'Omni' was leaked. In crypto/Web3, Trump-linked companies transferred $12M in crypto assets before a China visit. BlackRock chose Ethereum for tokenized funds, and a hacker stole $174k via a malicious NFT that tricked an AI. Jack Dorsey's first tweet NFT plummeted from $2.9M to under $5. In chips/hardware, TSMC approved an additional $20B for its Arizona plant. Apple's Tim Cook and Elon Musk will accompany Trump to China, while Nvidia's Jensen Huang is notably absent. For markets, Michael Burry warned of parabolic stock rises and suggested near-total sell-offs, with online discussions comparing current sentiment to the 1999 bubble. Other notes include WTI oil surpassing $100, a 20% price hike for Beijing-Shanghai high-speed rail, and new products like Unitree's $26.9k humanoid robot. The underlying theme suggests AI is becoming infrastructure, creating pressure on old systems while a new order is not yet ready, leaving investors anxious.

marsbit · 17m ago

2026 New Policy Interpretation: The "Mutual Pursuit" of Intelligent Agents and AI Terminals, and the Three Major Value Reconstructions in the AIoT Industry

In May 2026, China's national ministries released two pivotal policy documents that jointly establish a strategic "dual-track" framework for the AIoT industry. The "Intelligent Agent Standardized Application and Innovation Development Implementation Opinions" defines the "soul"—positioning intelligent agents as core AI products. The "Artificial Intelligence Terminal Intelligence Grading" national standard defines the "body"—establishing a four-tier capability ladder (L1 to L4) for AI hardware. This synchronized policy approach is globally unique, moving beyond market-led (US) or risk-focused (EU) models. It frames AIoT as a new type of "intelligent infrastructure," comparable to electricity or the internet in historical significance. The core analysis identifies a value evolution from IoT 1.0 (connection) to AIoT 4.0 (collaboration, represented by the forward-looking L4 level). This "L4" signifies a paradigm shift: from users operating tools to delegating tasks to agent-like devices ("Intelligent Action of All Things"). The article outlines three strategic paths for companies: becoming Standard Definers, Scenario Integrators (focusing on 19 specified application areas), or Infrastructure Builders. A critical 18-24 month window is identified for strategic positioning. A "Four Levers" strategy is proposed: leveraging Standards (L-level certification), leveraging Scenarios (deep vertical focus), leveraging Open Source (for cost reduction and ecosystem influence), and leveraging Momentum (engaging in global protocol ecosystems). In conclusion, these policies are a starting gun for a decade-long industrial transformation, shifting the industry narrative from "Intelligent Connection of All Things" to "Intelligent Action of All Things," with companies needing to choose their track and execution strategy decisively.

marsbit · 1h ago

Splashing Out 27 Billion Yuan, OpenAI Establishes New Company to Accelerate AI Deployment

On May 11th, OpenAI announced the formation of a new company, "OpenAI Deployment Company," with an initial investment of over $4 billion (approximately 27.2 billion RMB). This venture aims to help businesses build and deploy AI solutions. OpenAI is also acquiring the AI consulting firm Toromo to rapidly scale the deployment company's capabilities. This new entity, majority-owned by OpenAI, brings together 19 investment, consulting, and system integration partners, led by TPG with co-lead founding partners including Advent International, Bain Capital, and Brookfield. OpenAI's Chief Revenue Officer, Denise Dresser, stated that while AI is becoming increasingly capable, the current challenge lies in integrating these systems into core business infrastructure and workflows. The deployment company is designed to bridge this gap and translate AI capabilities into operational impact. This move comes as OpenAI emphasizes the next competitive phase will depend on the efficiency of deploying AI in real business scenarios. The company reports over 1 million businesses already use its products and APIs. OpenAI is significantly increasing its investments in computing power, with co-founder Greg Brockman stating the company expects to spend $50 billion on compute this year, a dramatic increase from $3 million in 2017. The announcement follows OpenAI's recent completion of a record $122 billion funding round in late March, led by Amazon, Nvidia, and SoftBank, valuing the company at $852 billion post-money. Major strategic investors committed $110 billion as a base for this round. Concurrently, OpenAI is advancing its core model development. It has shifted focus from its Sora video generator to developing advanced robotics and AI models that interact with the physical world. It has also begun allowing select users access to a new model specialized in identifying software vulnerabilities and is reportedly preparing to launch an enhanced image generation model in the coming weeks. 
According to reports citing founder Sam Altman, OpenAI is considering an IPO as early as 2027, with a potential valuation around $1 trillion.

marsbit · 1h ago
