Written by: Xiaohei, Deep Tide TechFlow
Cerebras priced its IPO on May 13th; trading begins on May 14th under the Nasdaq ticker CBRS.
This is the largest IPO globally in 2026 to date. The underwriting syndicate includes Morgan Stanley, Citi, Barclays, and UBS. With such a lineup, the offering was oversubscribed by 20 times during the roadshow, pushing the offer price from an initial $115-125 all the way to $150-160. The IPO is expected to raise $4.8 billion, implying a valuation of $48.8 billion.
Just three months ago, Cerebras's secondary market valuation was around $23 billion. That means, in the final stretch before the IPO, the company's book value more than doubled.
The story's selling points have been repeated endlessly: the Nvidia challenger, wafer-scale chips, inference speeds 21x faster than the B200, a multi-billion-dollar compute contract with OpenAI starting at $10 billion and scaling to $20 billion. It's a perfect 'AI challenger' script: technology narrative, geopolitical narrative, star customer, massive order, every component aligned with the main theme of AI infrastructure in 2026.
But reading the S-1 filing page by page reveals something odd: all the public reports tell one story, while the prospectus tells another.
The Triple Paradox
Breaking down the prospectus item by item, Cerebras emerges as an investment target defined by a 'triple paradox'.
First: True technological Alpha, financial accounting magic.
The prospectus discloses: 2025 revenue of $510 million, up 76% year-over-year, GAAP net profit of $237.8 million. Sounds very impressive—a rapidly growing, already profitable AI hardware company, almost a 'mythical' target in the current valuation environment. CoreWeave was still losing money when it IPOed in March of this year; Cerebras is delivering a 47% net margin.
But of this $237.8 million 'net profit', $363.3 million comes from a one-time, non-cash accounting adjustment: a paper gain from the extinguishment of a forward contract liability related to G42. Excluding this item and adding back $49.8 million in stock-based compensation, the real non-GAAP net loss for 2025 was $75.7 million, a 247% deterioration from the $21.8 million loss in 2024.
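The reconciliation above is simple arithmetic. A minimal sketch, using only the figures cited from the S-1 (in millions of USD), makes the gap between the GAAP headline and the operating reality explicit:

```python
# Figures in millions of USD, as cited from the S-1 above.
gaap_net_income = 237.8        # reported 2025 GAAP net profit
forward_contract_gain = 363.3  # one-time, non-cash paper gain (excluded)
sbc_addback = 49.8             # stock-based compensation (added back)

non_gaap_result = gaap_net_income - forward_contract_gain + sbc_addback
print(f"Non-GAAP 2025 result: {non_gaap_result:.1f}M")  # a $75.7M loss

loss_2024 = 21.8               # 2024 non-GAAP loss
deterioration = (abs(non_gaap_result) - loss_2024) / loss_2024
print(f"Year-over-year loss deterioration: {deterioration:.0%}")  # ~247%
```

Same inputs, two honest summaries: which line the market anchors on is a choice, not a fact.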
In other words, what the market sees is a 'profitable + 76% growth' IPO golden child, while the prospectus discloses a 'rapidly growing company with widening losses'. Neither version is entirely wrong; the difference lies in which one the market chooses to believe.
Second: On the surface, it has moved past G42; in practice, it has swapped in a circular structure with OpenAI.
The story of Cerebras's first failed IPO attempt in 2024 isn't complicated: G42, a UAE-background client, contributed 85% of revenue in the first half of the year. CFIUS initiated a review, forcing the company to withdraw its application.
Returning to the fray a year and a half later, the customer list appears diversified, adding heavyweight names like OpenAI and AWS. But turning to the S-1 from May 2026, the 2025 customer structure looks like this:
- MBZUAI (Mohamed bin Zayed University of Artificial Intelligence): 62%
- G42: 24%
- Combined: 86%
G42 simply ceded its 'weight' to MBZUAI, which is also located in the UAE and is a related party to G42. A single customer, MBZUAI, accounts for 77.9% of accounts receivable.
And OpenAI's so-called 'redemption story' is itself a nested structure. The contract is worth over $20 billion, with OpenAI committing to purchase 750 megawatts of compute power. But the same document also discloses several other things: OpenAI provided Cerebras with a $1 billion loan; OpenAI received warrants for 33 million Cerebras shares at a near-zero exercise price; and OpenAI's Master Relationship Agreement includes exclusivity clauses restricting Cerebras from selling to certain 'named competitors'.
That is, OpenAI is simultaneously Cerebras's customer, lender, soon-to-be shareholder, and, to some extent, strategic controller. An anonymous analyst commenting on a Medium analysis piece said something harsh: When revenue is circular, valuation is circular, and the IPO is to let those creating this revenue cash out, this isn't a market; it's financial engineering.
The wording might be overly sharp, but at the factual level, this is difficult to refute.
Third: Superficially Nvidia's 'challenger', but essentially Nvidia's 'narrow-band niche filler'.
This point is most easily overlooked by the market.
Cerebras's technology is indeed solid. The WSE-3 has 4 trillion transistors, 900,000 AI cores, 44GB of on-chip SRAM, turning an entire wafer into a single chip, bypassing the inter-chip communication bottlenecks all GPU clusters must face. Independent benchmarks from Artificial Analysis show that running Llama 4 Maverick (400 billion parameters), the CS-3 outputs 2500+ tokens per second per user, while Nvidia's flagship DGX B200 outputs about 1000 tokens, and Groq and SambaNova output 549 and 794 respectively.
The numbers don't lie: in the specific scenario of inference, Cerebras holds a generational advantage over GPUs.
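For concreteness, the throughput gaps implied by the Artificial Analysis figures cited above work out as follows (a simple sketch using only the per-user tokens/second numbers quoted in this article):

```python
# Per-user output throughput on Llama 4 Maverick (400B parameters),
# tokens/second, as cited from the Artificial Analysis benchmarks above.
throughput = {
    "Cerebras CS-3": 2500,
    "Nvidia DGX B200": 1000,
    "SambaNova": 794,
    "Groq": 549,
}

cs3 = throughput["Cerebras CS-3"]
for system, tps in throughput.items():
    # Ratio of CS-3 throughput to each competitor's throughput.
    print(f"{system}: {tps} tok/s ({cs3 / tps:.1f}x vs. CS-3 baseline)")
```

On these specific benchmark figures, the CS-3 lead over the B200 is about 2.5x, and larger still over Groq and SambaNova.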
The keyword is 'inference'. Cerebras's own prospectus makes it clear: it excels at latency-sensitive inference workloads. For large model training and general-purpose computing, it neither has the capability nor the intention to challenge Nvidia. The CUDA ecosystem, accumulated over nearly 20 years since 2007—the toolchains, developer community, third-party libraries for model training—all of this remains within Nvidia's moat.
More crucially, the market isn't standing still. Nvidia's Vera Rubin architecture announced at GTC 2026 has 336 billion transistors, with performance claimed to be another 5x leap over Blackwell; AMD's MI400 has caught up to 320 billion transistors; Google TPU v6, Amazon Trainium 3, Microsoft Maia 2—hyperscale vendors are all making their own chips. Nvidia spent over $18 billion on R&D in FY2025, acquired AI inference startup Groq's assets for $20 billion last December, and invested another $4 billion in two photonics technology companies in March.
So a more accurate description is: Cerebras is not aiming to replace Nvidia; it is competing for a differentiated position within Nvidia's 'inference' narrow band. This is a real business, but a $48.8 billion valuation against $510 million in revenue implies a price-to-sales ratio of 95.
Andrew Feldman, 'Selling the Product' for the Third Time
Beyond the numbers, we need to talk about the soul of this company.
Andrew Feldman is an underrated 'serial entrepreneur' in Silicon Valley. He is not a technical genius founder, nor an academic from an ivory tower. He graduated from Stanford Business School, was VP of Marketing at Riverstone Networks (which IPOed in 2001), and VP of Product at Force10 Networks (which was sold to Dell for $800 million in 2011).
In 2007, he co-founded SeaMicro with Gary Lauterbach, focusing on 'energy-efficient servers', clustering many low-power processors with small cores to compete against the mainstream high-power servers with large cores. The idea was very forward-thinking, but the market was too early. In 2012, AMD bought SeaMicro for $334 million. Feldman worked as a VP at AMD for two years before leaving.
Then he started Cerebras.
Looking at Feldman's path as a whole reveals something interesting: he is not a 'chip designer'; he is an 'alternative bettor on compute infrastructure'. SeaMicro was a bet on 'small cores beating big cores'—he was half wrong. AMD bought it back then to use its Freedom Fabric interconnect technology for its own server CPU platform, but that path didn't pan out, and the SeaMicro brand later quietly faded away. Cerebras is a bet on 'big chips beating small chips', exactly the opposite proposition of SeaMicro.
In a sense, Feldman is doing the same thing: identifying the overlooked, seemingly 'impossible' paths in computing architecture that the mainstream ignores, placing a big bet, and then using strong salesmanship to push it to market. With SeaMicro, he managed to hold onto Force10's sales team; what AMD valued was his sales network. With Cerebras, the most crucial thing he did right was securing G42, enabling a hardware company that still had 80% of its 2024 revenue from a single Middle Eastern client to eventually sign a $20 billion contract with OpenAI.
The footnote to this story is: Feldman is a product-selling CEO, not a technology-visionary CEO. He excels at selling a 'crazy-sounding' product to clients willing to pay a premium for differentiation. That is his alpha.
Understanding this is important because it directly determines the judgment of Cerebras's investment value.
So, Is CBRS Worth Investing In?
Layering the three paradoxes above, the answer is actually more complex than a simple 'buy' or 'don't buy'.
If the goal is to catch the first-day IPO pop—with 20x oversubscription, the hottest sector in AI hardware, and a lack of pure-play Nvidia alternative listed stocks—CBRS is highly likely to surge on day one. This is an event-driven short-term trade, requiring little deep judgment.
But if making a 'long-term hold' investment judgment, three things must be considered first:
First, is Cerebras worth a 95x P/S ratio?
CoreWeave IPOed in March this year at a P/S ratio around 15x. Nvidia's current P/S ratio is about 25x. Pricing a company with $510 million in 2025 revenue, 86% customer concentration, and real operating losses at 95x P/S implies the market requires it to grow revenue to $3-4 billion over the next three to four years while achieving sustained profitability.
Can this happen? It hinges on whether the $20 billion OpenAI contract lands on schedule. According to the prospectus, about 15% of the remaining performance obligations will be recognized in 2026 and 2027, roughly $3.5 billion. At that pace, Cerebras's revenue could exceed $2 billion by 2027, potentially compressing the P/S ratio to a reasonable range. But a delay, an OpenAI strategic shift, or the loss of another major customer would instantly make this valuation untenable.
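The valuation math in this section reduces to a few divisions. A back-of-the-envelope sketch, using the valuation and revenue figures cited above (in billions of USD); the $2 billion 2027 revenue is this article's scenario, not a company forecast:

```python
valuation = 48.8       # implied IPO valuation, $B
revenue_2025 = 0.510   # 2025 revenue, $B

ps_2025 = valuation / revenue_2025
print(f"Implied P/S on 2025 revenue: {ps_2025:.1f}x")  # ~95.7x

# Scenario: the OpenAI contract lands and 2027 revenue reaches $2B,
# while the valuation stays unchanged.
revenue_2027 = 2.0     # assumed, per the article's scenario
ps_2027 = valuation / revenue_2027
print(f"P/S on $2B of 2027 revenue: {ps_2027:.1f}x")  # 24.4x, near Nvidia today
```

In other words, the 95x multiple is only 'reasonable' if you assume a roughly fourfold revenue jump in two years with no change in the price.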
Second, how wide is Cerebras's moat?
The architectural advantage of the WSE-3 is real, but how long will it last? Nvidia Vera Rubin, AMD MI400, and Google TPU v6 are all pushing forward. The generational replacement cycle in the chip industry is 18-24 months. If Cerebras slips by one cycle, its technical lead could be erased. While its R&D spending as a percentage of revenue is already high, the absolute amount remains orders of magnitude smaller compared to the giants.
The deeper question: Is the wafer-scale chip path a mainstream route that will be widely adopted, or a 'special forces' unit forever confined to niche scenarios? There's no definitive answer. The optimistic reply: When inference workloads grow from today's 30% to 70%+ of total AI compute, Cerebras's niche becomes the main battlefield. The pessimistic reply: As long as Nvidia boosts Rubin's inference performance, the niche will remain just a niche.
Third, how serious are the governance and geopolitical risks?
The prospectus discloses two easily overlooked but important things:
First, Cerebras employs a Class A/Class B dual-class share structure. Post-IPO, insiders will hold 99.2% of the voting rights. Even if the founding team's ownership of the float drops to 5% in the future, they still control the company. This means external minority shareholders have almost no say in corporate governance.
Second, the company discloses the existence of two 'material weaknesses in internal control over financial reporting'. As an emerging growth company, it can be exempt from SOX 404(b) auditor attestation for up to five years after the IPO. This is a red flag—not a massive one, but worth noting.
Geopolitically, CFIUS cleared the G42 voting rights issue this time, but export controls (permits for shipping CS-2, CS-3, CS-4 to the UAE) remain a long-term variable. The Trump administration's policy direction on AI chip exports to the Middle East is not yet fully stable. Any policy swing could reignite CBRS's tail risks.
Conclusion
As an event, the CBRS IPO is the most noteworthy AI hardware capital event of 2026. It sets the valuation anchor for the AI infrastructure sector in the public markets, and its performance will influence the pricing of every related name.
As a long-term holding, it is a typical 'high-potential, high-uncertainty' bet: wagering on the macro narrative that 'inference is king', the micro execution of 'Cerebras leveraging OpenAI to achieve narrow-band dominance', and the valuation assumption that 'the market remains willing to pay a 95x P/S premium for AI hardware'. All three conditions must hold simultaneously for outsized returns; if any one breaks, the drawdown could be severe.
For institutional investors, the typical approach is not to chase on day one, but to wait for Q3 earnings reports, key customer progress updates, and valuation digestion. For retail investors, treating it as a small, tail-risk allocation within an AI hardware portfolio is acceptable; treating it as an all-in conviction stock requires re-reading the triple paradox above.
More worthy of attention than whether CBRS surges at tomorrow's open is the broader implication of this event: When a company with 86% of its revenue coming from two related UAE entities and still operating at a real loss can be priced by the market at $48.8 billion, this in itself tells everyone the extent of capital frenzy currently present in the AI infrastructure sector.