When Dreamers Start Doing the Math: How Reality Interrupted OpenAI's $500 Billion Stargate

Bitpush · Published 2026-02-24 · Updated 2026-02-24

Summary

OpenAI's ambitious $500 billion "Stargate" data center project has stalled due to financing challenges and internal disagreements with partners SoftBank and Oracle. After lenders rejected OpenAI's request for billions to build its own data centers—citing unproven business models and cash burn concerns—the company was forced to scale back its plans. It has now shifted from owning infrastructure to leasing capacity, signing major deals with Oracle and others. OpenAI has also slashed its 2030 computing expenditure target by 57%, from $1.4 trillion to $600 billion, and projects $280 billion in revenue by 2030. Despite having over 900 million users, only about 5% pay for ChatGPT, and competition is intensifying. Facing high costs and slower monetization, OpenAI is prioritizing financial sustainability over grand visions.

Author: Ada

Original title: Waking Moment: When Dreamer OpenAI Starts Doing the Math


A company valued at hundreds of billions of dollars wants to borrow billions to build houses.

The lenders say: No.

The reason is straightforward: your business model hasn't been validated, and analysts predict you could run out of cash by mid-2027. How will you repay?

This isn't a funding mishap of some startup. This is OpenAI's real situation in 2025.

According to an exclusive report by The Information, OpenAI sent executives across the U.S. to scout locations, planning to build its own data centers and attempting to raise billions to start construction. They were turned down by lending institutions. Tom's Hardware, citing analyst judgments, reported that OpenAI could run out of cash as early as mid-2027.

A year ago, Sam Altman stood beside the podium at the White House and announced the Stargate plan: $500 billion, four years, building the world's largest AI data center network with SoftBank and Oracle. Trump called it "the largest AI infrastructure project in history."

A year later, this joint venture hasn't formed a team, hasn't developed any data centers, and the three partners haven't even agreed on who is responsible for what. What OpenAI wanted to build itself, it couldn't.

So, OpenAI started doing the math.

The $500 Billion Dream Shattered on "Who's in Charge"

The Information's report reveals a story that has been festering behind the spotlight for a full year.

Weeks after the White House press conference, Stargate was paralyzed: no one was in charge, and no coordination mechanism existed. OpenAI, Oracle, and SoftBank were locked in repeated tugs-of-war over who builds, who manages, and how to split the money.

OpenAI wanted to build its own data centers; this was its initial obsession. The logic made sense: Renting computing power long-term is too expensive; only by building itself can it control its destiny.

But lenders didn't see it that way.

A company that burned through $2.5 billion in cash in half a year, and is projected to burn $8.5 billion for the full year, comes asking to borrow billions to build data centers? Lenders don't look at your pitch deck; they look at your cash flow. And OpenAI itself predicts it won't achieve positive cash flow until 2029 at the earliest.

It's like someone with no income walking into a bank to apply for a loan to build a villa. The bank's first question is: how will you repay? OpenAI had no answer.

The self-build route was blocked. OpenAI was forced back to the negotiating table to continue talks with its Stargate partners.

But the negotiations were equally tough. SoftBank has several large data center projects in Texas; OpenAI wanted to take one of them as its first facility. SoftBank disagreed, wanting to retain control. OpenAI's team flew to Japan multiple times in September and October to negotiate face-to-face with Masayoshi Son.

The final negotiation result: OpenAI signs a long-term lease, controls the design; SoftBank's SB Energy is responsible for development and ownership.

In other words, OpenAI went from wanting to be a landlord to becoming a tenant.

$800 Billion Evaporated

If the internal chaos of Stargate was a hidden wound, the next number is a public self-correction.

According to CNBC, OpenAI has lowered its total computing power expenditure target before 2030 to about $600 billion, accompanied by a clearer timeline and revenue forecasts. It expects revenue to exceed $280 billion by 2030, split evenly between consumer and enterprise segments.

Cut from $1.4 trillion to $600 billion, a 57% reduction.

The official statement: "To better align spending with revenue growth."

The real meaning: Investors aren't buying it.

The previous number was more like a wish list; the $600 billion is at least a number that can be modeled. But even so, to achieve over $280 billion in revenue by 2030, it would require a compound annual growth rate of over 50% for five consecutive years. Who can guarantee that?
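That growth requirement is easy to verify with back-of-the-envelope arithmetic. A minimal sketch, using only the article's own figures ($13.1 billion in 2025 revenue, a $280 billion 2030 target):

```python
# Sanity check on the growth claim (illustrative; figures from the article):
# to grow from $13.1B (2025 revenue) to $280B (2030 target) in five years,
# solve 13.1 * (1 + r)**5 = 280 for the required compound annual growth rate.
revenue_2025 = 13.1   # billions of dollars, per the article
target_2030 = 280.0   # billions of dollars, per the article
years = 5

cagr = (target_2030 / revenue_2025) ** (1 / years) - 1
print(f"Required CAGR: {cagr:.0%}")  # comes out well above the 50% mark
```

The required rate actually lands closer to 85% per year, which only underlines the article's point: sustaining even 50% for five consecutive years would be extraordinary.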

OpenAI's 2025 revenue was $13.1 billion, burning $8 billion. It's still far from profitable. The company itself expects cash flow to turn positive only in 2029. Before that, cumulative losses could reach $115 billion.

This is the sound of waking up.

It's not that Altman doesn't want to spend $1.4 trillion. It's that reality is telling him: You can't afford it.

The Ledger Can't Support the Dream

Why did OpenAI have to change from a dreamer to an accountant? Not because it made some strategic mistake, but because three cold, hard facts arrived simultaneously.

First, money is going out much faster than it's coming in.

OpenAI's first-half 2025 revenue was $4.3 billion, with $2.5 billion of cash burned. Full-year revenue was $13.1 billion, with $8 billion burned. According to Fortune, citing investor documents, the company expects losses to widen annually, with operating losses potentially reaching $74 billion by 2028 and cash flow not turning positive until 2029 or 2030. Cumulative losses are projected at $115 billion.

OpenAI today is spending money several times faster than it makes it. Mathematically, the revenue and spending curves must cross eventually; the only question is whether they cross in 2029 or never.

Second, it is unclear whether computing efficiency gains can offset scale expansion. OpenAI's "compute margin" (revenue minus model-running costs, as a share of revenue) improved from 52% in October 2024 to 70% in October 2025, thanks to algorithm optimization and better hardware utilization. But every time a larger model or a more compute-intensive feature (like video generation) launches, those efficiency gains get eaten up.
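To make the "compute margin" definition concrete, here is a minimal sketch (the percentages are the article's; the dollar figures are hypothetical round numbers for illustration):

```python
def compute_margin(revenue: float, model_cost: float) -> float:
    """Share of revenue left after model-running costs."""
    return (revenue - model_cost) / revenue

# Hypothetical round numbers: per $100 of revenue, model-running cost
# falling from $48 to $30 reproduces the article's 52% -> 70% improvement.
print(compute_margin(100.0, 48.0))  # 0.52
print(compute_margin(100.0, 30.0))  # 0.7
```

Seen this way, the improvement means serving costs per dollar of revenue fell by roughly a third in a year; the question is whether new, heavier workloads push them back up.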

Third, paid conversion rate is stuck.

ChatGPT's weekly active users exceed 900 million. But according to Incremys data, the paid conversion rate is only about 5%, with over 95% of users on the free tier. OpenAI has started testing ads in the free version. That move is itself a signal: when you start selling user attention, it means you've hit the ceiling of the subscription model.

Meanwhile, competitors are taking users with less money. According to Similarweb data, ChatGPT's global traffic share dropped from 87% to about 65% within a year. Google Gemini, relying on Android default integration and Workspace embedding, surged from 5% to 21%, not by having a stronger model, but by distribution dominance. Anthropic's Claude, with 2% traffic share, has the highest user engagement (34.7 minutes daily average), taking a high-end enterprise route, burning money at a fraction of OpenAI's rate.

"ChatGPT created this category, but when alternatives emerged, users naturally dispersed," said Tom Grant, Vice President of Research at Apptopia.

And competitors are doing the same thing with less money. DeepSeek is shaking up the market with open-source models and rock-bottom costs. Google is winning on distribution. Anthropic is acquiring high-value customers with a focused strategy. If AI models converge in capability, what ultimately decides the market is not whose model is strongest, but whose ecosystem is deepest and whose costs are lowest.

OpenAI is trying to fight three wars simultaneously: the model race, the infrastructure race, and the commercialization race. But historically, no company has won on all three fronts.

Altman's Plan B

The dream is shattered, but Altman hasn't stopped.

He did something all business textbooks would recommend but few dreamers are willing to do: abandon the obsession and survive pragmatically.

The dream of building its own data centers was abandoned. The replacement strategy is to sign large-scale contracts outside the Stargate framework: a $30 billion annual compute procurement agreement with Oracle, deeper cooperation with CoreWeave, and even gap-filling capacity from AWS and Google Cloud. Chip supply is also diversifying beyond Nvidia, bringing in AMD and the startup Cerebras.

OpenAI's CFO Sarah Friar said publicly at the Davos Forum that the company is intentionally using partners to protect its balance sheet.

This statement would have been unthinkable a year ago. Back then, Altman talked about trillion-dollar infrastructure commitments, 10GW of compute capacity, artificial general intelligence that changes human destiny. Now his CFO is talking about "protecting the balance sheet."

But OpenAI's fundraising scale is still astonishing; the latest round could exceed $100 billion. According to Bloomberg, OpenAI is close to completing the first phase of a new funding round, with the company's overall valuation, including the financing, potentially exceeding $850 billion. Expected participants include Amazon (a reported $50 billion), SoftBank ($30 billion), Nvidia ($20 billion), and Microsoft.

But note who these investors are: chip suppliers, cloud platforms, and strategic investors that require OpenAI to use their services. This isn't venture capital betting on a dream; it's upstream and downstream supply chain players locking in a big customer.

Investing in OpenAI used to be buying a lottery ticket; now investing in OpenAI is signing a supply contract. The nature has completely changed.

Gravity

Let's pull the lens back to Stargate.

A year ago, on the stage of that White House press conference, Sam Altman stood in the center and announced the $500 billion "Stargate" plan.

A year later, the joint venture at the heart of this plan is a mess of arguments. OpenAI bypassed the joint venture framework it initiated and signed a separate agreement with Oracle. The computing target was missed: of the planned 10GW, only 7.5GW materialized. Spending expectations were cut from $1.4 trillion to $600 billion.

This isn't a story of failure. OpenAI hasn't fallen; it's still raising money, still growing, users are still over 900 million.

But it is a story of waking up.

From "wanting to build the world's largest data center empire" to "ensuring survival first, then fighting using others' money and infrastructure." From wanting to be a landlord to becoming a tenant. From a dreamer to an accountant.

Faced with the stalled progress of the "Stargate" project, Elon Musk coldly tossed out a comment on X: "Hardware is hard."

Harsh as it is, it points to a reality all AI companies will eventually face. At this stage of the computing arms race, the real threshold is no longer who trained the strongest model. It's who can put gigawatt-scale infrastructure in place in the physical world without burning themselves out.

Altman chose not to burn himself out. This might be the least sexy, but most correct decision he has made.

As for that $500 billion Stargate dream, it's not dead, but it's no longer what it was a year ago. It has changed from a narrative about changing human destiny to a balance sheet that needs to be checked line by line.


Twitter: https://twitter.com/BitpushNewsCN

Bitpush TG Discussion Group: https://t.me/BitPushCommunity

Bitpush TG Subscription: https://t.me/bitpush

Original link: https://www.bitpush.news/articles/7614109

Related Questions

Q: What was OpenAI's ambitious 'Stargate' project, and why did it face significant setbacks according to the article?

A: OpenAI's 'Stargate' was a $500 billion plan announced in early 2025 to build the world's largest AI data center network in partnership with SoftBank and Oracle. It faced setbacks because the partners couldn't agree on who would build, manage, and fund it. Furthermore, lenders refused to finance OpenAI's plan to build its own data centers due to its unproven business model and high cash burn rate, forcing the company to scale back its ambitions and become a tenant rather than an owner of such infrastructure.

Q: What major financial adjustment did OpenAI make to its computing infrastructure spending plans, and what was the reason for this change?

A: OpenAI significantly reduced its pre-2030 computing infrastructure spending target from $1.4 trillion to approximately $600 billion, a 57% decrease. The official reason was to 'better align expenditure with revenue growth.' The real reason, as stated in the article, was that investors did not accept the original, more ambitious figure, and the company needed a more realistic financial model that it could sustain.

Q: What are the three key financial and market challenges that forced OpenAI to become more pragmatic, as outlined in the article?

A: The three key challenges are: 1. Money is going out much faster than it is coming in, with a high cash burn rate and projected cumulative losses of $115 billion before potential cash flow positivity in 2029. 2. Uncertainty over whether gains in computing efficiency can keep up with the costs of scaling to larger, more powerful models. 3. A stalled paid conversion rate for ChatGPT (around 5%), increasing competition eroding its market share, and rivals achieving similar results with lower costs.

Q: How did OpenAI's strategy for acquiring computing power change after its 'Stargate' plans and loan rejections?

A: After its plan to build its own data centers was rejected by lenders and the 'Stargate' partnership stalled, OpenAI shifted to a more pragmatic strategy. It abandoned its dream of ownership and instead signed massive procurement agreements with partners like Oracle (a $30 billion annual compute deal), CoreWeave, and Google Cloud. It also diversified its chip suppliers beyond NVIDIA to include AMD and Cerebras. This strategy was described as using partners' money and infrastructure to protect its own balance sheet.

Q: What does the article suggest is the nature of OpenAI's latest potential funding round, and how does it differ from previous investments?

A: The article suggests that OpenAI's latest potential funding round, which could exceed $100 billion and value the company at over $850 billion, has a different nature than previous investments. Potential investors like Amazon, SoftBank, NVIDIA, and Microsoft are largely chip suppliers, cloud platforms, and strategic investors requiring OpenAI to use their services. This is characterized not as venture capital betting on a dream, but more as supply chain partners locking in a major client, signifying a shift from a high-risk gamble to a strategic commercial relationship.
