The Escalation of the Computing Power War: When 'Crypto Mines' Become 'AI Factories', A New Arena for Energy Arbitrage

Marsbit · Published 2026-03-04 · Last updated 2026-03-04

Abstract

The computing landscape has shifted dramatically by early 2026, with Bitcoin mining operations transforming into essential "AI factories." This transition is driven by a global scarcity of power, not just chips, turning pre-existing energized land into a monopolistic infrastructure asset. Former miners, now infrastructure capitalists, leverage their secured power and land—a critical advantage given the 5–7 year wait for new substations. Building AI-ready facilities has become capital-intensive, costing $8–11 million per megawatt, creating a clear divide between scaled leaders like Iris Energy (2,910 MW portfolio) and execution-focused firms like TeraWulf and Hut 8, which have secured multi-billion dollar contracts. A key shift is the "hyperscaler guarantee" model, in which tech giants like Google and Microsoft provide credit backing, transforming risky miner leases into investment-grade contracts. This enables favorable debt financing at ~7.125% interest from major banks. Technologically, high-density liquid cooling is mandatory for platforms like NVIDIA's Blackwell, which consumes 120 kW per rack. Innovations like Shanghai's submerged data centers (PUE 1.15) use seawater cooling, reducing power use by 40–60%. The Blackwell supply backlog acts as a moat, locking out late entrants; companies like CoreWeave, with early chip orders, dominate. The industry has matured into an energy-transition play, treating computation—whether Bitcoin or AI—as an interchangeable output of power assets, allocated on demand.

Author: Eli5DeFi

Compiled by: AididiaoJP, Foresight News

Seen in the rearview mirror, the Bitcoin mining industry of 2024 looked like a band of survivalists trudging through hard times, coping with the Bitcoin halving while enduring the lingering chill of the "crypto winter".

But by early 2026, that impression was completely overturned. The industry had undergone a fundamental transformation, evolving from speculative outposts of computing power into the cornerstone of a new era—the "AI factory".

Driving this change was a brutal battle for resources.

As global demand for AI computing power reached a fever pitch, the bottleneck had shifted from "not enough chips" to "not enough power". High-performance computing requires something that cannot be downloaded or manufactured quickly: land that is already connected to the electrical grid.

Those Bitcoin miners, once mocked as volatile and unreliable, successfully transformed the land and power resources they secured around 2021 into the infrastructure monopoly capital of 2026, reinventing themselves as indispensable "landlords" in the AI gold rush.

The Great Compute Flip

In the landscape of 2026, electricity became the new scarce resource.

The primary "physical moat" protecting the industry's winners was the utility power interconnection point. With new substation construction now taking 5 to 7 years, already-energized sites—the old mining farms connected to the grid—became the only places that could meet the immediate demands of cutting-edge AI model training.

However, the barrier to entry had shifted from simple "land grabs" to capital-intensive fortresses. Due to high-density liquid cooling requirements and a global transformer shortage, the cost of building an AI-ready facility had skyrocketed to approximately $8 to $11 million per megawatt. This high capital expenditure threshold drew a clear line between the "execution leaders" and other players:

  • Iris Energy (IREN): The industry scale leader, valued at $14 billion. It possesses a 2,910-megawatt power and land portfolio, underpinning its expanding "AI factory" footprint.
  • Riot Platforms: Holds 1.7 gigawatts of approved power capacity. Riot transformed its "Texas Triangle" assets into strategic hosting centers, recently signing a landmark lease with AMD.
  • TeraWulf and Hut 8: Recognized execution leaders. These companies secured contracts worth $6.7 billion and $7 billion respectively, successfully converting mining farms into high-value, investment-grade AI assets.
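The capex figures above translate directly into the scale of the divide. A minimal sketch, using the article's $8–11 million per megawatt range; the 300 MW campus size is a hypothetical example for illustration, not a figure from the article:

```python
# Rough build-cost range for an AI-ready facility, using the article's
# $8-11 million per megawatt figure. The 300 MW campus is hypothetical.
def build_cost_range(megawatts, low_per_mw=8e6, high_per_mw=11e6):
    """Return (low, high) total build cost in dollars."""
    return megawatts * low_per_mw, megawatts * high_per_mw

low, high = build_cost_range(300)
print(f"300 MW campus: ${low/1e9:.1f}B to ${high/1e9:.1f}B")
```

At that price, even a mid-sized campus is a multi-billion-dollar project, which is why the barrier to entry is capital rather than land.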

"Hyperscaler Guarantees" — The End of Crypto Volatility?

Perhaps the most profound change was the structural re-rating of the business model, enabled by "credit enhancement".

In the past, top financial institutions were reluctant to lend to miners due to Bitcoin's high price volatility. This changed with the advent of the "hyperscaler guarantee".

Through "take-or-pay" agreements, industry giants like Google and Microsoft now stand behind the lease payments owed to these former miners.

Thus, what was once a high-risk lease with a miner became a low-risk contract backed by a tech giant's credit. The result: the industry gained access to the bond market at favorable interest rates of around 7.125%. Companies like Cipher Mining and Hut 8 could obtain non-dilutive project financing covering up to 85% of project costs from J.P. Morgan and Goldman Sachs. This "landlord" model with its "take-or-pay" clauses attracted massive capital from institutions like Vanguard, Oaktree, and Citadel.
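The structure described above can be sketched numerically. This uses the article's 85% debt share and ~7.125% rate; the $1 billion project size is a hypothetical placeholder:

```python
# Illustration of the financing structure described in the article:
# up to 85% of project cost funded with debt at ~7.125% interest.
# The $1B total project cost is a hypothetical example.
def project_financing(total_cost, debt_share=0.85, rate=0.07125):
    """Return (debt, equity, annual_interest) in dollars."""
    debt = total_cost * debt_share
    equity = total_cost - debt
    annual_interest = debt * rate
    return debt, equity, annual_interest

debt, equity, interest = project_financing(1_000_000_000)
print(f"Debt: ${debt/1e6:.0f}M, Equity: ${equity/1e6:.0f}M, "
      f"Annual interest: ${interest/1e6:.1f}M")
```

The point of the guarantee is in the denominator of the risk calculation: the same cash flows, underwritten by hyperscaler credit instead of Bitcoin's price, support far more leverage at a far lower rate.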

Blackwell Reality and Subsea Data Centers

The technical requirements of AI in 2026 rendered the old air-cooled mining rig designs not only obsolete but entirely unusable for deploying high-density AI clusters.

The NVIDIA Blackwell GB200 NVL72 platform, consuming 120 kilowatts per rack, forced the industry to adopt direct-to-chip liquid cooling.

To address both cooling and land scarcity simultaneously, the industry turned its gaze to the "blue economy". Shanghai's Lingang 2.0 project is a prime example of a commercial-scale subsea data center.

  • Technical Specs: The facility achieves a Power Usage Effectiveness (PUE) of 1.15, well below China's national target of 1.25. It utilizes seawater as the primary cooling source, reducing total power consumption by 40–60%.
  • Precision Deployment: Using GPS-guided vessels like the "Sanhang Fengfan", these 1,300-ton subsea modules can be submerged and positioned with pinpoint precision, powered by offshore wind farms, completely freeing them from terrestrial resource constraints.
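The rack-power and PUE figures above combine into simple facility-level math. A back-of-envelope sketch using the article's 120 kW per GB200 NVL72 rack and its 1.15 vs. 1.25 PUE comparison; the 100-rack deployment size is a hypothetical example:

```python
# Back-of-envelope power math: total facility draw is IT load times PUE.
# 120 kW/rack and PUE values are from the article; 100 racks is hypothetical.
def facility_power_mw(racks, kw_per_rack=120, pue=1.15):
    """Total facility power draw in megawatts."""
    it_load_mw = racks * kw_per_rack / 1000
    return it_load_mw * pue

print(f"PUE 1.15: {facility_power_mw(100):.1f} MW total")
print(f"PUE 1.25: {facility_power_mw(100, pue=1.25):.1f} MW total")
```

For a 12 MW IT load, the gap between the two PUE values is more than a megawatt of continuous overhead, which is why seawater cooling translates directly into grid headroom.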

The "Blackwell Moat" and Hardware Holders

By 2026, a "supply chain wall" had solidified the industry's hierarchy. With NVIDIA's Blackwell architecture chips sold out until mid-2026, a company's orders placed back in 2024 became its competitive moat today.

Without chips, power is useless; without power, chips are just bricks. The winners were those who locked down both power and chips early.

CoreWeave's preparation for a public listing with a $35 billion valuation is backed by its massive hardware orders, including a $22.4 billion commitment from OpenAI. Latecomers who failed to secure chips during the 2024 window were essentially locked out of the primary market for AI infrastructure.

"A backlog of 3.6 million units for the Blackwell architecture effectively locks latecomers out of the primary AI infrastructure market, a situation unlikely to change for the foreseeable future." — Jensen Huang, CEO of NVIDIA, 2026.

Beyond the Miner

The transition from "Bitcoin factory" to "AI digital infrastructure hub" marks the maturation of a once-marginal industry and its integration as a key component of global industrial policy.

The isolated, pure-play mining model is nearing its end. It is being replaced by industrial-scale energy-transformation companies that view computing—whether Bitcoin's SHA-256 hashing or large-language-model training—as an interchangeable output of their core power assets, allocated on demand.

As these gigawatt-scale "AI factories" become permanent fixtures of the power grid, we are compelled to ask:

Can a pure mining model without AI diversification survive given the vast disparity in revenue per megawatt? More importantly, how will global grids adapt as these facilities shift from flexible power consumers (mines) to stable, baseload-demanding AI "foundation loads"? At that point, data centers will no longer be mere power customers but designers and architects of the grid itself.
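The "revenue per megawatt" question can be made concrete with a toy comparison. All dollar figures below are purely illustrative placeholders, not figures from the article; only the structure of the calculation (hourly revenue per MW, scaled by uptime) is the point:

```python
# Illustrative revenue-per-megawatt comparison. The $100/hr and $900/hr
# rates are hypothetical placeholders, not data from the article.
def annual_revenue_per_mw(hourly_revenue_per_mw, utilization=0.95):
    """Annual revenue per MW, scaled by an assumed uptime fraction."""
    return hourly_revenue_per_mw * 24 * 365 * utilization

mining = annual_revenue_per_mw(100)      # hypothetical mining rate
ai_hosting = annual_revenue_per_mw(900)  # hypothetical AI hosting rate
print(f"Mining: ${mining/1e6:.2f}M/MW/yr vs "
      f"AI hosting: ${ai_hosting/1e6:.2f}M/MW/yr "
      f"({ai_hosting/mining:.0f}x)")
```

Under any plausible set of inputs, a persistent multiple of this kind is what makes the pure-play mining question existential rather than rhetorical.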

The machines have changed, but this high-stakes game of energy arbitrage has only just begun.

Related Questions

Q: What is the main transformation the Bitcoin mining industry has undergone by early 2026, according to the article?

A: The Bitcoin mining industry has transformed from a volatile, speculative 'crypto mining' operation into a foundational 'AI factory,' repurposing its secured land and power resources into critical infrastructure for the AI boom.

Q: What has become the new scarce resource and primary 'moat' for industry leaders in 2026?

A: Electricity has become the new scarce resource. The primary 'physical moat' is the utility power interconnection point; with new substation construction taking 5–7 years, existing powered sites from old mines are the only viable locations for immediate AI demand.

Q: How has the structural re-rating of the business model been achieved, and what key financial instrument enables it?

A: The re-rating has been achieved through 'credit enhancement' via 'hyperscaler guarantee' agreements. Tech giants like Google and Microsoft provide financial guarantees for the rent owed to former miners, turning risky contracts into low-risk credit contracts. This allows the industry to access project financing at favorable rates (e.g., ~7.125%) with non-dilutive capital from major institutions.

Q: What major technological shift was forced by the power requirements of NVIDIA's Blackwell platform, and what innovative solution addresses cooling and land scarcity?

A: The NVIDIA Blackwell GB200 NVL72 platform's power draw of 120 kW per rack forced a shift to direct-to-chip liquid cooling. To address cooling and land scarcity, the industry is looking to the 'blue economy,' with commercial-scale underwater data centers like Shanghai's Lingang 2.0 project, which uses seawater for cooling and achieves a power usage effectiveness (PUE) of 1.15.

Q: What is the 'Blackwell Moat,' and how does it create a barrier to entry for new players in the AI infrastructure market?

A: The 'Blackwell Moat' refers to the competitive barrier created by the massive backlog of orders for NVIDIA's Blackwell architecture chips (3.6 million units), which are sold out until mid-2026. Companies that secured chips and power early (e.g., in the 2024 window) are the winners, while new entrants are effectively locked out of the primary market for AI infrastructure for the foreseeable future.
