Perspective: The current AI supercycle will last 15 years, but most are still buying stocks in the first FOMO stage

marsbit · Published on 2026-05-09 · Last updated on 2026-05-09

Abstract

This article outlines a 15-year AI supercycle segmented into four investment stages. It argues that while most investors are still focused on the first stage, smart money is already moving to the third.

**Stage 1: The Foundation (2023-2025) - Priced In.** The semiconductor layer (e.g., NVIDIA, AMD) is complete. Growth continues, but the historic entry opportunity is over as risk/reward has compressed.

**Stage 2: The Build-Out (2025-2027) - In Progress.** This phase builds the necessary physical infrastructure: power/utilities (CEG), cooling (VRT), networking (ANET), and nuclear SMRs (OKLO, SMR). Significant upside remains, but the obvious names have already moved.

**Stage 3: The Asymmetric Bet (2026-2028) - Positioning Window.** AI moves into the physical world. Key areas include robotics/autonomy (Tesla Optimus), space/defense/drones (Rocket Lab, LUNR), and critical materials. This stage offers the best asymmetric risk/reward and is where positioning should occur now.

**Stage 4: The Endgame (2028+) - Software Dominance.** The mega-cap cloud platforms (Microsoft, Alphabet, Amazon, Meta), with their massive capital expenditure, will build the AI software layer and AGI infrastructure, aiming to win the entire cycle.

**Core Conclusion:** The cycle is confirmed in Stage 2. Stage 3 (robotics, space, defense, nuclear SMRs) is where capital is currently rotating for maximum opportunity, while the majority of investors are expected to be 12 months behind this shift.

Author: Rand Group (@cryptorand)

Compiled by: Deep Tide TechFlow

Deep Tide Intro: Crypto KOL Rand Group breaks down the AI supercycle into four stages, from chips to infrastructure to robots to platform software, marking the core targets and risk-reward ratios for each stage. His judgment is: Stage 1 (Semiconductors) is over, Stage 2 (Power/Cooling/Networks) is being priced, and the true asymmetric opportunity lies in Stage 3 — robotics, space, defense, nuclear energy.

The AI supercycle will last 15 years. This is year three.

Most investors are still buying Stage 1 stocks, but smart money is already rotating into Stage 3.

I've broken the entire cycle into four stages, with the most important tickers labeled for each.

The AI supercycle is the biggest investment theme of this generation. Bigger than mobile internet, bigger than cloud computing. A 15-year structural shift that will reshape every industry in the global economy. Hyperscale cloud providers just committed $725 billion in capex for 2026, nearly double last year's. Microsoft, Google, Amazon, Meta — each over $100 billion individually.

This is not speculation.

🔴 Stage 1: Over (2023-2025)

The foundation layer is complete. AMD rose 78% in 2025 and NVDA 39%, while Intel just delivered a blowout Q1 that pushed the Philadelphia Semiconductor Index above 10,000 for the first time. Chips still drive every stage, but the historic entry opportunity is gone and the risk-reward has compressed.

Tickers: NVDA, AMD, ARM, INTC, AVGO, MU, GLW

Sectors: Semiconductors, Memory, Photonics/Optics

Status: Foundation complete, still growing, but priced in.

🟡 Stage 2: Buildout Peak (2025-2027)

The stage most investors are just waking up to. CEG is acquiring Calpine to become the largest private U.S. power producer, with 55 GW of capacity. GEV is up over 200% in a year. VRT is co-designing cooling for NVIDIA's Rubin architecture. GLW is up 74% YTD on fiber demand. Nuclear SMRs are the biggest dark horse: OKLO, SMR, and BWXT are laying direct power lines to data centers.

Still upside, but the most obvious names have moved.

Tickers: CEG, GEV, VRT, VST, TLN, ANET, GLW, MOD, EQIX, OKLO, SMR, BWXT, NNE

Sectors: Power/Grid, Cooling, Networking, Nuclear SMRs

Note: Nuclear SMR is the hidden major opportunity.

🟡 Stage 3: Positioning Window (2026-2028)

The stage where AI leaves the data center and enters the physical world. Most will be late.

Tesla is converting its Fremont factory into an Optimus robot production line — $25 billion capex, targeting mass production in H2 2026. Rocket Lab posted a record $602M revenue, backlog at $1.85B. LUNR up 47% YTD with $943M in contracts. KTOS's Valkyrie drone selected by the Marine Corps.

The positioning window is open now.

Tickers: TSLA, RKLB, LUNR, KTOS, AVAV, PATH, ISRG, MP, FCX, ALB, ASTS

Sectors: Robotics/Autonomy, Space/Defense/Drones, Rare Earths

Judgment: The asymmetric risk-reward is here.

🟢 Stage 4: Endgame (2028+)

The endgame. Microsoft capex $190B, Alphabet $190B, Amazon $200B, Meta $145B. Google Cloud backlog exceeds $460B. They are building AI software dominance and AGI infrastructure. Quantum computing is early, but IONQ and D-Wave are laying the groundwork.
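As a quick consistency check on the figures above, the four stated capex numbers sum exactly to the $725 billion hyperscaler commitment cited at the start of the article. A minimal sketch (figures in billions of USD, taken directly from the text):

```python
# Projected 2026 AI capex by hyperscaler, in billions of USD (figures from the article)
capex_billions = {
    "Microsoft": 190,
    "Alphabet": 190,
    "Amazon": 200,
    "Meta": 145,
}

total = sum(capex_billions.values())
print(f"Total hyperscaler capex: ${total}B")  # → $725B, matching the committed figure cited earlier
```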

The platforms controlling the software layer win the entire supercycle.

Tickers: MSFT, GOOGL, AMZN, META, ORCL, IONQ

Sectors: AI Software Dominance, AGI Infrastructure

Note: A decade-long thesis.

Strategy: Buy the dips.

Key Conclusions

  • Stage 2 is confirmed (hyperscale $725B capex)
  • Stage 3 is where smart money is positioning — robotics, space, defense, nuclear
  • SMR is the core trade from 2026 to 2028
  • Most will rotate into these names 12 months late

A 15-year supercycle. Not a single trade. Stage 1 is over, Stage 2 is being priced, Stage 3 is where you should be.

Related Questions

Q: According to the article, what is the author's view on the duration and current stage of the AI supercycle?

A: The author believes the AI supercycle will last 15 years and is currently in its third year. The first stage (2023-2025) is already over, the second stage (2025-2027) is in progress, and the third stage (2026-2028) is where the most asymmetric opportunities lie.

Q: What sectors or stages does the author identify as having the best asymmetric risk/reward opportunity right now?

A: The author identifies the third stage as having the best asymmetric risk/reward opportunity. This stage includes robotics/autonomous systems, space/defense/drones, and rare earths, with specific mentions of companies like Tesla, Rocket Lab, and Intuitive Machines.

Q: Which specific sector within the build-out (second stage) is highlighted as a major hidden opportunity?

A: Within the second stage (the build-out), the author highlights nuclear energy, specifically Small Modular Reactors (SMRs), as the major hidden opportunity. Companies mentioned include Oklo, NuScale Power (SMR), and BWX Technologies.

Q: What key metric is cited as evidence confirming the transition to the second stage of the AI supercycle?

A: The author cites the $725 billion in committed capital expenditure for 2026 by hyperscale cloud providers (Microsoft, Google, Amazon, Meta) as the key metric confirming the transition to the second stage. This amount is nearly double that of the previous year.

Q: What does the author suggest is the strategy for investing in the 'Endgame' (fourth stage) companies?

A: For the 'Endgame' or fourth-stage companies (like Microsoft, Alphabet, Amazon, Meta), which control the AI software platform, the author's suggested strategy is to 'buy the dips,' indicating a long-term, patient accumulation approach.

Related Reads

Gensyn AI: Don't Let AI Repeat the Mistakes of the Internet

In recent months, the rapid growth of the AI industry has attracted significant talent from the crypto sector. A persistent question among researchers intersecting both fields is whether blockchain can become a foundational part of AI infrastructure. While many previous AI and Crypto projects focused on application layers (like AI Agents, on-chain reasoning, data markets, and compute rentals), few achieved viable commercial models. Gensyn differentiates itself by targeting the most critical and expensive layer of AI: model training. Gensyn aims to organize globally distributed GPU resources into an open AI training network. Developers can submit training tasks, nodes provide computational power, and the network verifies results while distributing incentives. The core issue addressed is not decentralization for its own sake, but the increasing centralization of compute power among tech giants. In the era of large models, access to GPUs (like the H100) has become a decisive bottleneck, dictating the pace of AI development. Major AI companies are heavily dependent on large cloud providers for compute resources. Gensyn's approach is significant for several reasons: 1) It operates at the core infrastructure layer (model training), the most resource-intensive and technically demanding part of the AI value chain. 2) It proposes a more open, collaborative model for compute, potentially increasing resource utilization by dynamically pooling idle GPUs, similar to early cloud computing logic. 3) Its technical moat lies in solving complex challenges like verifying training results, ensuring node honesty, and maintaining reliability in a distributed environment—making it more of a deep-tech infrastructure company. 4) It targets a validated, high-growth market with genuine demand, rather than pursuing blockchain integration without purpose. Ultimately, the boundaries between Crypto and AI are blurring. 
AI requires global resource coordination, incentive mechanisms, and collaborative systems—areas where crypto-native solutions excel. Gensyn represents a step toward making advanced training capabilities more accessible and collaborative, moving beyond a niche controlled by a few giants. If successful, it could evolve into a fundamental piece of AI infrastructure, where the most enduring value in the AI era is often created.


Why is China's AI Developing So Fast? The Answer Lies Inside the Labs

A US researcher's visit to China's top AI labs reveals distinct cultural and organizational factors driving China's rapid AI development. While talent, data, and compute are similar to the West, Chinese labs excel through a pragmatic, execution-focused culture: less emphasis on individual stardom and conceptual debate, and more on teamwork, engineering optimization, and mastering the full tech stack. A key advantage is the integration of young students and researchers who approach model-building with fresh perspectives and low ego, prioritizing collective progress over personal credit. This contrasts with the US culture of self-promotion and "star scientist" narratives. Chinese labs also exhibit a strong "build, don't buy" mentality, preferring to develop core capabilities—like data pipelines and environments—in-house rather than relying on external services. The ecosystem feels more collaborative than tribal, with mutual respect among labs. While government support exists, its scale is unclear, and technical decisions appear driven by labs, not state mandates. Chinese companies across sectors, from platforms to consumer tech, are building their own foundational models to control their tech destiny, reflecting a broader cultural drive for technological sovereignty. Demand for AI is emerging, with spending patterns potentially mirroring cloud infrastructure more than traditional SaaS. Despite challenges like a less mature data industry and GPU shortages, Chinese labs are propelled by vast talent, rapid iteration, and deep integration with the open-source community. The competition is evolving beyond a pure model race into a contest of organizational execution, developer ecosystems, and industrial pragmatism.


3 Years, 5 Times: The Rebirth of a Century-Old Glass Factory

Corning, a 175-year-old glass company, is experiencing a dramatic revival as a key player in AI infrastructure, driven by surging demand for high-performance optical fiber in data centers. AI data centers require vastly more fiber than traditional ones—5 to 10 times as much per rack—to handle high-speed data transmission between GPUs. This structural demand shift, coupled with supply constraints from the lengthy expansion cycle for fiber preforms, has created a significant supply-demand gap. Nvidia has invested in Corning, along with Lumentum and Coherent, in a $4.5 billion total commitment to secure the optical supply chain for AI. Corning's competitive edge lies in its expertise in producing ultra-low-loss, high-density, and bend-resistant specialty fiber, which is critical for 800G+ and future 1.6T data rates. Its deep involvement in co-packaged optics (CPO) with partners like Nvidia further solidifies its position. While not the largest fiber manufacturer globally, Corning's revenue from enterprise/data center clients now exceeds 40% of its optical communications sales, and it has secured multi-year supply agreements with major hyperscalers including Meta and Nvidia. Financially, Corning's optical communications revenue has surged, doubling from $1.3 billion in 2023 to over $3 billion in 2025. Its stock price has risen nearly 6-fold since late 2023. Key future catalysts include the rollout of Nvidia's CPO products and the scale of undisclosed customer agreements. However, risks include high current valuations and potential disruption from next-generation technologies like hollow-core fiber. The company's long-term bet on light over electricity, maintained even through the telecom bubble crash, is now being validated by the AI boom.

