Editor's Note: When people talk about AI, attention is often focused on the most obvious aspects: chatbots, AI assistants, and various new applications. However, behind these products, a deeper industrial restructuring is taking place. From power and chips to data centers, and then to models and applications, AI is actually a technology stack composed of multiple layers of infrastructure, and the flow of capital and profits is far more complex than it appears on the surface.
This article systematically examines this value chain from the perspective of the "Five-Layer AI Structure": why trillions of dollars are flowing into energy, chips, and cloud infrastructure; why model companies are still burning massive amounts of cash despite rapid growth; and where the real value may first concentrate in this technological revolution.
By comparing AI with historical cycles like the electricity revolution and the internet infrastructure build-out, the author attempts to answer a key question: in this technological wave that may reshape the global industrial structure, where is capital flowing, and how can ordinary people participate in this round of AI-driven wealth creation?
The following is the original text:
Most people think AI is just a chatbot.
I can understand that thought. You open ChatGPT, ask it to revise an email, and it does so instantly. It feels like magic. So you close the tab, thinking you understand what AI is all about. But this is like using a Visa credit card at a restaurant once and then thinking you understand how Visa makes money. You just used the product; you didn't see the system behind it.
For most of last year, I was trying to figure out where the real profits in AI are actually flowing. And the slightly awkward truth is: it took me a long time to realize I was looking at the wrong layer. I kept staring at ChatGPT, Claude, Gemini—the things you can directly interact with.
Meanwhile, $700 billion was quietly flowing into another set of infrastructure I couldn't even name: chips I'd never heard of, packaging technology acronyms that sound made up, cooling systems, power plants. In Texas, Iowa, and Hyderabad, vast amounts of concrete are being poured to build data centers.
A year ago, almost no one around me was talking about these things. Now, everyone is.
This article will be quite long. If you don't have time to read it all, you can save it for later.
I want to take you through the entire AI value chain: starting with the electricity that powers data centers, all the way to the applications on your phone.
And I'll explain it in a way that even if you've never read a public company's annual report in your life, you can understand. I'll explain all the terms; I'll back every judgment with real data; and for the parts I'm still unsure about, I'll be honest about that too, because there certainly are some.
Let's begin.
I. The Five-Layer Cake (Why Nobody Discusses the Bottom Four Layers)
AI is infrastructure. Like the internet, like electricity, it requires factories. —Jensen Huang
Most people understand AI like this: a smart computer that answers questions.
This is like saying the internet is "a place to watch videos." Technically not wrong, but completely misses the point.
At the World Economic Forum in January 2026, Jensen Huang described AI as a five-layer system:
Energy
Chips
Cloud
Models
Applications
He called this entire system "the largest infrastructure build-out in human history."
Think about that word first: Infrastructure.
Roads. Power grids. Water supply systems. These things make modern civilization work, but people usually only notice them when they break.
AI is becoming the same thing: invisible, indispensable, and extremely expensive to build. I call this entire structure the AI Stack. It consists of five layers, stacked one on top of the other, each supporting the one above, and capital flows bidirectionally between these layers.
The simplest version I can give is this:
Energy: You need electricity to run computers, and a lot of it.
Chips: You need specialized processors for computation. These are not the CPUs in your laptop.
Cloud: You need massive warehouse-sized data centers filled with these chips and connected by extremely high-speed networks.
Models: You need the actual AI software—the "intelligent brain" that learns patterns from data.
Applications: You need the products people actually use, like ChatGPT, Google Search, or a bank's anti-fraud system.
Any discussion of AI that only talks about the fifth layer (Applications) ignores a full 80% of reality. And if you are an investor, entrepreneur, or just someone trying to understand where the world is heading, the truly important point is that money is not distributed evenly among these five layers. It concentrates, compounds, and flows to a very few critical nodes.
And today, this capital is concentrating in places most people haven't even noticed.
II. Tracking the Flow of Capital (The Answer Isn't Where You Think)
People's attention almost always focuses on the application layer. ChatGPT, GitHub Copilot, Claude, Perplexity.
These are products you can use, so it's easy to think that the AI story is probably just these applications.
But most people overlook one thing. By 2026, the combined annual capital expenditures (CapEx) of the world's four largest cloud computing companies (Amazon, Microsoft, Google, Meta) are projected to reach $650 to $700 billion.
That's for one year, for four companies combined.
This number is roughly equivalent to Switzerland's entire annual GDP. And the majority of it, approximately $450 billion, will be invested directly in AI infrastructure.
Not chatbots, not applications. But buildings, chips, fiber optics and networking, cooling systems: the things almost no one talks about at cocktail parties. And that is precisely where the money is.
Because think about it: before anyone can use ChatGPT, someone must first do one thing: build a shopping mall-sized data center, install tens of thousands of specialized processors inside, connect them with networking equipment worth far more than the market cap of most companies, and supply the entire system with enough power to run a small city. And it must run like this every day.
This is layers one to three: Energy, Chips, Cloud Infrastructure. These are the invisible layers, and where truly massive capital is being deployed.
One might ask: "What about OpenAI? Haven't they made billions already?"
Indeed they have.
By the end of 2025, OpenAI's annualized recurring revenue (ARR) had reached $20 billion. A year earlier it was $6 billion, and the year before that, $2 billion.
10x growth in two years. In the history of human business, very few companies have achieved such rapid revenue growth at this scale.
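The growth arithmetic behind those figures is worth making explicit. The revenue numbers are the article's ($2B, $6B, $20B at successive year-ends); the implied year labels and the compounding calculation are the only things this sketch adds.

```python
# OpenAI annualized revenue at successive year-ends, per the article (USD).
revenue = {"2023": 2e9, "2024": 6e9, "2025": 20e9}

# 10x over two years implies ~3.16x per year, compounded.
total_multiple = revenue["2025"] / revenue["2023"]
annual_multiple = total_multiple ** 0.5  # geometric mean over 2 years

print(f"{total_multiple:.0f}x total, ~{annual_multiple:.2f}x per year")
# 10x total, ~3.16x per year
```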
But the problem is, the costs are equally staggering.
2025: OpenAI burned approximately $9 billion in cash
2026: Projected to burn $17 billion
Just the inference cost, the cost of actually running the model when you ask it a question:
2025: $8.4 billion
2026 projected: $14.1 billion
Based on current projections, OpenAI may not achieve positive cash flow until 2029 or 2030.
So the question arises: where does all this burned money go?
The answer: it flows down the AI stack.
To:
Microsoft Azure (OpenAI is contractually obligated to pay Microsoft 20% of revenue until 2032)
Nvidia's GPUs
Engineering companies building data centers
And energy companies providing power
If you stare at this system long enough, you see an almost circular structure:
Microsoft invests in OpenAI
OpenAI uses that money to buy Azure cloud services
Azure uses the revenue to buy Nvidia chips
Nvidia reports record profits
Everyone applauds
And then, the capital continues to flow downward.
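That loop can be written down as a tiny directed graph of payments. The edges follow the article's description; no amounts are attached because the article doesn't give per-edge figures, and the node names are shorthand, not formal entities.

```python
# The "almost circular structure" above, as a directed graph of payments.
flows = {
    "Microsoft": ["OpenAI"],  # equity investment
    "OpenAI": ["Azure"],      # cloud spend (contractual revenue share)
    "Azure": ["Nvidia"],      # chip purchases
    "Nvidia": [],             # record profits land here
}

def trace(start: str) -> list[str]:
    """Follow the money from `start` until it stops moving."""
    path = [start]
    while flows.get(path[-1]):
        path.append(flows[path[-1]][0])
    return path

print(trace("Microsoft"))
# ['Microsoft', 'OpenAI', 'Azure', 'Nvidia']
```

The trace terminates at Nvidia, which is the article's point: whoever sits at the end of the payment chain accumulates the profits.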
There is an important structural fact in the AI stack:
The vast majority of users are at the top (Application layer)
The vast majority of profits are at the bottom (Infrastructure layer)
And this misalignment between where users are and where profits are is the core of the entire AI investment logic.
This is the first rule of the AI value chain: Revenue flows up, capital settles down.
III. You've Actually Seen This Before
All human problems are essentially engineering problems, and engineering problems can ultimately be solved. —Buckminster Fuller
If you want to truly understand what's happening with AI, look back at the history of the electricity revolution between 1880 and 1920.
In 1882, Thomas Edison built the first commercial power station on Pearl Street in Manhattan, New York. At the time, most people thought electricity was just a novelty, a "fancier" way to light things. After all, gas lamps worked just fine. Who really needed this thing?
But in just 40 years, electricity completely reshaped almost every industry: manufacturing, transportation, communications, healthcare, entertainment.
The real winners of that revolution were not the people who invented the light bulb, but those who built the infrastructure: General Electric, Westinghouse Electric, power companies, copper mining companies, engineering and construction firms.
Today, AI is repeating the same pattern, just compressed into years instead of decades.
Compare the two chains:
AI system: AI → Data Centers → Chips → Raw Materials → Energy
Electricity system: Electricity → Factories → Machines → Raw Materials → Coal / Hydropower
The two paths are almost identical. And the winners, once again, are not primarily at the application layer, but at the infrastructure layer.
I call this phenomenon Infrastructure Gravity. Whenever a new computing platform emerges, the earliest wealth is always created by the "pick and shovel sellers."
Applications will catch up later, applications will get all the media attention. But infrastructure takes most of the profits.
For example, Nvidia's fiscal year 2026 (ended January 2026) revenue was $215.9 billion, up 65% year-over-year. The Data Center business alone generated $62.3 billion in revenue in the last quarter, up 75% year-over-year. This business now accounts for 91% of Nvidia's total revenue.
In other words: one company, roughly $68 billion in quarterly revenue, over 90% of it from a single business line.
Look at chip manufacturing. TSMC held about 70% of the global foundry market in 2025, with sales of $122.5 billion. Second-place Samsung Electronics had only 7.2%. That level of dominance makes even Standard Oil in its heyday look tame.
Infrastructure always wins first. The real question is just how long this window will last.
Ask anyone what the internet revolution was, and they'll say Google, Amazon, Facebook.
But if you ask where the earliest money was made, the answer is actually Cisco Systems, Corning, the companies laying the fiber optic networks.
The same story, just a different era.
IV. The Part No One Wants to Hear
The stock market is a device for transferring money from the impatient to the patient. —Charlie Munger
I have to confess something. When I first looked at AI as an investor, I made the same mistake as most people: I looked at the application layer. I saw ChatGPT's growth. Saw Anthropic raising billions. So I thought, AI companies will win, so invest in AI companies.
Later, three things changed my mind, and they happened in sequence.
Thing 1: The Hottest Companies Are Burning Cash
I found that almost all "AI companies" are burning cash like crazy. OpenAI, Anthropic, Mistral AI, xAI. All are spending money much faster than they are making it. The reason isn't a bad business model, but that compute costs are structural.
Every time you ask an AI a question, the system must perform real computation. Computation requires GPUs, GPUs require electricity. And the stronger the model, the higher the compute demand, so the operating costs only get higher.
In other words: the perceived winners in AI are actually the biggest spenders.
Thing 2: The Most Profitable Are at the Bottom
I noticed that infrastructure companies are printing money. Nvidia's gross margin is near 75%. TSMC is expanding capacity while raising prices because demand far exceeds supply.
These companies don't have a "when will we be profitable" problem. Their problem is, we simply can't build fast enough. These are two completely different problems.
Thing 3: Don't Think Like a "Consumer" (The Most Uncomfortable One)
I realized I had been thinking about AI like a consumer.
Consumers see applications. Engineers see the tech stack. Once you see the entire stack, you can't unsee it.
Every AI launch becomes a capital expenditure (CapEx) announcement. Every model upgrade becomes a new chip order. Every new feature becomes a new data center lease.
The whole industry starts to look like concentric circles: the closer to the center, the more concentrated the profits.
Maybe you are: a software engineer focused on AI models, a retail investor who bought Nvidia at $300, or someone watching this revolution from afar in India (or maybe you're all three; that's the most interesting position).
Wherever you are, the principle is the same. Consumers see products; investors see supply chains. And the best investors see the supply chain that forms even before the product is announced.
V. Investor Map: A Layer-by-Layer Breakdown of the AI Stack
The article is already long, so I'll pick up the pace.
Below is the structure, key players, and potential opportunities for each layer of the AI Stack.
Layer 1: Energy
AI data centers are extremely power-hungry. A single large model training run can consume a small town's annual electricity usage. By 2026, global AI data centers are projected to consume about 90 terawatt-hours of electricity annually. That's roughly a 10x increase from 2022.
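To put 90 terawatt-hours per year in perspective, it helps to convert annual energy consumption into average continuous power draw. The 90 TWh figure is the article's projection; the conversion itself is standard arithmetic (1 TWh = 1,000 GWh; a year has 8,760 hours).

```python
HOURS_PER_YEAR = 8760  # 365 * 24

def avg_power_gw(twh_per_year: float) -> float:
    """Average continuous power draw, in gigawatts, for a given annual TWh."""
    return twh_per_year * 1000 / HOURS_PER_YEAR  # 1 TWh = 1000 GWh

print(f"{avg_power_gw(90):.1f} GW")
# 10.3 GW
```

Roughly 10 GW running around the clock, which is on the order of ten large nuclear reactors' worth of continuous output.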
This leads to a very simple investment logic: whoever can supply stable power to data centers will benefit. This includes nuclear power companies, natural gas companies, renewable energy companies, grid operators, especially those near data center clusters.
Jensen Huang said in October 2025 that data centers may be able to build their own power generation faster than they can connect to the grid. In fact, many tech companies are already building power generation facilities right next to their data centers, bypassing the grid entirely.
This shocked me. These tech companies are becoming their own utility companies.
Beneficiaries include utility companies, independent power producers, power equipment manufacturers (transformers, switchgear, etc.). In Asia, for example India, power equipment and transmission companies will benefit as hyperscaler data centers expand.
Layer 2: Chips
This is the layer the public is most familiar with, thanks to Nvidia. But it's far more complex than one company.
The chip layer can be further subdivided:
Design Companies
Nvidia (GPU), AMD, Broadcom, Qualcomm
And increasingly, Cloud Provider In-House Chips: Google TPU, Amazon Trainium, Microsoft Maia
Manufacturing Companies
Almost monopolized by TSMC (~70% market share); second-place Samsung Electronics holds just 7.2%. Intel is trying to rebuild its foundry business, but that will take years.
Equipment Companies
The machines that make chips come from ASML (the only company producing EUV lithography machines), as well as Applied Materials, Lam Research, Tokyo Electron.
Memory Companies
AI models require vast amounts of High-Bandwidth Memory (HBM). Key players: SK Hynix, Samsung, Micron Technology.
Packaging Technology
Advanced packaging technologies (e.g., TSMC's CoWoS) have become a new bottleneck.
The most shocking thing about this layer is actually the concentration:
Nvidia: ~92% AI GPU market share
TSMC: Manufactures almost all AI chips
ASML: Sole supplier of EUV equipment
One company designs. One company manufactures. One company makes the manufacturing machines. This concentration is both an investment opportunity and a geopolitical risk.
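One way to quantify that concentration is the Herfindahl-Hirschman Index (HHI): the sum of squared market shares in percent, where regulators typically treat anything above 2,500 as highly concentrated. The 92% share is the article's figure; lumping the remaining 8% into a single hypothetical competitor is my simplification (a fragmented tail would only lower the index slightly, since Nvidia's share alone contributes 8,464 points).

```python
def hhi(shares_pct: list[float]) -> float:
    """Herfindahl-Hirschman Index from market shares in percent (max 10,000)."""
    return sum(s * s for s in shares_pct)

# AI GPU market per the article: Nvidia ~92%, the rest treated as one rival.
print(hhi([92, 8]))
# 8528.0
```

An HHI north of 8,000 is more than triple the usual "highly concentrated" threshold, which is the point the three bullets above make qualitatively.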
Layer 3: Cloud & Data Centers
This is where the chips actually run.
Massive warehouse-style facilities:
Tens of thousands of servers
High-speed network connections
Liquid cooling systems (which have gone from optional to standard)
The market is dominated by the three major cloud providers:
Amazon Web Services (31%)
Microsoft Azure (24%)
Google Cloud (11%)
Oracle is also expanding rapidly, planning $50 billion in CapEx for 2026. But this layer is far more than just the hyperscalers.
For example:
Foxconn assembles 40% of AI servers
Arista Networks provides networking equipment
Credo Technology provides high-speed connectivity (stock up 117% in 2025)
Vertiv provides liquid cooling
Data Center Real Estate Companies:
Equinix
Digital Realty
Even concrete suppliers are part of it; there's a complete supply chain at every layer.
According to Bank of America estimates, hyperscalers will invest 90% of their operating cash flow into capital expenditures in 2026. This ratio was 65% in 2025.
Morgan Stanley expects these companies to issue over $400 billion in debt this year to build data centers. This figure was $165 billion in 2025.
When I first read that number, I paused. $400 billion in debt in one year, just to build more warehouses full of computers.
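Putting the Bank of America and Morgan Stanley estimates side by side makes the acceleration concrete. All inputs are the article's figures; only the derived ratio is added here.

```python
# Hyperscaler financing picture, per the estimates cited above.
capex_share_of_ocf = {"2025": 0.65, "2026": 0.90}  # capex / operating cash flow
debt_issuance_bn = {"2025": 165, "2026": 400}       # USD billions

debt_growth = debt_issuance_bn["2026"] / debt_issuance_bn["2025"]
print(f"Debt issuance up {debt_growth:.1f}x year over year")
# Debt issuance up 2.4x year over year
```

A 2.4x jump in annual debt issuance, on top of 90 cents of every operating-cash-flow dollar going to capex, is what "betting the balance sheet" looks like in numbers.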
Layer 4: Models
This is the "brain layer," the companies responsible for training and building the actual AI models.
Key players include:
OpenAI (GPT series, >$20B annualized revenue)
Anthropic (Claude, reportedly ~$19B annualized revenue early 2026)
Google DeepMind (Gemini)
Meta AI (Llama, open-source models)
Mistral AI
xAI (developing Grok)
This layer fascinates me because it is simultaneously the most sought-after and the least profitable.
For example:
OpenAI's revenue growth is unprecedented, but it is still projected to burn $17 billion in cash in 2026.
Anthropic is growing just as fast but is highly dependent on funding—a $5 billion round in early 2026 valued it at around $170 billion.
The problem is a structural contradiction in the business model of this layer. Models get stronger, requiring more compute, and the cost of compute often grows faster than revenue.
It's a bit like running a restaurant where every new dish requires more expensive ingredients, but customers expect the price to stay the same.
The result is that profit margins are constantly squeezed.
When will this change? I'm not sure, maybe not in the short term.
For investors, this layer is high-risk, high-reward. The problem is, most companies are still private.
Therefore, exposure in public markets mainly comes through two channels:
Cloud Computing Companies
For example, Microsoft holds a significant stake in OpenAI and provides it with compute via Microsoft Azure.
Chip Companies
Because model training consumes their hardware heavily.
Layer 5: Applications
This is the layer you see every day. For example, ChatGPT, Google Search powered by Gemini, Microsoft Copilot features in Office, AI anti-fraud systems in banks, Netflix's recommendation algorithm, AI image enhancement on your phone.
The application layer is the broadest and most crowded layer. Thousands of startups and large enterprises compete here. Long-term, it will likely become the layer with the largest market size. Some predictions suggest the application layer market could exceed $2 trillion by the early 2030s.
But at the current stage, this layer is also the one with the thinnest profits and the most uncertain competition.
In this layer, true differentiation comes from data. Companies with unique, proprietary data will build lasting advantages.
For example:
Salesforce—enterprise CRM data
Bloomberg—financial market data
Epic Systems—medical records data
Companies holding such data moats can deeply fine-tune AI models, something general-purpose chatbots cannot do.
For investors, the application layer may ultimately offer the largest return potential, but it will also destroy the most capital.
Most AI startups will fail, and only a few survivors will experience exponential compound growth.
The most likely investment logic for the next 3 to 5 years is: bet on infrastructure now, bet on applications later. And the smartest money is already positioned this way.
The companies that will truly win in Layer 5 are likely those that have data others cannot access.
And interestingly, many of these companies don't even call themselves AI companies yet.
VI. AI Risk: "Isn't This Just a Bubble?"
An investor's worst enemy is likely himself. —Benjamin Graham
Let's address the most common question head-on. "What about the dot-com bubble? Isn't this the same thing? Massive infrastructure investment, no profits, everyone caught up in the hype."
This is a good question and deserves a serious answer.
The key difference is that during the dot-com bubble, companies were building infrastructure before the demand was truly there. Companies were frantically laying fiber optic networks, building server rooms, but real internet users were still on dial-up.
The result was that the infrastructure was built, but demand didn't truly materialize until 5 to 7 years later. In the interim, many companies went bankrupt.
By 2026, the demand for AI already exists. Nvidia's chips are sold out; TSMC's advanced packaging capacity is fully booked; cloud computing rental prices are rising, not falling. Meanwhile, OpenAI added 400 million weekly active users between March and October 2025. The models are being used.
Compute is being consumed. Customers are paying. This doesn't mean there's no risk. In fact, the risks are enormous, and I probably think about this more often than I care to admit.
Three points are particularly noteworthy.
Capital Misallocation Risk
In 2026, tech companies will spend over $650 billion on data centers.
If the growth rate of AI service revenue is insufficient to support these investments, many companies will face severe margin compression. Even Amazon's free cash flow could turn negative this year.
This is Amazon, the company that practically invented the cloud computing business model.
Supply Chain Concentration Risk
The AI supply chain is highly concentrated.
TSMC manufactures ~70% of global chips
ASML is the sole supplier of EUV lithography machines
Nvidia designs 92% of AI data center GPUs
Any major shock—geopolitical, natural disaster, competitive landscape change—could impact the entire AI industry chain.
For instance, a major earthquake in Hsinchu, Taiwan, could set global AI development back years. That thought should be unsettling.
The DeepSeek Variable
In January 2025, the Chinese AI lab DeepSeek released a model whose performance was close to the frontier models, at a fraction of their training cost.
This challenges a core assumption: that throwing more compute at the problem will always yield better AI.
If open-source and highly efficient models continue to close the gap, then the infrastructure investment logic weakens.
I don't think DeepSeek overturns the entire AI investment thesis. But it does introduce a variable that didn't exist before. And such variables, once they appear, don't disappear.
But I always come back to a larger framework.
The long-term projections from consulting firms:
McKinsey & Company: cumulative global data center investment of $6.7 trillion by 2030
PwC: AI contributing $15.7 trillion to global GDP by 2030
International Data Corporation (IDC): cumulative economic impact of AI-related solutions of $22.3 trillion
Even if these numbers are overestimated by 50%, we are still facing the largest technology-driven economic transformation since the internet. The question isn't the direction, but the scale.
I often hear people say: "I'm skeptical about AI."
Of course you can be.
You can be skeptical of model capabilities, skeptical of the timeline, but don't ignore the supply chain structure.
These are two completely different things. One is healthy rational skepticism, the other will make you miss the opportunity.
Five years from now, the winners of this cycle will look incredibly obvious.
History is always like that. And the key to the game right now is: understanding the structure before others see it clearly.
VII. Participating in the Game at the Right Level
Think of AI as a five-layer video game. Each layer is a different level.
Level 1: Energy
This is the newbie level. Important, straightforward, and almost impossible to lose if you play normally. Low risk, stable returns.
Like the quest NPCs in a game: they don't die, but keep giving rewards.
Level 2: Chips
This is the boss fight. The most concentrated power, the highest profits. But also, the highest technical risk, geopolitical risk.
Huge rewards, but Hard mode.
Level 3: Cloud
This is the multiplayer server, where all the action happens. Hyperscalers are like server admins; they take a cut from all transactions.
Level 4: Models
This is the PVP arena. Brutally competitive, incredibly fast-paced innovation.
Most players will be eliminated; only the best-equipped survive.
Level 5: Applications
This is the open-world map. Infinite possibilities, but no fixed rewards. You have to find your own quests.
The real Meta Strategy is simple. You don't need to play all the levels.
Most people will play Level 5 because it's the most visible. But the smartest money right now is grinding experience on Levels 2 and 3, because that's where the highest returns are at this stage.
Your position in the tech stack determines what you should focus on.
For Non-Tech People
You don't need to understand how a GPU works. You just need to know that someone must manufacture GPUs, someone must build data centers for them, someone must power them. And these companies are all public companies; you can read their financial reports.
For Tech People
You already know models are getting stronger. But you might be underestimating something: the real bottlenecks are shifting to the physical world of power, cooling, and chip packaging. The AI competition of the next decade may hinge more on engineering problems than on the model-architecture problems discussed in papers.
For Investors
The AI value chain is actually five different trades. Different risks, different time horizons, different winners. Treating AI as one industry is like treating "tech" as one industry in 1998. The internal differences are huge.
This situation won't last forever. One day the infrastructure build-out will mature, the application layer will consolidate, and value will shift back upward.
The internet era was like that too. Ultimately, the ones who made the most money were Amazon, Google, Facebook, not the fiber optic companies and server manufacturers.
But AI isn't at that stage yet. It's still the infrastructure phase, the pick and shovel phase.
And right now, the shovels are making money hand over fist. Those who understand the full stack will see the signals before the inflection point happens.
Others will be surprised, again and again, by where the money is actually flowing.
Ten years from now, understanding the AI stack will be as fundamental as understanding a balance sheet.
Remember three things: Understand the tech stack. Map the layers. Track the capital flow.
That's the game.