Author: Jesse
This is the latest thinking from independent analyst @farzyness, who has 360,000 followers. He began investing in Tesla in 2012 and was part of the Tesla leadership team from 2017 to 2021.
One person simultaneously owns a battery company, an AI company, and a rocket company, and they are all feeding each other.
I've been thinking about this for months, and honestly, I can't see how Musk loses. This isn't a fanboy take; it's a structural one. The Tesla-xAI-SpaceX triangle is evolving into something unprecedented: an industrial-scale, self-reinforcing, cash-generating flywheel. That sounds convoluted, but it's an accurate description.
Let me break down what's really happening here, because most people look at these companies in isolation, when the real story is the connections between them.
1. The Starting Point of the Flywheel: Energy
Tesla makes batteries in massive quantities. It deployed 46.7 gigawatt-hours (GWh) of energy storage in 2025, up 48.7% year over year. Its 50 GWh factory under construction in Houston comes online this year, and total planned capacity is 133 GWh per year. The business carries a 31.4% gross margin, versus just 16.1% for automotive. This "boring" energy storage business generates almost twice the gross profit per dollar of revenue that cars do.
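A quick back-of-envelope check of that margin claim, using only the figures above (a sketch, not a financial model):

```python
# Gross margin IS gross profit per dollar of revenue, so the ratio of
# the two margins is the ratio of profit per revenue dollar.
storage_margin = 0.314  # energy storage gross margin (from the text)
auto_margin = 0.161     # automotive gross margin (from the text)

ratio = storage_margin / auto_margin
print(f"Storage earns {ratio:.2f}x the gross profit per dollar of revenue")
# ~1.95x, i.e. "almost twice" as claimed
```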
Why is this important? Because xAI just purchased $375 million worth of Tesla Megapacks to power Colossus, the world's largest AI training facility. 336 Megapacks have already been deployed. These batteries provide backup power and demand response capabilities for this system, which boasts 555,000 GPUs and consumes over 1 gigawatt of power (enough for 750,000 households).
2. Moving Beyond NVIDIA: Chip Independence
Tesla not only sells batteries but is also developing its own AI chips.
Currently, NVIDIA monopolizes AI training hardware, controlling about 80% of the market. All major AI labs (OpenAI, Google, Anthropic, Meta) are scrambling for NVIDIA's allocation. The H100 and now the Blackwell chips are the bottleneck for the entire industry. Jensen Huang's pricing power is the stuff of most monopolists' dreams.
If you're Musk and want to build the world's largest AI system, what do you do? You can't rely on NVIDIA forever. That's a critical point of failure, a lever in someone else's hands, especially when you plan to power hundreds of millions of robots in the next 10 to 20 years.
Incidentally, Musk's stated plan for Tesla is to eventually manufacture as many robots as there are humans.
Tesla's AI5 is set to launch between the end of this year and 2027. Musk claims it will be the world's most powerful inference chip, especially in terms of cost per computation. In other words, it will be extremely efficient.
For the AI6 chip, Tesla has already signed a $16.5 billion foundry contract with Samsung. The key point: Musk stated AI6 is designed for "Optimus robots and data centers." This means Tesla products and xAI products will share the same chips.
NVIDIA currently wins at "training," but "inference" is the long-term profit maker. Training happens once per model, while inference occurs every time someone uses that model. If you're running millions of Tesla cars, millions of Optimus robots, and billions of Grok queries, inference is where the real compute demand lies.
By building their own inference chips, Tesla and xAI can decouple from NVIDIA while NVIDIA stays focused on training. It's like flanking a heavily fortified front line.
3. Space-Based AI Computing
Musk mentioned "space-based AI computing" in the Tesla Dojo 3 roadmap. They restarted the Dojo 3 project precisely for this vision. Do the math, and this crazy-sounding idea becomes rational.
Suppose you want to deploy 1 terawatt of AI compute in space per year (roughly the scale of global AI infrastructure). According to Musk, at current chip costs that would take more money than exists in the entire money supply. An NVIDIA H100 sells for $25,000 to $40,000; the economics simply don't work.
But if you own extremely low-cost chips, specifically designed for inference, mass-produced, and ultra-energy-efficient, the math changes. Tesla's goal is to make AI chips with the "lowest cost silicon." This is key to enabling large-scale space computing.
Without cheap chips, space AI is a fantasy; with cheap chips, it becomes inevitable.
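The arithmetic behind that claim can be sketched quickly. The power draw and prices below are my own assumptions (an H100 draws roughly 700 W; the price is the midpoint of the range quoted above; the $1,000 "cheap chip" is purely hypothetical), so treat this as an illustration, not Musk's actual math:

```python
# Rough cost of 1 TW of AI compute built from H100-class hardware.
target_watts = 1e12      # 1 terawatt of deployed compute
watts_per_chip = 700     # assumed H100 board power
price_per_chip = 32_500  # midpoint of the $25k-$40k range above

chips_needed = target_watts / watts_per_chip  # ~1.43 billion chips
total_cost = chips_needed * price_per_chip
print(f"~${total_cost / 1e12:.0f} trillion in chips alone")

# The same wattage with hypothetical $1,000 inference chips:
print(f"~${chips_needed * 1_000 / 1e12:.1f} trillion")
```

At these assumptions the H100 route runs to tens of trillions of dollars in silicon alone, which really does exceed, for example, the roughly $21 trillion US M2 money supply; purpose-built cheap inference chips change the answer by more than an order of magnitude.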
The NVIDIA-backed StarCloud trained its first AI model in space last December, proving the concept is viable. The question is no longer whether it works, but how to create the conditions for mass deployment.
Imagine this: SpaceX launches orbital data centers into low Earth orbit via Starship, each rocket carrying 100 to 150 tons. These data centers run models developed by xAI, using chips designed by Tesla, powered by solar energy and Tesla batteries. Free solar energy, zero-cost cooling. Inference results are beamed directly to Tesla cars and Optimus robots on Earth via Starlink.
4. The Closed Loop of Data and Connectivity
SpaceX already has nearly ten thousand Starlink satellites in orbit and is approved to launch 7,500 more. They have 6 million direct-to-cell customers. The V3 satellites launched this year have a downlink capacity of 1 terabit per second (1 Tbps), 10 times that of the previous generation.
The flywheel spins madly here:
- xAI builds the models (Grok 3 has 3 trillion parameters, Grok 4 topped a global benchmark, and the 6-trillion-parameter Grok 5 is due Q1 2026).
- Those models go into Tesla cars. Grok has been live in vehicles since July 2025, handling conversation and navigation, and the cars' self-driving runs on the same Tesla chips.
- Grok will become the "brain" of the Optimus robots. Tesla plans to produce 50,000 to 100,000 Optimus units this year, reaching 1 million by 2027.
This means: xAI builds the models, Tesla makes the chips, Tesla builds the robots that act, Tesla makes the batteries that power everything, SpaceX provides global connectivity and space access, xAI trains on the full dataset from Tesla and X, and commands can be relayed by solar-powered satellites in space.
5. The Unbreachable Moat
This moat deepens on its own:
- Tesla has 7.1 billion miles of FSD driving data, more than 50 times Waymo's. Real-world data trains better models, better models improve the vehicles, and better vehicles collect more data.
- X (formerly Twitter): xAI has exclusive access to real-time human data from roughly 600 million monthly active users. This is different from YouTube or search data; it's raw, unstructured, real-time human thought. When Grok hallucinates, xAI can correct it against real-time consensus faster than anyone.
What can competitors do to catch up?
- Google has vertical integration (TPU chips, Gemini, YouTube), but Waymo's scale is too small, and it lacks launch vehicles and a real-time social data stream.
- Microsoft has Copilot and Azure, but it relies on OpenAI and has no physical hardware, no space infrastructure, and no autonomous-driving data.
- Amazon has AWS, custom chips, and logistics robots, but it lacks a mass-adopted consumer AI product, a vehicle fleet, and launch capability.
- NVIDIA monopolizes training but has no "physical layer": no cars or factory robots collecting data, no global satellite network. It sells chips but doesn't control the endpoints.
To compete with Musk, you'd need to simultaneously found or acquire five different top-tier companies, and he's cementing his advantage every day.
Conclusion
Most analysts view Tesla, xAI, and SpaceX as separate investments, but this is completely wrong. The value isn't in the individual parts, but in how they feed each other.
xAI is valued at $250 billion, SpaceX is valued at around $800 billion and seeking an IPO at $1.5 trillion, Tesla is valued at $1.2 trillion. Total enterprise value exceeds $2 trillion, and the synergy premium isn't even factored in.
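That headline figure is just the sum of the three stated valuations (using the numbers above; the $1.5 trillion IPO target is deliberately not counted):

```python
# Sum of the stated valuations, in trillions of dollars.
valuations = {"xAI": 0.25, "SpaceX": 0.8, "Tesla": 1.2}
total = sum(valuations.values())
print(f"Combined: ${total:.2f} trillion")  # $2.25 trillion, i.e. "exceeds $2 trillion"
```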
Every piece enhances another:
- Tesla succeeds, and xAI gets more training data.
- xAI succeeds, and Tesla's cars and robots get smarter.
- SpaceX succeeds, and the whole system gets global coverage.
- The energy business succeeds, and every facility gets cheaper power.
- The chip strategy succeeds, and independence from NVIDIA follows.
- Optimus succeeds, and a total addressable market (TAM) of over $40 trillion per year opens up in labor.
Did I miss anything? If you can see a flaw I'm missing, I'd love to hear it. Because after watching this for so long, I genuinely can't find one.