I didn’t expect my last post to spark so much discussion. In essence, we’re all talking about the same thing—it’s just that our descriptions of the numbers differ slightly.
You’ve all heard the saying: sometimes the gap between people is bigger than the gap between humans and dogs. But this phrase was born before the current wave of AI.
Today, I’ll try to quantify this idea. The numbers are all rough estimates, just for fun—don’t take them too seriously.
Assume an elementary school student’s cognitive ability is 10 points, a PhD is 60 points, a university professor is 75 points, and Einstein is 100 points.
The gap between 10 and 100 points is indeed huge—a full 10x difference. It’s not wrong to call it the difference between humans and dogs.
In 2025, AI's cognitive ability is worth at least 40 points. And since AI is a generalist while PhDs and professors are usually specialists, its effective value doubles to at least 80 points.
So we have:
- Elementary student + AI = 90 points
- PhD + AI = 140 points
- Professor + AI = 155 points
- Einstein + AI = 180 points
With AI, the absolute gap between the elementary student and Einstein remains 90 points, but the relative gap shrinks from 10x to just 2x.
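The arithmetic above fits in a few lines of Python. The point values are the post's rough, for-fun estimates, not real measurements:

```python
# Rough cognitive-ability scores from the post (arbitrary units, just for fun).
scores = {"elementary student": 10, "phd": 60, "professor": 75, "einstein": 100}
AI_BONUS = 80  # 40 points of raw ability, doubled for being a generalist

without_ai = scores["einstein"] / scores["elementary student"]
with_ai = (scores["einstein"] + AI_BONUS) / (scores["elementary student"] + AI_BONUS)

print(without_ai)  # 10.0 -> a 10x relative gap
print(with_ai)     # 2.0  -> shrinks to 2x, though the absolute gap stays 90 points
```

Adding the same constant to both sides never changes the difference, but it always shrinks the ratio.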
This is my point: AI is narrowing the gap between humans.
Some might object: “But an elementary student can’t develop AI like a professor can!”
It’s like One Piece, where characters develop their Devil Fruit abilities to very different degrees. Same Gomu Gomu Fruit, but early-series Luffy can’t beat the Luffy who later unlocks Gear 4: a newbie versus a seasoned expert.
True. If AI is worth 80 points:
- A casual user (e.g., asking occasional questions) might only harness 20 points.
- Someone highly skilled at using AI (e.g., intense vibe coding) might overclock it to 100 points.
So:
- Elementary student + AI newbie = 30 points
- Einstein + AI expert = 200 points
The gap widens from 90 to 170 points! So with AI, the gap between people actually increases!
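Under this objection, the AI bonus is no longer a shared constant but a function of skill, and the same arithmetic flips. A minimal sketch, using the post's illustrative numbers (20 points harnessed by a casual user, 100 by a power user):

```python
def effective(human_score, ai_points_harnessed):
    """Total = the human's own score plus whatever slice of AI they can harness."""
    return human_score + ai_points_harnessed

newbie = effective(10, 20)    # elementary student, casual user
expert = effective(100, 100)  # Einstein, "overclocking" the AI

print(expert - newbie)  # 170 -> the absolute gap widens from 90
print(expert / newbie)  # ~6.7x -> and the relative gap widens too
```

When the bonus scales with skill, it amplifies differences instead of washing them out.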
This is the view of Lao Bai and Alvin, and they’re not wrong.
But—and this is a big but—while our views seem conflicting, their core is similar. Why?
Because I assume AI will keep evolving:
First, it will get smarter.
Second, it will become easier to use.
2025 is just a transitional year. The further we go, the easier it gets to be a prompt engineer: the barrier will keep dropping until using AI is as easy as speaking. Learning to use AI will get easier, not harder.
Now assume AI gets smarter, say to around 240 points. Depending on skill, utilization might range from low to high, yielding effective AI contributions of 200, 240, or 280 points.
Then:
- Elementary student: 10 + 200 = 210 points
- Einstein: 100 + 280 = 380 points
The gap is 170 points, but it’s no longer even 2x; it’s about 1.8x. The absolute gap grew, yet the relative gap shrank.
What about in 10 years? Super optimistically, assume AI’s cognition evolves to around 1000 points.
Then:
- Elementary student: 1010 points
- Einstein: 1100 points
(If this day comes) even Einstein can’t pull ahead of the elementary student.
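The convergence claim is just limit arithmetic: as the AI term dominates both sides of the ratio, the ratio approaches 1. A quick sketch with a hypothetical `relative_gap` helper, where the optional skill multipliers scale how much of the AI each side harnesses:

```python
def relative_gap(ai_score, low_skill=1.0, high_skill=1.0):
    """Hypothetical helper: ratio of Einstein's total to the student's total."""
    student = 10 + ai_score * low_skill     # elementary student's 10 points
    einstein = 100 + ai_score * high_skill  # Einstein's 100 points
    return einstein / student

# Equal utilization (the post's endgame, where everyone harnesses AI well):
for ai in [80, 240, 1000, 10_000]:
    print(ai, round(relative_gap(ai), 2))  # 2.0, 1.36, 1.09, 1.01

# The post's 240-point scenario with unequal skill (200 vs. 280 harnessed):
print(round(relative_gap(240, 200 / 240, 280 / 240), 2))  # 1.81
```

Even with a persistent skill spread, the ratio keeps drifting toward 1 as the AI score grows; only the human terms stay fixed.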
Those who think AI has widened the gap between humans are seeing a temporary state—because AI is new, and people’s ability to leverage it varies widely now.
But AI has replaced writers, artists, dancers... profession after profession falls. Are you really worried AI won’t replace the trainers who teach “how to unlock 100% of AI’s potential”?
Come on—that’s AI’s home turf.
In the future, humans routinely harnessing 80%–120% of AI’s potential will be the norm, not the exception.
The smarter AI gets, the smaller human roles become, and the narrower the gaps between people.
It’s like two martial arts masters suddenly allowed to use rocket launchers. What difference does it make if one trained 10 years in fists and feet, and the other 15 years with a blade?