Against the backdrop of a global open-source large-model market long dominated by Chinese tech companies, American tech giants are attempting to reclaim influence through differentiated competition.
According to media reports, Google DeepMind CEO Demis Hassabis recently hinted on social media with a "four diamonds" icon that the new-generation open-source large model Gemma 4 is about to be officially released. The timing falls exactly one year after the launch of its predecessor, Gemma 3, consistent with Google's iteration pace in the large model domain.
Major Spec Upgrade: New 120B Model Challenges the Limits of Local Operation
Compared to its predecessor, Gemma 4 achieves a leap in parameter scale:
Quadrupled Parameters: Rumors suggest a new model with 120B parameters will be introduced, roughly four times the scale of the previous generation.
MoE Architecture: To balance performance and efficiency, the model is expected to adopt a Mixture of Experts (MoE) architecture, activating only about 15B parameters per token. This means even a large-parameter model could potentially run locally and offline on consumer-grade graphics cards.
Capability Evolution: Predictions indicate that Gemma 4's context-processing capacity will grow by one to two times, alongside deeper logical reasoning and stronger execution of complex tasks.
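The rumored MoE design above can be illustrated with a toy sketch. This is not Gemma's actual implementation; the expert count, gating rule, and parameter figures are hypothetical stand-ins chosen so the numbers match the reported 120B total / 15B activated split:

```python
import random

# Toy Mixture-of-Experts routing sketch (illustrative only, not Gemma's design).
# Hypothetical setup: 8 experts of "15B" each gives a "120B" total,
# but the gate routes each token to a single expert, so only 15B is computed.

NUM_EXPERTS = 8
PARAMS_PER_EXPERT = 15  # stand-in for ~15B activated parameters

def gate(token: str) -> int:
    """Score every expert for this token and pick the top-1 (toy scores)."""
    random.seed(token)  # deterministic toy scoring for the example
    scores = [random.random() for _ in range(NUM_EXPERTS)]
    return scores.index(max(scores))

def forward(token: str):
    """Route the token to one expert; only that expert's weights are used."""
    expert = gate(token)
    active = PARAMS_PER_EXPERT               # parameters actually computed
    total = NUM_EXPERTS * PARAMS_PER_EXPERT  # parameters stored on disk/VRAM
    return expert, active, total

expert, active, total = forward("hello")
print(f"token routed to expert {expert}: {active}B of {total}B params active")
```

The point of the sketch is the memory/compute split: all 120B of weights must be stored, but each token's forward pass touches only one expert's 15B, which is why MoE models can be feasible on consumer hardware despite their headline parameter counts.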
Strategic Game: Containing the "Chinese Force" in the Open-Source Community
Kuai Technology's analysis points out that although American giants have largely shifted their focus to closed-source business models, Google is releasing technological dividends at a measured rhythm to prevent Chinese companies from fully dominating the open-source ecosystem:
Time Gap Strategy: Google chooses to release the open-source version more than half a year after its main closed-source model, the Gemini 3.0 series, thereby maintaining commercial returns from closed-source models while preserving influence in the developer community through open-source projects.
Localization Moat: The core positioning of Gemma 4 remains "localized service." By optimizing the performance of lightweight models, Google aims to directly compete with domestic open-source models through exceptional on-device experience without touching its core commercial interests.
Industry Observation: The Open-Source Track Enters an Era of "Parameters and Efficiency" Dual Competition
With the addition of Gemma 4, the bar for competing in open-source large models is raised further. The industry widely believes that although open source is not Google's top priority, its deep algorithmic accumulation remains a variable that cannot be underestimated. Whether Gemma 4 can surpass the current domestic open-source "flagship" models at the same parameter count will be a focal point for the global AI community in the second half of the year.