6th Man Ventures Founder: Forget the 'Token vs. Equity' Debate, What Really Needs to Be Trusted?

marsbit · Published 2026-01-12 · Last updated 2026-01-12

Introduction

Mike Dudas, founder of The Block and 6th Man Ventures, argues that the debate between tokens and equity misses the point: the real question is what deserves trust. He suggests there is no one-size-fits-all answer to whether a "dual token + equity" structure works. Instead, the core principle is trusting a team that is not only exceptional but also long-term oriented, committed to building a founder-led, enduring business like Binance. Dudas notes that for application-layer projects requiring sustained leadership, tokens often underperform compared to equity. Many DeFi 1.0 founders have left their projects, which are now maintained by DAOs in "maintenance mode," struggling with slow and ineffective decision-making. Pure equity isn’t always superior either—tokens enable functions like fee discounts, staking for airdrops, and access rights, which equity can’t easily replicate. He proposes a hybrid model: an equity entity operates on a "cost-plus" basis to serve a token-driven protocol, aiming not to maximize its own profits but to maximize the token’s and ecosystem’s value. This requires high trust in the team, as token holders lack strong legal rights. Ultimately, success depends on the team’s capability, credibility, execution, vision, and action. The best tokens will thrive by 2026 if teams communicate well, conduct buybacks, enable substantive governance, and direct value to the token through utility.

Author: Mike Dudas, Founder of The Block and 6th Man Ventures

Compiled by: Ken, ChainCatcher

There is no simple, one-size-fits-all answer to whether a "dual token + equity" structure is feasible. But one core principle is that you must be confident the team is not only absolutely excellent but also long-term minded, committed to building a founder-led, enterprise-grade business that can last for decades, in the way Changpeng Zhao built Binance.

I believe that for application-layer projects requiring long-term leadership, token mechanisms are in many cases actually inferior to equity structures. Consider DeFi 1.0: most of those protocols' founders have left, and many of the projects are struggling, kept alive in "maintenance mode" by DAOs and other part-time contributors. It turns out that DAOs and token-weighted voting are poor mechanisms for making sound decisions (especially at the application layer): they cannot move quickly, and they lack the knowledge and capability that founder-driven leadership provides.

Of course, a pure equity model is not absolutely superior to tokens either. Binance is a strong example: tokens enabled it to offer transaction fee discounts, staking for airdrops, access rights, and other benefits tied to the core business and blockchain, benefits that equity ownership cannot readily carry.
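As a rough illustration of the kind of token utility described above, the sketch below computes a tiered trading-fee discount from a user's token holdings. The tier thresholds, discount rates, and function name are hypothetical assumptions for illustration, not Binance's actual fee schedule:

```python
def trading_fee(notional: float, tokens_held: float, base_rate: float = 0.001) -> float:
    """Fee for a trade, discounted by token holdings (illustrative tiers)."""
    if tokens_held >= 10_000:
        discount = 0.25   # top tier: 25% off
    elif tokens_held >= 1_000:
        discount = 0.10   # mid tier: 10% off
    else:
        discount = 0.0    # no holdings, no discount
    return notional * base_rate * (1 - discount)

# A $10,000 trade with 1,500 tokens held pays 10000 * 0.001 * 0.9 = $9.00
```

The point of the sketch is simply that the discount is a function of holding the token itself, a relationship that a share of equity cannot express on-chain.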

"Ownership tokens" also have their limitations and are currently difficult to apply directly within products or protocols. Decentralized applications and networks are fundamentally different from traditional companies (otherwise, what is the point of this industry?), and pure equity is clearly less flexible than tokens. Of course, "equity+" type token designs may emerge in the future, but this is not the current reality (and the lack of market structure legislation in the U.S. makes issuing pure equity-like tokens with direct value capture and legal rights still risky).

In short, you can envision a scenario (as Lighter describes): an equity entity operates on a "cost-plus" basis, serving as an engine for a token-driven protocol. In this architecture, the equity entity's goal is not profit maximization but maximizing the value of the protocol token and ecosystem. If this model works, it would be a huge benefit for token holders, because you have a well-funded Labs entity (e.g., Lighter has a token treasury for long-term development) and a core team holding a significant amount of tokens, giving them a strong incentive to drive token value (while keeping the core token design crypto-native and on-chain, separate from the structurally complex associated Labs entity).
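A minimal sketch of the cost-plus arrangement described above, assuming a hypothetical 10% margin: the Labs entity is reimbursed its operating costs plus that margin, and whatever protocol revenue remains is available to accrue to the token (for example, via buybacks). Function names and numbers are illustrative assumptions, not Lighter's actual terms:

```python
def labs_payment(operating_costs: float, margin: float = 0.10) -> float:
    """Amount the protocol pays the Labs entity under cost-plus: costs + fixed margin."""
    return operating_costs * (1 + margin)

def surplus_to_token(protocol_revenue: float, operating_costs: float,
                     margin: float = 0.10) -> float:
    """Revenue left over for token holders (e.g. buybacks) after paying Labs."""
    return protocol_revenue - labs_payment(operating_costs, margin)
```

Under this structure the Labs entity's upside is capped at its margin, so growth in protocol revenue flows to the token rather than to the equity.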

In this model, you do need a high degree of trust in the team, because in most current cases, token holders do not have strong legal rights. Conversely, if you don't believe the team can execute and create value for the tokens they heavily invest in, why would you participate in the project in the first place?

Ultimately, it all comes down to the team's capability, credibility, execution, vision, and actions. The longer a great team stays in the market and delivers on their promises, the more their tokens will exhibit the "Lindy effect." As long as the team maintains good communication and clearly directs value to the token through buybacks, substantive governance, and utility in the underlying protocol, we will see the highest-quality tokens—even those with equity/Labs entities—explode in 2026.

Related Questions

Q: According to Mike Dudas, what is the core principle for evaluating the 'dual token + equity' structure in crypto projects?

A: The core principle is that you must be confident the team is not only absolutely excellent but also long-term oriented, committed to building a founder-led, enterprise-grade business that can last for decades, similar to Changpeng Zhao's approach.

Q: Why does the author believe that token mechanisms are inferior to equity structures for application-layer projects requiring long-term leadership?

A: He argues that many DeFi 1.0 protocol founders have left their projects, which are now struggling and maintained in 'maintenance mode' by DAOs and part-time contributors. DAOs and token-weighted voting are not good mechanisms for making swift, high-quality decisions at the application layer, as they lack founder-driven knowledge and capability.

Q: What advantage does the author highlight about tokens compared to pure equity, using Binance as an example?

A: He points out that tokens enabled Binance to offer transaction fee discounts, staking for airdrops, access rights, and other blockchain-related benefits that equity ownership cannot clearly provide.

Q: What is the 'cost-plus' model described in the article for a potential project structure?

A: It's a model where an equity entity operates as an engine serving a token-driven protocol on a 'cost-plus' basis. The goal of the equity entity is not to maximize its own profit but to maximize the value of the protocol's token and ecosystem.

Q: What does the author say is ultimately the most critical factor for a project's success, regardless of its token or equity structure?

A: He states that everything ultimately depends on the team's capability, credibility, execution, vision, and actions. A great team that delivers on its promises over time will see its token gain a 'Lindy effect,' and the highest quality tokens will thrive by 2026 if the team directs value to the token through buybacks, governance, and utility.

Related Reading

Gensyn AI: Don't Let AI Repeat the Mistakes of the Internet

In recent months, the rapid growth of the AI industry has attracted significant talent from the crypto sector. A persistent question among researchers intersecting both fields is whether blockchain can become a foundational part of AI infrastructure. While many previous AI and Crypto projects focused on application layers (like AI Agents, on-chain reasoning, data markets, and compute rentals), few achieved viable commercial models. Gensyn differentiates itself by targeting the most critical and expensive layer of AI: model training. Gensyn aims to organize globally distributed GPU resources into an open AI training network. Developers can submit training tasks, nodes provide computational power, and the network verifies results while distributing incentives. The core issue addressed is not decentralization for its own sake, but the increasing centralization of compute power among tech giants. In the era of large models, access to GPUs (like the H100) has become a decisive bottleneck, dictating the pace of AI development. Major AI companies are heavily dependent on large cloud providers for compute resources. Gensyn's approach is significant for several reasons: 1) It operates at the core infrastructure layer (model training), the most resource-intensive and technically demanding part of the AI value chain. 2) It proposes a more open, collaborative model for compute, potentially increasing resource utilization by dynamically pooling idle GPUs, similar to early cloud computing logic. 3) Its technical moat lies in solving complex challenges like verifying training results, ensuring node honesty, and maintaining reliability in a distributed environment, making it more of a deep-tech infrastructure company. 4) It targets a validated, high-growth market with genuine demand, rather than pursuing blockchain integration without purpose. Ultimately, the boundaries between Crypto and AI are blurring. AI requires global resource coordination, incentive mechanisms, and collaborative systems, areas where crypto-native solutions excel. Gensyn represents a step toward making advanced training capabilities more accessible and collaborative, moving beyond a niche controlled by a few giants. If successful, it could evolve into a fundamental piece of AI infrastructure, where the most enduring value in the AI era is often created.

marsbit · 9 hours ago


Why is China's AI Developing So Fast? The Answer Lies Inside the Labs

A US researcher's visit to China's top AI labs reveals distinct cultural and organizational factors driving China's rapid AI development. While talent, data, and compute are similar to the West, Chinese labs excel through a pragmatic, execution-focused culture: less emphasis on individual stardom and conceptual debate, and more on teamwork, engineering optimization, and mastering the full tech stack. A key advantage is the integration of young students and researchers who approach model-building with fresh perspectives and low ego, prioritizing collective progress over personal credit. This contrasts with the US culture of self-promotion and "star scientist" narratives. Chinese labs also exhibit a strong "build, don't buy" mentality, preferring to develop core capabilities—like data pipelines and environments—in-house rather than relying on external services. The ecosystem feels more collaborative than tribal, with mutual respect among labs. While government support exists, its scale is unclear, and technical decisions appear driven by labs, not state mandates. Chinese companies across sectors, from platforms to consumer tech, are building their own foundational models to control their tech destiny, reflecting a broader cultural drive for technological sovereignty. Demand for AI is emerging, with spending patterns potentially mirroring cloud infrastructure more than traditional SaaS. Despite challenges like a less mature data industry and GPU shortages, Chinese labs are propelled by vast talent, rapid iteration, and deep integration with the open-source community. The competition is evolving beyond a pure model race into a contest of organizational execution, developer ecosystems, and industrial pragmatism.

marsbit · 10 hours ago


3 Years, 5 Times: The Rebirth of a Century-Old Glass Factory

Corning, a 175-year-old glass company, is experiencing a dramatic revival as a key player in AI infrastructure, driven by surging demand for high-performance optical fiber in data centers. AI data centers require vastly more fiber than traditional ones—5 to 10 times as much per rack—to handle high-speed data transmission between GPUs. This structural demand shift, coupled with supply constraints from the lengthy expansion cycle for fiber preforms, has created a significant supply-demand gap. Nvidia has invested in Corning, along with Lumentum and Coherent, in a $4.5 billion total commitment to secure the optical supply chain for AI. Corning's competitive edge lies in its expertise in producing ultra-low-loss, high-density, and bend-resistant specialty fiber, which is critical for 800G+ and future 1.6T data rates. Its deep involvement in co-packaged optics (CPO) with partners like Nvidia further solidifies its position. While not the largest fiber manufacturer globally, Corning's revenue from enterprise/data center clients now exceeds 40% of its optical communications sales, and it has secured multi-year supply agreements with major hyperscalers including Meta and Nvidia. Financially, Corning's optical communications revenue has surged, doubling from $1.3 billion in 2023 to over $3 billion in 2025. Its stock price has risen nearly 6-fold since late 2023. Key future catalysts include the rollout of Nvidia's CPO products and the scale of undisclosed customer agreements. However, risks include high current valuations and potential disruption from next-generation technologies like hollow-core fiber. The company's long-term bet on light over electricity, maintained even through the telecom bubble crash, is now being validated by the AI boom.

marsbit · 11 hours ago

