Tokenized assets hit $21B, but are new chains starting to matter?

ambcrypto · Published 2026-01-23 · Updated 2026-01-23

Summary

Tokenized real-world assets (RWAs) have reached a total value locked (TVL) of $21 billion, with U.S. Treasury debt making up the largest portion at over $9 billion. Commodities and private credit follow at $3.7 billion and $2.5 billion, respectively. While Ethereum remains the dominant platform, hosting nearly $200 billion in tokenized value—primarily in stablecoins—other chains like Arbitrum are gaining attention. Despite Ethereum's early advantages in liquidity and infrastructure, the RWA market is projected to expand significantly, with estimates ranging from $2-4 trillion to as high as $16 trillion by 2030. The question remains whether new chains will challenge Ethereum's leading position in the future.

Tokenized real-world assets (RWAs) have gained significant ground, with their total value locked (TVL) now crossing $21 billion. While Ethereum [ETH] hosts the bulk of these assets, smaller networks like Arbitrum [ARB] have drawn attention too.

Beyond niche status

According to the latest data, US Treasury debt dominates the $21 billion tokenized RWA TVL, accounting for over $9 billion. It’s followed by commodities at around $3.7 billion and private credit at roughly $2.5 billion.

Corporate bonds and institutional funds also make up a growing share, while real estate and private equity are smaller but present.
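To put the breakdown above in relative terms, here is a minimal sketch computing each category's share of the ~$21 billion total, using the approximate figures cited in the article (the exact values, and the size of the residual "other" bucket, are approximations, not official data):

```python
# Approximate figures from the article, in billions of USD.
tvl_total = 21.0
categories = {
    "US Treasury debt": 9.0,
    "Commodities": 3.7,
    "Private credit": 2.5,
}

for name, value in categories.items():
    share = value / tvl_total * 100
    print(f"{name}: ${value}B ({share:.1f}% of TVL)")

# Remainder spans corporate bonds, institutional funds, real estate, etc.
other = tvl_total - sum(categories.values())
print(f"Other categories: ~${other:.1f}B")
```

On these numbers, Treasury debt alone is roughly 43% of the market, underscoring how concentrated tokenized RWAs still are in a single asset class.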

Beyond current numbers, McKinsey has estimated that tokenized assets could reach $2-4 trillion by 2030, while Boston Consulting Group has forecast a much larger $16 trillion market.
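A quick way to gauge how aggressive those forecasts are is the compound annual growth rate they imply. The sketch below assumes the article's ~$21 billion TVL as a 2026 baseline (the baseline year is an assumption for illustration; only the 2030 targets come from the cited McKinsey and BCG projections):

```python
def implied_cagr(start_value, end_value, years):
    """Compound annual growth rate needed to go from start to end."""
    return (end_value / start_value) ** (1 / years) - 1

baseline = 21e9  # ~$21B tokenized-RWA TVL today (assumed 2026 baseline)
years = 4        # 2026 -> 2030

for label, target in [("McKinsey low, $2T", 2e12),
                      ("McKinsey high, $4T", 4e12),
                      ("BCG, $16T", 16e12)]:
    print(f"{label}: ~{implied_cagr(baseline, target, years):.0%}/yr")
```

Even the low end of the McKinsey range implies roughly tripling every year through 2030, which illustrates just how much expansion these forecasts assume.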

There’s definitely more room for expansion.

Ethereum is the place to be

While the RWA market is still relatively small, most tokenized assets today are on Ethereum. According to Token Terminal, the network hosts close to $200 billion worth of tokenized value across stablecoins, tokenized funds, commodities, and stocks.

As it stands, stablecoins make up the largest share by a wide margin, far outweighing other categories.

The numbers make Ethereum’s early lead in tokenization infrastructure obvious. Liquidity, a mature ecosystem, and developer support have helped it become the preferred choice for RWAs so far.

But will this dominance last?

New RWA demand may be forming elsewhere...

Related Questions

Q: What is the current total value locked (TVL) for tokenized real-world assets (RWAs)?

A: The total value locked (TVL) for tokenized real-world assets has crossed $21 billion.

Q: Which type of tokenized RWA has the largest market share according to the latest data?

A: US Treasury debt dominates the tokenized RWA TVL, accounting for over $9 billion.

Q: Which blockchain network currently hosts the majority of tokenized assets?

A: Ethereum hosts the bulk of these assets, with close to $200 billion worth of tokenized value across various categories.

Q: What are the future market size estimates for tokenized assets by 2030 according to the mentioned consulting firms?

A: McKinsey estimated tokenized assets could reach $2-4 trillion by 2030, while Boston Consulting Group forecast a much larger $16 trillion market.

Q: What factors have contributed to Ethereum becoming the preferred choice for RWAs so far?

A: Liquidity, a mature ecosystem, and developer support have helped Ethereum become the preferred choice for RWAs.

Related Readings

Gensyn AI: Don't Let AI Repeat the Mistakes of the Internet

In recent months, the rapid growth of the AI industry has attracted significant talent from the crypto sector. A persistent question among researchers intersecting both fields is whether blockchain can become a foundational part of AI infrastructure. While many previous AI and Crypto projects focused on application layers (like AI Agents, on-chain reasoning, data markets, and compute rentals), few achieved viable commercial models. Gensyn differentiates itself by targeting the most critical and expensive layer of AI: model training. Gensyn aims to organize globally distributed GPU resources into an open AI training network. Developers can submit training tasks, nodes provide computational power, and the network verifies results while distributing incentives. The core issue addressed is not decentralization for its own sake, but the increasing centralization of compute power among tech giants. In the era of large models, access to GPUs (like the H100) has become a decisive bottleneck, dictating the pace of AI development. Major AI companies are heavily dependent on large cloud providers for compute resources. Gensyn's approach is significant for several reasons: 1) It operates at the core infrastructure layer (model training), the most resource-intensive and technically demanding part of the AI value chain. 2) It proposes a more open, collaborative model for compute, potentially increasing resource utilization by dynamically pooling idle GPUs, similar to early cloud computing logic. 3) Its technical moat lies in solving complex challenges like verifying training results, ensuring node honesty, and maintaining reliability in a distributed environment—making it more of a deep-tech infrastructure company. 4) It targets a validated, high-growth market with genuine demand, rather than pursuing blockchain integration without purpose. Ultimately, the boundaries between Crypto and AI are blurring. 
AI requires global resource coordination, incentive mechanisms, and collaborative systems—areas where crypto-native solutions excel. Gensyn represents a step toward making advanced training capabilities more accessible and collaborative, moving beyond a niche controlled by a few giants. If successful, it could evolve into a fundamental piece of AI infrastructure, where the most enduring value in the AI era is often created.

marsbit · 10 hours ago


Why is China's AI Developing So Fast? The Answer Lies Inside the Labs

A US researcher's visit to China's top AI labs reveals distinct cultural and organizational factors driving China's rapid AI development. While talent, data, and compute are similar to the West, Chinese labs excel through a pragmatic, execution-focused culture: less emphasis on individual stardom and conceptual debate, and more on teamwork, engineering optimization, and mastering the full tech stack. A key advantage is the integration of young students and researchers who approach model-building with fresh perspectives and low ego, prioritizing collective progress over personal credit. This contrasts with the US culture of self-promotion and "star scientist" narratives. Chinese labs also exhibit a strong "build, don't buy" mentality, preferring to develop core capabilities—like data pipelines and environments—in-house rather than relying on external services. The ecosystem feels more collaborative than tribal, with mutual respect among labs. While government support exists, its scale is unclear, and technical decisions appear driven by labs, not state mandates. Chinese companies across sectors, from platforms to consumer tech, are building their own foundational models to control their tech destiny, reflecting a broader cultural drive for technological sovereignty. Demand for AI is emerging, with spending patterns potentially mirroring cloud infrastructure more than traditional SaaS. Despite challenges like a less mature data industry and GPU shortages, Chinese labs are propelled by vast talent, rapid iteration, and deep integration with the open-source community. The competition is evolving beyond a pure model race into a contest of organizational execution, developer ecosystems, and industrial pragmatism.

marsbit · 12 hours ago


3 Years, 5 Times: The Rebirth of a Century-Old Glass Factory

Corning, a 175-year-old glass company, is experiencing a dramatic revival as a key player in AI infrastructure, driven by surging demand for high-performance optical fiber in data centers. AI data centers require vastly more fiber than traditional ones—5 to 10 times as much per rack—to handle high-speed data transmission between GPUs. This structural demand shift, coupled with supply constraints from the lengthy expansion cycle for fiber preforms, has created a significant supply-demand gap. Nvidia has invested in Corning, along with Lumentum and Coherent, in a $4.5 billion total commitment to secure the optical supply chain for AI. Corning's competitive edge lies in its expertise in producing ultra-low-loss, high-density, and bend-resistant specialty fiber, which is critical for 800G+ and future 1.6T data rates. Its deep involvement in co-packaged optics (CPO) with partners like Nvidia further solidifies its position. While not the largest fiber manufacturer globally, Corning's revenue from enterprise/data center clients now exceeds 40% of its optical communications sales, and it has secured multi-year supply agreements with major hyperscalers including Meta and Nvidia. Financially, Corning's optical communications revenue has surged, doubling from $1.3 billion in 2023 to over $3 billion in 2025. Its stock price has risen nearly 6-fold since late 2023. Key future catalysts include the rollout of Nvidia's CPO products and the scale of undisclosed customer agreements. However, risks include high current valuations and potential disruption from next-generation technologies like hollow-core fiber. The company's long-term bet on light over electricity, maintained even through the telecom bubble crash, is now being validated by the AI boom.

marsbit · 12 hours ago

