Tokenized assets hit $21B, but are new chains starting to matter?

ambcrypto · Published 2026-01-23 · Last updated 2026-01-23

Summary

Tokenized real-world assets (RWAs) have reached a total value locked (TVL) of $21 billion, with U.S. Treasury debt making up the largest portion at over $9 billion. Commodities and private credit follow at $3.7 billion and $2.5 billion, respectively. While Ethereum remains the dominant platform, hosting nearly $200 billion in tokenized value—primarily in stablecoins—other chains like Arbitrum are gaining attention. Despite Ethereum's early advantages in liquidity and infrastructure, the RWA market is projected to expand significantly, with estimates ranging from $2-4 trillion to as high as $16 trillion by 2030. The question remains whether new chains will challenge Ethereum's leading position in the future.

Tokenized real-world assets (RWAs) have gained significant ground, with their total value locked (TVL) now crossing $21 billion. While Ethereum [ETH] hosts the bulk of these assets, relatively smaller networks like Arbitrum [ARB] have drawn attention too.

Beyond niche status

According to the latest data, US Treasury debt dominates the $21 billion tokenized RWA TVL, accounting for over $9 billion. It’s followed by commodities at around $3.7 billion and private credit at roughly $2.5 billion.

Corporate bonds and institutional funds also make up a growing share, while real estate and private equity remain smaller but present.
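To put the breakdown above in relative terms, here is a minimal sketch that computes each named category's share of the reported $21 billion TVL; the dollar figures come from the article, and the "other" bucket (corporate bonds, institutional funds, real estate, private equity, etc.) is simply inferred as the remainder.

```python
# Share of each RWA category in the reported $21B TVL.
# Figures (USD billions) are from the article; "other" is the inferred remainder.
tvl_total = 21.0
categories = {
    "us_treasuries": 9.0,
    "commodities": 3.7,
    "private_credit": 2.5,
}
categories["other"] = tvl_total - sum(categories.values())

# Percentage share of each category, rounded to one decimal place.
shares = {name: round(100 * value / tvl_total, 1) for name, value in categories.items()}
for name, pct in shares.items():
    print(f"{name}: {pct}%")
```

On these figures, US Treasuries alone account for roughly 43% of the tokenized RWA market, which is why they dominate most RWA dashboards.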

Beyond current numbers, McKinsey has estimated that tokenized assets could reach $2-4 trillion by 2030. Furthermore, Boston Consulting Group has forecasted a much larger $16 trillion market.

There’s definitely more room for expansion.

Ethereum is the place to be

While the RWA market is still relatively small, most tokenized assets today are on Ethereum. According to Token Terminal, the network hosts close to $200 billion worth of tokenized value across stablecoins, tokenized funds, commodities, and stocks.

As it stands, stablecoins make up the largest share by a wide margin, far outweighing other categories.

The numbers make Ethereum’s early lead in tokenization infrastructure obvious. Liquidity, a mature ecosystem, and developer support have helped it become the preferred choice for RWAs so far.

But will this dominance last?

New RWA demand may be forming elsewhere...

Related questions

Q: What is the current total value locked (TVL) for tokenized real-world assets (RWAs)?

A: The total value locked (TVL) for tokenized real-world assets has crossed $21 billion.

Q: Which type of tokenized RWA has the largest market share according to the latest data?

A: US Treasury debt dominates the tokenized RWA TVL, accounting for over $9 billion.

Q: Which blockchain network currently hosts the majority of tokenized assets?

A: Ethereum hosts the bulk of these assets, with close to $200 billion worth of tokenized value across various categories.

Q: What are the future market size estimates for tokenized assets by 2030 according to the mentioned consulting firms?

A: McKinsey estimated tokenized assets could reach $2-4 trillion by 2030, while Boston Consulting Group forecasted a much larger $16 trillion market.

Q: What factors have contributed to Ethereum becoming the preferred choice for RWAs so far?

A: Liquidity, a mature ecosystem, and developer support have helped Ethereum become the preferred choice for RWAs.

