Coinbase CEO calls tokenized stocks ‘inevitable’ amid CLARITY Act uncertainty

ambcrypto · Published on 2026-01-18 · Last updated on 2026-01-18

Abstract

Coinbase CEO Brian Armstrong remains highly optimistic about tokenized stocks, calling them "inevitable" due to their potential to be faster, cheaper, and more global, despite recent regulatory uncertainty surrounding the CLARITY Act. The tokenized stock market has grown rapidly to $867 million, nearing $1 billion, with projections suggesting it could reach trillions by 2030 under clear regulation. A Bitwise survey indicated strong institutional interest, with stablecoins and tokenization being the top focus among financial advisors. However, the industry is divided on proposed crypto legislation, with Coinbase withdrawing support over concerns that the Senate bill bans tokenized stocks, while others like Robinhood downplayed these issues. BNB Chain has recently overtaken Solana as the leading settlement layer for tokenized equities.

In less than a year, the tokenized stock market has risen from zero to nearly $1 billion and could explode if regulatory clarity is established for the sector.

Despite the recent legislative hiccups with the CLARITY Act, the Coinbase CEO has remained bullish on on-chain stocks. On X (formerly Twitter), he said tokenized stocks will be huge and added,

“It’s inevitable – faster, cheaper, more global”

Sizing tokenized markets

Tokenized equities and ETFs are the on-chain version of traditional shares. Most projections for the sector range from a few trillion dollars to tens of trillions by 2030.

McKinsey estimates the market could reach $3.8 trillion by 2030 in an accelerated-adoption scenario with clear and permissive regulation.

In other words, the massive potential is undeniable. In fact, a recent survey by asset manager Bitwise found that stablecoins and tokenization had the highest interest among the financial advisors interviewed.

“Stablecoins and tokenization attracted the most interest (30%), followed by “digital gold”/fiat debasement (22%) and crypto-linked AI investments (19%).”

This was a telltale sign of the immense potential and institutional interest in tokenization. Commenting on the survey, Bitwise CIO Matt Hougan said,

“Crypto’s future has always depended on what financial advisors think of it.”

Tokenization rules split industry

Future growth, however, hinges on clear rules for issuers, and the industry appears divided over the provisions of the Senate's crypto market structure bill regarding tokenized securities.

In Coinbase's reading, the Senate draft would ban tokenized stocks and stablecoin rewards, prompting the exchange to withdraw its support earlier in the week.

But other leaders, such as Robinhood’s chief legal officer and former SEC commissioner Dan Gallagher, downplayed the concerns as “overblown.”

“Concerns about tokenization in the Senate bill are overblown, but we’ll work with Congress to address any lingering uncertainty.”

It remains to be seen whether a deal will be reached to reignite the bill’s momentum and usher in the tokenization boom.

Meanwhile, the tokenized stock market has reached $867 million and is inching closer to $1 billion. Notably, the sector saw an 11% surge in monthly transfer volume to $2.3 billion, while holders increased by 22% to 159,000.

This was indicative of accelerated early adoption and appetite for tokenized stocks.

At the settlement-chain level, Solana had led in traction since last July, but BNB Chain flipped it and has held the top spot for the past two months.


Final Thoughts

  • Coinbase CEO Brian Armstrong remains bullish on tokenized stocks despite regulatory uncertainty.
  • BNB Chain flipped Solana as the top settlement layer for tokenized stocks and ETFs.

Related Questions

Q: What is the Coinbase CEO's view on tokenized stocks despite the uncertainty surrounding the CLARITY Act?

A: Coinbase CEO Brian Armstrong remains bullish on tokenized stocks, calling them 'inevitable' and describing them as 'faster, cheaper, more global'.

Q: What is the projected market size for tokenized equities and ETFs by 2030 according to McKinsey?

A: McKinsey projects that the tokenized market could reach $3.8 trillion in an accelerated-adoption scenario with clear and permissive regulation.

Q: According to a Bitwise survey, which crypto-related topic attracted the most interest from financial advisors?

A: Stablecoins and tokenization attracted the most interest at 30%, followed by 'digital gold'/fiat debasement (22%) and crypto-linked AI investments (19%).

Q: Why did Coinbase withdraw its support from the Senate's crypto market structure bill?

A: Coinbase withdrew its support because it interpreted the Senate draft as banning tokenized stocks and stablecoin rewards.

Q: Which blockchain has recently become the top settlement layer for tokenized stocks and ETFs?

A: BNB Chain flipped Solana and has held the lead as the top settlement layer for tokenized stocks and ETFs over the past two months.

