Blockchain trial on Canton Network tests collateral reuse with tokenized US Treasurys

Cointelegraph · Published 2025-12-09 · Updated 2025-12-09

Summary

Digital Asset and a group of major financial institutions have successfully conducted a second round of onchain US Treasury financing trials on the Canton Network. The test introduced real-time collateral reuse using tokenized US Treasurys and expanded the use of multiple stablecoins, including USDC, to enhance onchain liquidity. This innovation bypasses traditional operational delays associated with rehypothecation. Participants included Bank of America, Citadel Securities, Cumberland DRW, Virtu Financial, Société Générale, Tradeweb, Circle, Brale, and M1X Global. The Canton Network, a blockchain built for institutional finance, continues to grow in the tokenized real-world assets (RWA) space, with over $370 billion represented onchain, leading the market significantly.

Digital Asset and a group of financial institutions have completed a second round of onchain US Treasury financing on the Canton Network, introducing real-time collateral reuse and expanding the number of stablecoins involved.

Five transactions were executed in the newest phase, building on the July pilot, which first demonstrated that US Treasurys and the USDC stablecoin could be combined to finance and settle transactions on the blockchain.

In the latest trial, the companies used multiple stablecoins to finance positions against tokenized US Treasurys, widening the pool of onchain liquidity available for financing transactions.

The trial showed that tokenized US Treasurys could be passed between counterparties and reused as collateral in real time, sidestepping the operational delays that typically accompany rehypothecation in traditional finance.

The effort brought together Bank of America, Citadel Securities, Cumberland DRW, Virtu Financial, Société Générale, Tradeweb, Circle, Brale and M1X Global, all of which are part of the Canton Network’s Industry Working Group.

Kelly Mathieson, chief business development officer at Digital Asset — the company behind the Canton Network — said in a statement that the test was “part of a thoughtful progression toward a new market model.”

Justin Peterson, chief technology officer of Tradeweb, added that “demonstrating real-time collateral reuse and expanded stablecoin liquidity isn’t just a technical achievement — it’s a blueprint for what the future of institutional finance can look like.”

Related: ‘We refused to do an ICO’: The truth behind Canton’s tokenomics

Canton Network expands footprint in tokenized RWAs

The Canton Network, a layer-1 blockchain built for institutional finance, has been expanding its presence across the tokenization sector this year.

On Dec. 4, its developer Digital Asset secured roughly $50 million in strategic backing from BNY, iCapital, Nasdaq and S&P Global. The new funding followed a $135 million raise earlier this year and is intended to support the network’s scaling efforts.

In October, asset manager Franklin Templeton said it would migrate its Benji Investments platform — which tokenizes shares of the firm’s flagship US money market fund — to the Canton Network.

Data from RWA.xyz also shows the Canton Network now leads the market for tokenized real-world assets by a wide margin, with more than $370 billion represented onchain, far outpacing popular networks such as Ethereum, Polygon, Solana and other public chains.

Top blockchains for RWA. Source: RWA.xyz

Magazine: 6 reasons Jack Dorsey is definitely Satoshi... and 5 reasons he’s not

Related Reading

Google and Amazon Simultaneously Invest Heavily in a Competitor: The Most Absurd Business Logic of the AI Era Is Becoming Reality

In a span of four days, Amazon announced an additional $25 billion investment, and Google pledged up to $40 billion—both direct competitors pouring over $65 billion into the same AI startup, Anthropic. Rather than a typical venture capital move, this signals the latest escalation in the cloud wars. The core of the deal is not equity but compute pre-orders: Anthropic must spend the majority of these funds on AWS and Google Cloud services and chips, effectively locking in massive future compute consumption. This reflects a shift in cloud market dynamics—enterprises now choose cloud providers based on which hosts the best AI models, not just price or stability. With OpenAI deeply tied to Microsoft, Anthropic’s Claude has become the only viable strategic asset for Google and Amazon to remain competitive. Anthropic’s annualized revenue has surged to $30 billion, and it is expanding into verticals like biotech, positioning itself as a cross-industry AI infrastructure layer. However, this funding comes with constraints: Anthropic’s independence is challenged as it balances two rival investors, its safety-first narrative faces pressure from regulatory scrutiny, and its path to IPO introduces new financial pressures. Globally, this accelerates a "tri-polar" closed-loop structure in AI infrastructure, with Microsoft-OpenAI, Google-Anthropic, and Amazon-Anthropic forming exclusive model-cloud alliances. In contrast, China’s landscape differs—investments like Alibaba and Tencent backing open-source model firm DeepSeek reflect a more decoupled approach, though closed-source models from major cloud providers still dominate. The $65 billion bet is ultimately about securing a seat at the table in an AI-defined future—where missing the model layer means losing the cloud war.

marsbit · 1 hour ago

Computing Power Constrained, Why Did DeepSeek-V4 Open Source?

DeepSeek-V4 has been released as a preview open-source model, featuring 1 million tokens of context length as a baseline capability—previously a premium feature locked behind enterprise paywalls by major overseas AI firms. The official announcement, however, openly acknowledges computational constraints, particularly limited service throughput for the high-end DeepSeek-V4-Pro version due to restricted high-end computing power. Rather than competing on pure scale, DeepSeek adopts a pragmatic approach that balances algorithmic innovation with hardware realities in China’s AI ecosystem. The V4-Pro model uses a highly sparse architecture with 1.6T total parameters but only activates 49B during inference. It performs strongly in agentic coding, knowledge-intensive tasks, and STEM reasoning, competing closely with top-tier closed models like Gemini Pro 3.1 and Claude Opus 4.6 in certain scenarios. A key strategic product is the Flash edition, with 284B total parameters but only 13B activated—making it cost-effective and accessible for mid- and low-tier hardware, including domestic AI chips from Huawei (Ascend), Cambricon, and Hygon. This design supports broader adoption across developers and SMEs while stimulating China's domestic semiconductor ecosystem. Despite facing talent outflow and intense competition in user traffic—with rivals like Doubao and Qianwen leading in monthly active users—DeepSeek has maintained technical momentum. The release also comes amid reports of a new funding round targeting a valuation exceeding $10 billion, potentially setting a new record in China’s LLM sector. Ultimately, DeepSeek-V4 represents a shift toward open yet realistic infrastructure development in the constrained compute landscape of Chinese AI, emphasizing engineering efficiency and domestic hardware compatibility over pure model scale.

marsbit · 1 hour ago
