Tokenization arrives onchain for institutions — AMA recap with Redbelly Network

Cointelegraph · Published on 2025-12-18 · Last updated on 2025-12-18

Abstract

Sponsored content: During a Cointelegraph AMA, Redbelly Network and AMAL Trustees discussed how tokenization and deterministic finality can modernize institutional asset workflows, moving beyond manual processes like spreadsheets and delayed reconciliations. The conversation highlighted the shift from experimental pilots to production-grade tokenization infrastructure, particularly in regulated markets and private credit. A key focus was Project Acacia, a Reserve Bank of Australia CBDC pilot, where a smart asset-backed security demonstrated streamlined issuance, servicing, and secondary trading on a single source of truth. Redbelly's deterministic consensus ensures irreversible settlement and high throughput, while its zkIdentity system enables compliance checks without exposing private data. The partnership aims to upgrade asset lifecycle management by combining Redbelly's technical infrastructure with AMAL's fiduciary expertise.

Sponsored Content

Routing trillions in assets through spreadsheets and monthly reconciliations creates delays, blind spots and operational risk. During a recent Cointelegraph AMA, Alan Burt, executive chairman of Redbelly Network, and Luke Andersen, chief product officer at AMAL Trustees, outlined how tokenization and deterministic finality can streamline these workflows without breaking existing fiduciary responsibilities.

“It’s 2025 — institutions should not be relying on email-based confirmations and delayed reconciliations when assets and cash move,” Andersen said, stressing that infrastructure must now match the scale and speed of capital markets.

From experiments to an endorsed infrastructure

The discussion began with an overview of the state of institutional tokenization. Over the past two years, administrators, asset servicers and central banks have moved from lab pilots to production-grade initiatives. For Redbelly, the focus is on regulated markets and private credit, where much of the infrastructure still runs on siloed databases and manual processes.

Burt explained that Redbelly set out to bring “all the lovely things we like in permissionless DeFi” to regulated environments, with an identity and custody layer that enables collateral mobility under existing rules. He pointed to private credit and alternative assets as a starting point, where tokenization can structure workflows and prepare assets for broader distribution over time.

Andersen added that for AMAL and IQ-EQ, tokenization has shifted from an innovation theme to a board-level agenda: “Tokenization is no longer treated as a moonshot experiment, but a strategic infrastructure upgrade that C-suite and board committees are actively exploring.”

Project Acacia: smart ABS and CBDC settlement on a public chain

A central part of the AMA focused on Project Acacia, the Reserve Bank of Australia’s CBDC pilot, where Redbelly and AMAL/IQ-EQ implemented a smart asset-backed security (ABS). The goal was to show how tokenized assets and wholesale CBDC can simplify issuance, servicing, reporting and secondary trading.

Burt walked through the current flow in securitization: an originator runs a loan book, a bank provides warehouse funding and a trustee then manages payments to multiple investor tranches. Each party maintains separate systems and reconciles data every month. In Project Acacia, the underlying loans, the ABS structure and secondary trading all moved to a shared infrastructure.

“What this allowed us to do is connect the originator, the warehouse and the trustee on a single source of truth, and then add secondary trading on the Australian Bond Exchange,” Burt said. “You know exactly what you’re holding and can see through to the underlying assets before you price or impair it.”
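The waterfall Burt describes can be sketched as a minimal data model in which the originator, warehouse funder and trustee all read the same tranche records instead of reconciling separate systems each month. This is an illustrative sketch only: the tranche names, amounts and sequential pay-down logic are assumptions for demonstration, not Redbelly's actual contract code.

```python
from dataclasses import dataclass

@dataclass
class Tranche:
    name: str
    balance: float      # outstanding principal owed to this tranche
    received: float = 0.0

def distribute(collections: float, tranches: list[Tranche]) -> None:
    """Sequential (waterfall) pay-down: senior tranches are repaid in
    full before any cash reaches junior tranches."""
    remaining = collections
    for t in tranches:  # list is ordered senior -> junior
        pay = min(remaining, t.balance)
        t.balance -= pay
        t.received += pay
        remaining -= pay

# One shared record that every party reads -- the "single source of truth".
tranches = [Tranche("senior", 80.0), Tranche("mezzanine", 15.0), Tranche("equity", 5.0)]
distribute(10.0, tranches)  # senior is paid down first; juniors untouched
```

Because every party computes against the same record, month-end reconciliation between separate systems becomes unnecessary: the ledger state is the reconciliation.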

Andersen described the pilot as a proof point for the trustee’s role in tokenized markets, showing how fiduciary oversight and digital execution can coexist. Securitization still relies on trust law and investor protection, but tokenization compresses timelines, reduces manual breakpoints and replaces document-driven flows with embedded rules, eligibility checks and deterministic settlement.

Deterministic finality, zkIdentity and a shared control layer

The AMA then turned to infrastructure requirements and risk management. For institutions, throughput alone is not enough; they need predictable costs, audit-ready accountability and guarantees that every transaction resolves in a single, irreversible state.

Burt explained Redbelly’s deterministic consensus and fixed gas model, which are designed for capital markets workloads. All validator nodes propose transactions to each other and agree on a combined “super block” before it is committed, which removes forks and reorganization risk. This approach, developed through research at the University of Sydney and the CSIRO, Australia’s national science agency, and backed by a patent, has been benchmarked at over 97,000 transactions per second with zero transaction loss under stress tests.
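The “super block” idea can be illustrated with a toy merge function: because every honest node applies the same deterministic combination of the agreed proposal set, all nodes commit an identical block, so there is nothing to fork or roll back. This is a conceptual sketch of the property, not Redbelly's consensus implementation; the validator names and transaction IDs are invented.

```python
import hashlib

def superblock(proposals: dict[str, list[str]]) -> list[str]:
    """Deterministically merge every validator's proposed transactions
    into one combined block. The merge is a pure function of the proposal
    set, so every node that sees the same proposals commits the same block."""
    return sorted({tx for batch in proposals.values() for tx in batch})

def block_hash(txs: list[str]) -> str:
    """Hash of the committed block; identical on every honest node."""
    return hashlib.sha256("\n".join(txs).encode()).hexdigest()

proposals = {
    "validator-a": ["tx3", "tx1"],
    "validator-b": ["tx2", "tx1"],
    "validator-c": ["tx3", "tx2"],
}
block = superblock(proposals)  # ['tx1', 'tx2', 'tx3'] on every node
```

The contrast with probabilistic chains is that finality here is a property of the merge itself: once the proposal set is agreed, the committed block is fully determined, which is what makes rollbacks impossible by construction.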

“We believe settlement is the core competency of the network,” Burt said. “Institutions need to know that once a block is created, there are no rollbacks, and that each transaction executes in the right sequence.”

Both speakers highlighted Redbelly’s zkIdentity system as another key piece. Rather than duplicating KYC and eligibility checks at every venue, users receive verifiable credentials that prove, in zero-knowledge, that they meet requirements for a given product or jurisdiction. Eligibility is checked at the network layer, while underlying data remains private and issuers still operate within their licences.
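A rough intuition for credential-based eligibility checks, heavily simplified: a licensed issuer verifies the user's private documents off-chain, then attests only to a boolean claim, and the network checks that attestation without ever seeing the underlying KYC data. This sketch uses an HMAC shared with the verifier, which real zero-knowledge systems avoid; the key, identifiers and claim strings are all invented for illustration and do not reflect Redbelly's zkIdentity design.

```python
import hashlib
import hmac

ISSUER_KEY = b"issuer-secret"  # held by the licensed credential issuer

def issue_credential(user_id: str, claim: str) -> str:
    """Issuer checks the user's private documents off-chain, then signs
    only the claim string (e.g. 'eligible:AU-wholesale'), not the data."""
    msg = f"{user_id}|{claim}".encode()
    return hmac.new(ISSUER_KEY, msg, hashlib.sha256).hexdigest()

def verify(user_id: str, claim: str, credential: str) -> bool:
    """Network-layer check: confirms the claim was attested by the issuer.
    No date of birth, income or other KYC detail ever leaves the issuer."""
    expected = issue_credential(user_id, claim)
    return hmac.compare_digest(expected, credential)

cred = issue_credential("alice", "eligible:AU-wholesale")
```

In a genuine zero-knowledge scheme the verifier needs no shared secret at all: the user presents a proof that the signed claim exists, which is what lets eligibility be enforced at the network layer on public rails.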

For AMAL/IQ-EQ, this addresses a structural compliance problem. Andersen noted: “zkIdentity lets us verify eligibility at the network layer without exposing private data, which enables regulated markets to operate on public rails while keeping controls and safeguards in place.”

The partnership between Redbelly and AMAL/IQ-EQ combines this technical foundation with existing licences, balance sheet strength and established transaction flows. AMAL Trustees brings the fiduciary role, legal enforceability and reporting obligations; Redbelly provides the shared ledger, deterministic settlement and identity layer. Together, they position tokenization as an upgrade to the way asset lifecycles are managed and funded.

Related Questions

Q: What is the main problem that tokenization and deterministic finality aim to solve for institutional asset management?

A: They aim to solve the problems of delays, blind spots and operational risk caused by routing trillions in assets through spreadsheets, email-based confirmations and manual monthly reconciliations.

Q: According to the AMA, how has the perception of tokenization changed at institutions like AMAL and IQ-EQ?

A: Tokenization has shifted from being treated as a moonshot experiment to a strategic infrastructure upgrade that is now a board-level agenda, with C-suite and board committees actively exploring it.

Q: What was the goal of Project Acacia, the pilot project with the Reserve Bank of Australia?

A: The goal was to demonstrate how tokenized assets and a wholesale CBDC can simplify the issuance, servicing, reporting and secondary trading of a smart asset-backed security (ABS) on a shared infrastructure.

Q: What are two key technical features of the Redbelly Network that make it suitable for institutional use?

A: Two key features are its deterministic consensus mechanism, which provides irreversible settlement and removes fork risk, and its zkIdentity system, which allows for verifiable eligibility checks without exposing private user data.

Q: How does the partnership between Redbelly Network and AMAL/IQ-EQ combine their respective strengths?

A: The partnership combines Redbelly's technical foundation (shared ledger, deterministic settlement, identity layer) with AMAL/IQ-EQ's existing licences, balance sheet strength, established transaction flows, fiduciary role, legal enforceability and reporting obligations.
