Lagarde Says ECB Needs Tokenised Money, Not Crypto Stablecoins

bitcoinist | Published on 2026-05-08 | Last updated on 2026-05-08

Abstract

ECB President Christine Lagarde argues Europe should not respond to dominant US dollar stablecoins by creating its own euro-denominated versions. Instead, she advocates for building tokenised financial infrastructure based on central bank money. Lagarde acknowledges stablecoins' growth and role in crypto settlement and cross-border payments, but identifies key risks: financial stability vulnerabilities, as shown by USDC's depegging during the Silicon Valley Bank collapse, and potential disruption to monetary policy transmission in the euro area. She contends that promoting private euro stablecoins under MiCAR would offer limited benefits for the euro's international role while introducing material trade-offs. A more effective path to strengthening the euro involves deeper capital market integration and developing a safe asset base. Lagarde is constructive on the transformative potential of DLT-based market infrastructure for Europe's fragmented financial system. The ECB's strategy focuses on public solutions like the Pontes project for wholesale settlement in central bank money and the Appia roadmap for an interoperable tokenised ecosystem by 2028. The goal is to harness innovation's benefits without replicating fragile instruments from elsewhere.

ECB President Christine Lagarde has pushed back against the idea that Europe should answer dollar crypto stablecoin dominance by promoting euro-denominated stablecoins of its own, arguing instead that the region should build tokenised financial infrastructure anchored in central bank money.

In a speech at the Banco de España LatAm Economic Forum in Roda de Bará, Spain, Lagarde framed stablecoins as one of the fastest-moving policy questions in global finance. The market, she said, has grown from less than $10 billion six years ago to more than $300 billion today, with close to 98% of stablecoins denominated in US dollars and nearly 90% controlled by Tether and Circle.

Lagarde: ECB Must Not Copy US Crypto Stablecoin Model

That concentration has turned crypto stablecoins into more than a crypto-market instrument. In Lagarde’s view, they now sit at the intersection of monetary power, financial stability and tokenised-market infrastructure.

“The growing argument is that to remain relevant, Europe must respond by promoting euro-denominated stablecoins of its own,” Lagarde said. “Otherwise, it faces a future of digital dollarisation and a loss of monetary sovereignty.”

But she argued that this framing misses the central issue. Stablecoins, according to Lagarde, perform two separate functions that are often conflated: a monetary function, by extending the reach of a currency, and a technological function, by acting as the cash leg for settlement on distributed ledger infrastructure.

“The argument I want to develop today is that once we disentangle those two functions, the case for promoting euro-denominated stablecoins is far weaker than it appears,” she said. “And a more fundamental question comes into view: do we actually need stablecoins to obtain the benefits they are said to provide? Or are we mistaking the instrument for the outcome?”

Lagarde acknowledged that stablecoins have become central to crypto settlement and increasingly relevant for cross-border payments, particularly in regions where access to stable currencies is limited. She also noted that dollar-backed stablecoins can reinforce demand for US Treasuries, especially if they become yield-bearing instruments.

That dynamic is now openly part of US policy. Lagarde pointed to the GENIUS Act, which the US administration has described not only as a consumer protection and financial stability measure, but also as a tool to support “the continued global dominance of the U.S. dollar” and strengthen demand for Treasuries.

For Europe, however, Lagarde said the monetary case for euro stablecoins is weak once risks are included. Under MiCAR, euro-denominated stablecoins could create additional demand for euro-area safe assets and marginally extend the euro’s international reach. Yet she argued that the trade-offs would be material.

The first is financial stability. Lagarde cited Circle’s USDC depeg during the Silicon Valley Bank collapse in March 2023, when Circle disclosed that $3.3 billion of USDC reserves were held at the failed bank and the token briefly fell to $0.877.
“The promise of par redemption depends on the very market confidence that can vanish when financial stability deteriorates,” she said. “And a mass redemption can accelerate that deterioration.”
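The depeg mechanics can be sketched with simple arithmetic. The $3.3 billion at-risk figure comes from the article; the total-reserve figure below is a hypothetical round number for illustration, not a quoted one.

```python
# Illustrative arithmetic only: how a reserve shortfall bounds the
# worst-case redemption value of a fully reserved stablecoin.
# The $3.3B at-risk figure is from the article; the $40B total-reserve
# figure is an assumed round number, not a reported value.

def worst_case_redemption_value(total_reserves: float,
                                reserves_at_risk: float,
                                recovery_rate: float = 0.0) -> float:
    """Value per token if at-risk reserves recover only `recovery_rate`."""
    recovered = total_reserves - reserves_at_risk * (1.0 - recovery_rate)
    return recovered / total_reserves

# Assume ~$40B of total reserves (hypothetical).
floor = worst_case_redemption_value(40e9, 3.3e9)
print(f"Implied floor with zero recovery: ${floor:.4f}")  # $0.9175
```

Note that USDC traded down to $0.877, below this mechanical floor: the market was pricing in redemption-queue and contagion risk on top of the direct exposure, which is exactly the confidence dynamic Lagarde describes.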

The second risk is monetary policy transmission. If retail deposits migrate into non-bank stablecoins and return to banks as wholesale funding, the ECB’s rate decisions may transmit less effectively through the banking system. Lagarde said this matters particularly in the euro area, where banks remain the dominant source of credit to the real economy.

Her conclusion was blunt: stablecoins are not an efficient way to strengthen the euro’s international role. The better route, she said, is deeper capital-market integration through Europe’s savings and investments union, alongside a safe asset base that matches the euro’s global ambitions.

Where Lagarde was more constructive was on tokenisation itself. She described DLT-based market infrastructure as genuinely transformative, especially for Europe’s fragmented financial system. In 2023, the EU had 295 trading venues, 14 central clearing counterparties and 32 central securities depositories, compared with two clearing houses and one central securities depository in the US.

Stablecoins currently fill the settlement gap in tokenised markets because they provide an on-chain unit of value for atomic settlement. But Lagarde argued that private stablecoins are a fragile and fragmented foundation for that role.
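The "cash leg" point can be made concrete with a toy sketch of atomic delivery-versus-payment: both the security leg and the cash leg settle in one step, or neither does. This is a hypothetical illustration, not any real platform's API; the `Ledger` class and `atomic_dvp` helper are invented for the example.

```python
# Minimal sketch of atomic delivery-versus-payment (DvP) on a single
# ledger. Hypothetical code for illustration; names are invented.
import copy

class Ledger:
    def __init__(self):
        self.balances = {}  # (account, asset) -> amount

    def credit(self, account, asset, amount):
        key = (account, asset)
        self.balances[key] = self.balances.get(key, 0) + amount

    def transfer(self, sender, receiver, asset, amount):
        key = (sender, asset)
        if self.balances.get(key, 0) < amount:
            raise ValueError(f"{sender} lacks {amount} {asset}")
        self.balances[key] -= amount
        self.credit(receiver, asset, amount)

def atomic_dvp(ledger, buyer, seller, security, qty, cash_asset, price):
    """Settle both legs or roll back entirely (all-or-nothing)."""
    snapshot = copy.deepcopy(ledger.balances)
    try:
        ledger.transfer(buyer, seller, cash_asset, price)  # cash leg
        ledger.transfer(seller, buyer, security, qty)      # security leg
    except ValueError:
        ledger.balances = snapshot  # no one-sided settlement survives
        raise
```

The atomicity here is a property of settling both legs on shared infrastructure, not of the cash instrument itself: the `cash_asset` could equally be a private stablecoin or, as in the Pontes design, a claim settled in central bank money.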

The ECB’s answer is public infrastructure. From September, the Eurosystem plans to offer wholesale settlement through the Pontes project, linking DLT platforms to TARGET so transactions can settle in central bank money. Lagarde also pointed to the Appia roadmap, published in March, which aims to support a fully interoperable European tokenised financial ecosystem by 2028.

“Europe knows which port it is sailing to,” Lagarde concluded. “Our task is not to replicate instruments developed elsewhere, but to build the foundations and the infrastructure that serve our own objectives, so that we can harness the benefits of innovation without importing the fragilities.”

At press time, the total crypto market cap stood at $2.64 trillion.

Total crypto market cap faces the 20-month EMA, 1-week chart | Source: TOTAL on TradingView.com

Related Questions

Q: What is Christine Lagarde's main argument against promoting euro-denominated stablecoins?

A: She argues that the monetary case for euro stablecoins is weak once financial stability and monetary policy transmission risks are considered, and that Europe should instead build tokenised financial infrastructure anchored in central bank money.

Q: According to Lagarde, what two functions do stablecoins perform that are often conflated?

A: Stablecoins perform a monetary function, by extending the reach of a currency, and a technological function, by acting as the cash leg for settlement on distributed ledger infrastructure.

Q: What are the two main risks Lagarde associates with private stablecoins?

A: The two main risks are financial stability (as demonstrated by the USDC depeg during the Silicon Valley Bank collapse) and the impairment of monetary policy transmission if retail deposits migrate into stablecoins.

Q: What European public infrastructure projects does Lagarde mention as alternatives to stablecoins for tokenised markets?

A: She mentions the Pontes project (starting in September) to offer wholesale settlement in central bank money, and the Appia roadmap, which aims to support a fully interoperable European tokenised financial ecosystem by 2028.

Q: How does Lagarde describe the current dominance of US dollar stablecoins in the market?

A: She states that close to 98% of stablecoins are denominated in US dollars, with nearly 90% of the market controlled by Tether and Circle.
