On Stablecoins, Real-World Asset Tokenization, Payments, and Finance
Better, Smarter Stablecoin On/Off Ramps
Last year, stablecoin transaction volume was estimated to reach $46 trillion, repeatedly hitting new all-time highs. To put that in perspective, that's more than 20 times the transaction volume of PayPal; close to 3 times the transaction volume of Visa (one of the world's largest payment networks); and rapidly approaching the transaction volume of ACH (the electronic network used for financial transactions like direct deposits in the US).
Today, you can send a stablecoin in less than a second for a fee of less than a cent. However, the unsolved problem is how to connect these digital dollars to the financial rails people actually use in their daily lives—in other words, the on/off ramps for stablecoins.
A new generation of startups is filling this gap, connecting stablecoins to more familiar payment systems and local currencies. Some use cryptographic proofs to allow people to privately exchange local balances for digital dollars. Others integrate with regional networks that use QR codes, real-time payment rails, and other features to enable interbank payments... while others are building a truly interoperable global wallet layer and card issuance platform, allowing users to spend stablecoins at everyday merchants. In short, these approaches broaden the range of people who can participate in the digital dollar economy—and could accelerate the use of stablecoins more directly for mainstream payments.
As these on/off ramps mature, and digital dollars gain direct access to local payment systems and merchant tools, new behaviors will emerge. Cross-border workers can get paid in real time. Merchants can receive global dollars without a bank account. Applications can settle value instantly with users anywhere. Stablecoins will fundamentally transform from a niche financial instrument into the internet's foundational settlement layer.
~Jeremy Zhang, a16z crypto engineering team
Thinking More Crypto-Natively About Real-World Asset Tokenization and Stablecoins
We see strong interest from banks, fintech companies, and asset managers in putting US equities, commodities, indices, and other traditional assets on-chain. But as more traditional assets come on-chain, their tokenization is often skeuomorphic—rooted in existing real-world asset structures rather than leveraging crypto-native properties.
But synthetic representations like perpetual futures allow for deeper liquidity and are often easier to implement. Perpetual futures also offer easily understood leverage, so I believe they are the crypto-native derivative with the strongest product-market fit. I also believe that emerging-market equities are one of the asset classes best suited for perpetualization. (The zero-day options market for certain stocks often trades with deeper liquidity than the spot market, making those stocks an interesting experiment in perpetualization.)
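To make the mechanics concrete, here is a minimal sketch—in Python, with invented numbers—of how a perpetual future tracks an underlying asset without ever expiring: a periodic funding payment flows between longs and shorts whenever the perp price drifts from the reference (index) price, pulling the two back together. This illustrates the general design, not any specific exchange's formula.

```python
# Illustrative perpetual-futures funding mechanic (not any exchange's exact formula).
# Longs pay shorts when the perp trades above the index; shorts pay longs when below.

def funding_payment(perp_price: float, index_price: float,
                    position_size: float, interval_rate_cap: float = 0.0075) -> float:
    """Funding owed by a long of `position_size` units for one interval.

    Positive -> the long pays; negative -> the long receives.
    """
    premium = (perp_price - index_price) / index_price                        # how far the perp has drifted
    funding_rate = max(-interval_rate_cap, min(interval_rate_cap, premium))   # clamp per interval
    return funding_rate * position_size * index_price                         # settled in quote currency

# Example: a perp on an emerging-market stock trading 0.5% above its index price.
payment = funding_payment(perp_price=100.5, index_price=100.0, position_size=10)
print(f"Long pays {payment:.2f} per funding interval")  # longs pay, nudging the perp back toward the index
```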
It all boils down to the question of "perpetualization vs. tokenization"; but regardless, we should see more crypto-native real-world asset tokenization in the coming year.
Along similar lines: stablecoins went mainstream in 2025, and the volume of stablecoins issued continues to grow; in 2026, we will see more "native issuance, not just tokenization."
But stablecoins without a strong credit infrastructure look like narrow banks, holding only a narrow set of supposedly ultra-safe, liquid assets. While narrow banking is a valid product, I don't believe it will be the long-term backbone of the on-chain economy.
We've already seen many new asset managers, custodians, and protocols beginning to facilitate on-chain, asset-backed loans against off-chain collateral. These loans often originate off-chain and are then tokenized. I think tokenization offers little benefit here except perhaps for distribution to users already on-chain. This is why debt assets should be issued natively on-chain, not issued off-chain and then tokenized. Native on-chain issuance can reduce loan servicing and back-office structuring costs, and improve accessibility. The challenge here will partly be compliance and standardization, but builders are already working on these issues.
~Guy Wuollet, a16z crypto General Partner
Stablecoins Unlock the Bank Ledger Upgrade Cycle—and New Payment Scenarios
The software the average bank runs is unrecognizable to modern developers: in the 1960s and 70s, banks were early adopters of large-scale software systems. The second generation of core banking software arrived in the 80s and 90s (e.g., Temenos's GLOBUS and Infosys's Finacle). But all this software has aged and is upgraded too slowly. As a result, banking—especially the critical core ledger, the key database tracking deposits, collateral, and other debts—still often runs on mainframes, programmed in COBOL, using batch-file interfaces instead of APIs.
The vast majority of global assets sit on these decades-old core ledgers. While these systems are battle-tested, trusted by regulators, and deeply integrated into complex banking scenarios, they also hinder innovation. Adding a critical feature like real-time payments can take months—more likely years—and requires navigating layers of technical debt and regulatory complexity.
This is where stablecoins come in. Not only did stablecoins find product-market fit and go mainstream over the past few years; this year, traditional financial institutions embraced them on a whole new level. Stablecoins, tokenized deposits, tokenized treasuries, and on-chain bonds enable banks, fintechs, and financial institutions to build new products and serve new customers. More importantly, they can do this without forcing these organizations to rewrite their legacy systems—systems that, while aging, have reliably operated for decades. Thus, stablecoins offer a new path to institutional innovation.
~Sam Broner
The Internet Becomes the Bank
As agents emerge at scale, and more commerce happens automatically in the background rather than through user clicks, the way money—value!—moves needs to change.
In a world where systems act on intent rather than step-by-step instructions—where funds move because an AI agent identifies a need, fulfills an obligation, or triggers an outcome—value must flow as quickly and freely as information does today. This is where blockchains, smart contracts, and new protocols come in.
Smart contracts can already settle a dollar payment globally in seconds. But in 2026, emerging primitives like x402 will make this settlement programmable and reactive: agents pay each other instantly and permissionlessly for data, GPU time, or API calls—no invoicing, reconciliation, or batching required. Developers ship software with payment rules, limits, and audit trails built in—no fiat integrations, merchant onboarding, or banks required. As events unfold, prediction markets self-settle in real time—odds update, agents trade, payments clear globally in seconds... no custodians or exchanges needed.
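A rough sketch of the pattern—with hypothetical function and field names, not the actual x402 specification—looks like this: the server rejects an unpaid request with HTTP 402 and a price quote, the buying agent pays the quoted stablecoin amount if it fits its budget, and then retries the call.

```python
# Sketch of an HTTP-402-style machine payment flow between two agents.
# The Quote fields and the serve/buy_resource helpers are illustrative stand-ins,
# not the real x402 API; the stablecoin transfer itself is elided.

from dataclasses import dataclass

@dataclass
class Quote:
    price_usdc: float   # price of the resource in stablecoin
    pay_to: str         # seller's on-chain address

def serve(request_paid: bool, quote: Quote) -> tuple[int, str]:
    """Seller side: demand payment first, then serve the resource."""
    if not request_paid:
        return 402, f"Payment Required: {quote.price_usdc} USDC to {quote.pay_to}"
    return 200, "here are your API results"

def buy_resource(quote: Quote, budget_usdc: float) -> str:
    """Buyer agent: pay the quoted price (if within budget) and retry the call."""
    status, body = serve(request_paid=False, quote=quote)
    if status == 402 and quote.price_usdc <= budget_usdc:
        # In a real flow this would be a signed stablecoin transfer, verified on-chain
        # by the seller before releasing the resource; here we just flip a flag.
        status, body = serve(request_paid=True, quote=quote)
    return body

print(buy_resource(Quote(price_usdc=0.002, pay_to="0xSellerAddress"), budget_usdc=0.01))
```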
Once value can move this way, "payment flows" cease to be a separate operational layer and become a network behavior: banking becomes part of the internet's foundational plumbing, assets become infrastructure. If money becomes a data packet the internet can route, then the internet doesn't just support the financial system... it becomes the financial system.
~Christian Crowley and Pyrs Carvolth, a16z crypto go-to-market team
Wealth Management for Everyone
Personalized wealth management services were traditionally reserved for banks' high-net-worth clients: providing tailored advice and personalized portfolios across asset classes is both expensive and operationally complex. But as more asset classes are tokenized, crypto rails enable strategies—personalized via AI recommendations and copilots—to be executed and rebalanced instantly at minimal cost.
This is more than robo-advisors; everyone gets access to active portfolio management, not just passive. In 2025, traditional finance increased portfolio allocations to crypto assets (banks now suggest a 2-5% allocation, directly or via ETPs), but this is just the beginning; in 2026, we will see platforms built for "wealth accumulation," not just "wealth preservation"—as fintechs (like Revolut and Robinhood) and centralized exchanges (like Coinbase) leverage their technology lead to capture more of this market.
Meanwhile, DeFi tools like Morpho Vaults automatically allocate assets to the lending markets with the best risk-adjusted yield—providing a core interest-bearing allocation for portfolios. Holding surplus liquid balances in stablecoins rather than fiat, and in tokenized money market funds rather than traditional ones, opens up possibilities for further yield.
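As a toy illustration of that allocation logic—invented numbers and risk scores, not how Morpho or any particular vault actually ranks markets—an allocator can score each lending market by yield net of a risk estimate and route deposits to the best score:

```python
# Toy allocator: route idle stablecoins to the lending market with the best
# risk-adjusted yield. Market names, APYs, and risk discounts are invented
# for illustration only.

markets = [
    {"name": "Market A", "apy": 0.062, "risk_discount": 0.015},  # higher yield, riskier collateral
    {"name": "Market B", "apy": 0.048, "risk_discount": 0.004},
    {"name": "Market C", "apy": 0.051, "risk_discount": 0.006},
]

def risk_adjusted(m: dict) -> float:
    # Net out an estimated risk haircut from the raw yield.
    return m["apy"] - m["risk_discount"]

best = max(markets, key=risk_adjusted)
print(f"Allocate to {best['name']} at {risk_adjusted(best):.2%} risk-adjusted APY")
```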
Finally, retail investors now have easier access to less liquid private market assets like private credit, pre-IPO companies, and private equity, as tokenization helps unlock these markets while still maintaining compliance and reporting requirements. As the various components of a balanced portfolio are tokenized (moving along the risk spectrum from bonds to equities to private and alternative assets), they can be automatically rebalanced without operations like wire transfers.
~Maggie Hsu, a16z crypto go-to-market team
On Agents & AI
From Know Your Customer to Know Your Agent
The bottleneck in the agent economy is shifting from intelligence to identity.
In financial services, "non-human identities" now outnumber human employees 96 to 1—yet these identities remain unbanked ghosts. The missing key primitive here is KYA: Know Your Agent.
Just as humans need credit scores to get loans, agents will need cryptographically signed credentials to transact—linking agents to their principals, constraints, and responsibilities. Until this exists, merchants will continue to block agents at the firewall. An industry that spent decades building KYC infrastructure now has mere months to figure out KYA.
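What such a credential could look like in miniature: the principal signs a statement that binds an agent's identity to explicit constraints (a spend limit, an expiry), and a counterparty verifies the signature before transacting. This sketch uses Ed25519 via the `cryptography` package; the field names and flow are assumptions for illustration, since real KYA standards are still being defined.

```python
# Minimal "Know Your Agent" credential sketch: a principal signs a statement that
# binds an agent identity to explicit constraints, and a merchant verifies it.
# Field names and flow are illustrative only; no existing KYA standard is assumed.

import json
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric import ed25519

principal_key = ed25519.Ed25519PrivateKey.generate()   # the human or company behind the agent

credential = {
    "agent_id": "agent:demo-buyer-01",                  # hypothetical identifier
    "principal": "acme-corp",
    "spend_limit_usdc": 500,
    "expires": "2026-12-31",
}
message = json.dumps(credential, sort_keys=True).encode()
signature = principal_key.sign(message)                 # principal vouches for the agent

# Merchant side: check the signature against the principal's known public key
# before accepting a transaction from the agent.
try:
    principal_key.public_key().verify(signature, message)
    print("credential valid:", credential["agent_id"], "limit", credential["spend_limit_usdc"])
except InvalidSignature:
    print("reject: unknown or tampered agent credential")
```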
~Sean Neville, Circle Co-Founder & USDC Architect; Catena Labs CEO
We Will Use AI for Substantive Research Tasks
As a mathematical economist, in January I found it difficult to get consumer-grade AI models to understand my workflow; yet by November, I could give models abstract instructions the same way I guide PhD students... and they sometimes return novel and correctly executed answers. Beyond my own experience, we're starting to see AI used more broadly for research—particularly in reasoning, where models are now used directly in discovery and autonomously solve Putnam problems (perhaps the world's hardest university-level math exam).
Which fields this kind of research assistance will help most, and how, remains an open question. But I expect AI research will enable and reward a new style of polymathic research: one that favors speculating on relationships between ideas and quickly drawing inferences from more speculative answers. These answers might not be accurate but still point in the right direction (at least under some topology). Ironically, this is a bit like harnessing the power of model hallucination: when models get "smart" enough, giving them abstract space to free-associate might still produce nonsense—but sometimes it opens the door to discovery, much like people are most creative when not working in a linear, explicitly stated direction.
Reasoning this way will require a new style of AI workflow—not just agent-to-agent, but more like agents wrapping agents—where layers of models help the researcher evaluate the approaches of earlier models and progressively refine them. I've been using this method to write papers, while others use it for patent searches, inventing new forms of art, or (unfortunately) discovering novel smart contract attacks.
However, wrapping ensembles of reasoning agents into research operations will require better interoperability between models, and a way to identify and appropriately compensate each model's contribution—two problems crypto can help solve.
~Scott Kominers, a16z crypto research and Harvard Business School professor
The Invisible Tax on the Open Web
The rise of AI agents is imposing an invisible tax on the open web, fundamentally upending its economic foundation. This disruption stems from a growing misalignment between the internet's context layer and its execution layer: currently, AI agents extract data from ad-supported websites (the context layer) to provide user convenience, while systematically bypassing the revenue streams (like ads and subscriptions) that fund the content.
To prevent the erosion of the open web (and preserve the diverse content that nourishes AI itself), we need to deploy technical and economic solutions at scale. This could include next-generation sponsored content, micro-attribution systems, or other novel funding models. Existing AI licensing deals have proven to be a financially unsustainable stopgap, often compensating content providers for only a fraction of the revenue they've lost as their traffic is consumed by AI.
The web needs a new techno-economic model where value flows automatically. The key shift in the coming year will be moving from static licensing to real-time, usage-based compensation. This means testing and scaling systems—likely leveraging blockchain-enabled nanopayments and sophisticated attribution standards—to automatically reward every entity that contributes information to an agent successfully completing a task.
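One way to picture such a system—with purely illustrative fees, sources, and weights, since real attribution standards and nanopayment rails remain open design questions—is a settlement step that splits an agent's task fee across every source it drew on, in proportion to an attribution score:

```python
# Illustrative usage-based compensation: split a task's fee across content sources
# in proportion to attribution weights. The fee, sources, and weights are invented.

task_fee_usd = 0.05  # what the agent's user paid for this answer

attributions = {     # how much each source contributed to the final answer
    "recipe-blog.example":  0.50,
    "nutrition-db.example": 0.30,
    "local-news.example":   0.20,
}

total = sum(attributions.values())
payouts = {source: task_fee_usd * weight / total for source, weight in attributions.items()}

for source, amount in payouts.items():
    # In practice each of these would be a nanopayment settled over blockchain rails.
    print(f"pay {amount:.4f} USD to {source}")
```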
~Liz Harkavy, a16z crypto investment team
On Privacy (& Security)
Privacy Will Be Crypto's Most Important Moat
Privacy is the critical feature needed for the world's finance to migrate on-chain. It's also a feature lacking in almost every blockchain that exists today. For most chains, privacy is little more than an afterthought.
But now, privacy alone is enough to differentiate one chain from all others. Privacy also does something even more important: it creates chain lock-in—arguably, a privacy network effect—especially in a world where competing on performance is no longer enough.
Thanks to bridging protocols, moving from one chain to another is trivial as long as everything is public. But once you make things private, that's no longer the case: bridging tokens is easy, bridging secrets is hard. When crossing the boundary between a privacy chain and a public chain—or even between two privacy chains—metadata leaks, like transaction timing and size correlations, making it easier to track someone.
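To see why bridging leaks more than the token itself, consider a simple correlation attack—toy data, deliberately exaggerated for clarity: an observer who sees withdrawals on one chain and deposits on another can link them just by matching amounts and timestamps, even when addresses differ.

```python
# Toy illustration of the metadata leak described above: link a withdrawal on
# chain A to a deposit on chain B by matching amount and timing. Data is invented.

withdrawals_chain_a = [  # (timestamp in seconds, amount)
    (1_000, 1.537), (1_020, 10.000), (1_300, 0.482),
]
deposits_chain_b = [
    (1_045, 10.000), (1_012, 1.537), (1_900, 0.482),
]

def link(withdrawals, deposits, max_delay=60, amount_tol=1e-6):
    """Return (amount, withdrawal time, deposit time) pairs that plausibly match."""
    matches = []
    for t_w, amt_w in withdrawals:
        for t_d, amt_d in deposits:
            if 0 <= t_d - t_w <= max_delay and abs(amt_d - amt_w) <= amount_tol:
                matches.append((amt_w, t_w, t_d))
    return matches

for amt, t_w, t_d in link(withdrawals_chain_a, deposits_chain_b):
    print(f"{amt} bridged at t={t_w} likely matches deposit at t={t_d}")
```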
Compared to the many undifferentiated new chains where fees might be driven to zero by competition (block space has become essentially the same everywhere), blockchains with privacy features can have stronger network effects. The reality is, if a "general purpose" chain doesn't already have a thriving ecosystem, a killer app, or an unfair distribution advantage, there's very little reason for anyone to use it or build on it—let alone be loyal to it.
When users are on public blockchains, they can easily transact with users on other chains—it doesn't matter which chain you join. However, when users are on privacy blockchains, their choice of chain matters much more, because once on a chain, they are less likely to move and risk exposure. This creates a winner-take-most dynamic. And since privacy is essential for most real-world use cases, a handful of privacy chains could capture the majority of the crypto space.
~Ali Yahya, a16z crypto General Partner
The (Near) Future of Messaging Isn't Just Post-Quantum. It's Decentralized.
As the world prepares for quantum computing, many encrypted messaging apps (Apple, Signal, WhatsApp) have led the charge, all doing excellent work. The problem is that every major messaging tool relies on our trust in private servers operated by a single organization. These servers are easy targets: governments can shut them down, backdoor them, or coerce their operators into handing over private data.
What good is post-quantum encryption if a country can shut down your server; if a company holds the keys to that server; if a company owns the server at all? Private servers demand "trust me"—no private servers means "you don't need to trust me." Communication doesn't need a single company in the middle. Messaging needs open protocols where we don't have to trust anyone.
The way we achieve this is with decentralized networks: no private servers. No single app. All open-source code. Top-tier encryption—including resistance to quantum threats. With an open network, no single person, company, non-profit, or country can take away our ability to communicate. Even if a country or company does shut down one app, 500 new versions would appear the next day. Shut down one node, and economic incentives (thanks to technologies like blockchains) would have a new node immediately take its place.
When people own their messages with their keys the same way they own their money, everything changes. Apps may come and go, but people will always control their information and identity—end users own their data even if they don't own the app.
This is bigger than post-quantum capabilities and encryption; it's about ownership and decentralization. Without both, everything we do is just building unbreakable encryption that can still be shut down.
~Shane Mac, XMTP Labs Co-Founder & CEO
"Secrets as a Service"
Behind every model, agent, and automation lies a simple essential: data. But most data pipelines today—data going into or out of models—are opaque, mutable, and unauditable. This is fine for some consumer applications, but many industries and users (like finance and healthcare) require companies to keep sensitive data confidential. It's also a huge barrier for institutions looking to tokenize real-world assets today.
So how do we enable secure, compliant, autonomous, and globally interoperable innovation while preserving privacy? There are many approaches, but I'll focus on data access control: who controls sensitive data? How does it move? Who (or what) can access it?
Without data access control, anyone hoping to keep data confidential today must use a centralized service or build a custom setup—which is not only time-consuming and expensive, but also prevents traditional financial institutions and others from fully leveraging the features and benefits of on-chain data management. As agent systems begin to autonomously browse, transact, and make decisions, users and institutions across industries will need cryptographic guarantees, not "best-effort trust."
This is why I believe we need "Secrets as a Service": programmable, native data access rules; client-side encryption; and decentralized key management that governs who can decrypt what, under what conditions, and for how long... all enforced on-chain. Combined with verifiable data systems, secrets can become part of the internet's basic public infrastructure—privacy as core infrastructure, not an application-level patch added after the fact.
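A stripped-down version of the idea—client-side encryption plus a policy check standing in for on-chain enforcement and decentralized key management; all names, roles, and conditions are hypothetical—might look like this:

```python
# Sketch of "secrets as a service": data is encrypted client-side, and plaintext is
# released only if an access policy is satisfied. Here the policy check is a plain
# Python function standing in for on-chain enforcement, and the key is held locally
# rather than by a decentralized key-management network.

from cryptography.fernet import Fernet

key = Fernet.generate_key()        # in the real design, custody of this key is decentralized
box = Fernet(key)
ciphertext = box.encrypt(b"patient record #1234: ...")  # encrypted before leaving the client

policy = {"allowed_roles": {"attending_physician"}, "expires_at_block": 2_000_000}

def request_decryption(role: str, current_block: int) -> bytes | None:
    """Release plaintext only if the caller satisfies the access policy."""
    if role in policy["allowed_roles"] and current_block <= policy["expires_at_block"]:
        return box.decrypt(ciphertext)
    return None

print(request_decryption("attending_physician", current_block=1_950_000))  # plaintext
print(request_decryption("advertising_partner", current_block=1_950_000))  # None
```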
~Adeniyi Abiodun, Mysten Labs Chief Product Officer & Co-Founder
From "Code is Law" to "Spec is Law"
Recent DeFi hacks have affected battle-tested protocols with strong teams, rigorous audits, and years of operation. These incidents highlight a disturbing reality: current standard security practices remain largely heuristic and ad-hoc.
To mature, DeFi security needs to move from vulnerability patterns to design-level properties, from "best-effort" to a "principled" approach:
On the static/pre-deployment side (testing, auditing, formal verification), this means systematically proving global invariants, rather than verifying hand-picked local invariants. AI-assisted proving tools, currently being built by several teams, can help write specifications, propose invariants, and shoulder the bulk of the manual proof engineering that has made this work prohibitively expensive in the past.
On the dynamic/post-deployment side (runtime monitoring, runtime enforcement, etc.), these invariants can be translated into real-time guardrails: a last line of defense. These guardrails would be directly encoded as runtime assertions that every transaction must satisfy.
So, instead of assuming every vulnerability is found, we would now enforce critical security properties in the code itself, automatically rolling back any transaction that violates these properties.
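As a minimal illustration of the idea—written in Python rather than an on-chain language, with a deliberately simple solvency invariant—apply each transaction to a copy of the state, check the global property, and commit only if it still holds:

```python
# Minimal sketch of runtime invariant enforcement: simulate a state transition,
# check a global invariant, and roll back if it fails. On-chain, this would be an
# assertion checked within the transaction itself; the invariant here is illustrative.

from copy import deepcopy

state = {"reserves": 1_000_000, "total_user_deposits": 1_000_000}

def invariant(s: dict) -> bool:
    # Global solvency property: reserves must always cover user deposits.
    return s["reserves"] >= s["total_user_deposits"]

def apply_tx(s: dict, tx) -> dict:
    candidate = deepcopy(s)
    tx(candidate)
    if not invariant(candidate):
        raise ValueError("invariant violated: transaction reverted")
    return candidate                      # commit only if the spec still holds

def legit_withdraw(s):                    # a user withdraws their own deposit
    s["reserves"] -= 100
    s["total_user_deposits"] -= 100

def exploit_drain(s):                     # a bug or exploit drains reserves without burning deposits
    s["reserves"] -= 500_000

state = apply_tx(state, legit_withdraw)   # commits fine
try:
    state = apply_tx(state, exploit_drain)  # violates the invariant -> rolled back
except ValueError as e:
    print(e)
```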
This isn't just theoretical. In practice, almost all exploits to date would have triggered one of these checks during execution, potentially stopping the hacks. Thus, the once-popular "code is law" ethos evolves into "spec is law": even novel attacks must satisfy the same security properties that keep the system intact, so the only attacks left would be tiny in impact or extremely difficult to execute.
~Daejun Park, a16z crypto engineering team
On Other Industries & Applications
Prediction Markets Get Bigger, Broader, Smarter
Prediction markets have gone mainstream, and in the coming year, as they intersect with crypto and AI, they will only get bigger, broader, and smarter—while also presenting new important challenges for builders to solve.
First, many more contracts will be listed. This means we'll get real-time odds not just for major elections or geopolitical events, but for a wide variety of deeply detailed outcomes and complex, cross-cutting events. As these new contracts surface more information and become part of the news ecosystem (already happening), they will raise important societal questions about how we weigh the value of this information, and how to design these markets to be more transparent, auditable, and so on—precisely what crypto enables.
To handle a much larger volume of contracts, we need new ways to agree on truth in order to settle them. Adjudication by a centralized platform (did an event actually happen? how do we confirm it?) is important, but disputed cases like the Zelenskyy suit market and the Venezuela election market show its limits. To resolve these edge cases and help prediction markets scale to more useful applications, novel forms of decentralized governance and LLM oracles can help determine the truth of disputed outcomes.
AI opens up more possibilities for oracles beyond LLMs. For example, AI agents trading on these platforms could scour the globe for signals, helping provide short-term trading advantages, thus revealing new ways of thinking about the world and predicting the future. (Projects like Prophet Arena already hint at the excitement in this space.) Beyond being complex political analysts we can query for insights, these agents, when we study their emergent strategies, might also reveal new things about the fundamental predictors of complex social events.
Will prediction markets replace polling? No; they make polling better (and polling information can feed into prediction markets). As a political scientist, I'm most excited about how prediction markets can operate synergistically with a rich and vibrant polling ecosystem—but we'll need to rely on new technologies like AI, which can improve the survey experience; and crypto, which can provide new ways to prove poll/survey respondents are not bots but humans, etc.
~Andy Hall, a16z crypto research advisor and Stanford political economy professor
The Rise of Staked Media
The cracks in the traditional media model—and its purported objectivity—have been showing for some time. The internet gave everyone a voice, and now there are more operators, practitioners, and builders speaking directly to the public. Their views reflect their stakes in the world, and counterintuitively, audiences often respect them not for being disinterested, but precisely for being interested.
The new thing here isn't the rise of social media, but the arrival of crypto tools that allow people to make publicly verifiable commitments. As AI makes generating infinite content cheap and easy—claiming anything, from any viewpoint or identity, real or fictional—relying solely on what people (or bots) say might feel insufficient. Tokenized assets, programmable staking, prediction markets, and on-chain histories provide a firmer foundation for trust: commentators can make arguments while proving they put their money where their mouth is. Podcast hosts can lock up tokens to show they won't opportunistically flip or "pump and dump." Analysts can tie predictions to publicly settling markets, creating auditable track records.
This is what I see as the early form of "staked media": a type of media that not only embraces the idea of having skin in the game, but provides proof of it. In this model, credibility comes not from feigned detachment, nor from making unsubstantiated claims; instead, it comes from having stakes you can back with transparent and verifiable commitments. Staked media won't replace other forms of media; it complements the media we already have. It offers a new signal: not just "trust me, I'm neutral," but "here's the risk I'm willing to take, and here's how you can check I'm telling the truth."
~Robert Hackett, a16z crypto editorial
Crypto Offers a New Primitive Beyond Blockchains
For years, SNARKs—a type of cryptographic proof for verifying computation without re-executing it—were largely just a blockchain technology. The overhead was simply too high: proving a computation could take 1,000,000x more work than just running it. It was worth it when amortized across thousands of verifiers, but impractical elsewhere.
This is about to change. In 2026, zkVM provers will reach roughly 10,000x overhead, with memory footprints in the hundreds of megabytes—fast enough to run on phones, cheap enough to run anywhere. Why might 10,000x be a magic number? One reason: the parallel throughput of a high-end GPU is roughly 10,000x that of a laptop CPU. By the end of 2026, a single GPU will be able to generate proofs of CPU executions in real time.
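The arithmetic behind that "magic number," using round figures consistent with the text (rough orders of magnitude, not measured benchmarks):

```python
# Back-of-the-envelope arithmetic for the "10,000x" claim above. All figures are
# round orders of magnitude taken from the surrounding text, not benchmarks.

prover_overhead = 10_000          # zkVM proving work per unit of native CPU work
gpu_vs_cpu_throughput = 10_000    # rough parallel-throughput advantage of a high-end GPU

# If the prover's work parallelizes well onto the GPU, the overhead and the
# hardware advantage roughly cancel out:
realtime_factor = gpu_vs_cpu_throughput / prover_overhead
print(f"GPU prover keeps pace with the CPU at ~{realtime_factor:.0f}x real time")

# Contrast with the older 1,000,000x overhead, where the same GPU falls far behind:
old_factor = gpu_vs_cpu_throughput / 1_000_000
print(f"At 1,000,000x overhead, the GPU proves at only {old_factor:.2%} of real time")
```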
This could unlock a vision from old research papers: verifiable cloud computing. If you're running CPU workloads in the cloud anyway—because your computation isn't heavy enough to be GPU-ized, or you lack the know-how, or for legacy reasons—you'll be able to get cryptographic proofs of correctness for a reasonable price overhead. Provers are optimized for GPUs; your code doesn't need to be.
~Justin Thaler, a16z crypto research and Georgetown University associate professor of computer science
On Building
Trading is a Waystation, Not the Destination, for Crypto Businesses
It seems like today, aside from stablecoins and some core infrastructure, every crypto company doing reasonably well is pivoting to trading or is already in it. But what does it mean for everyone if "every crypto company becomes an exchange"? So many players doing the same thing will fragment mindshare, leaving only a few big winners. This means those who pivot to trading too early miss the opportunity to build a more defensible, longer-lasting business.
While I deeply sympathize with founders trying to make their companies' financials work, chasing instant product-market fit comes at a cost. This problem is especially acute in crypto, where the unique dynamics around tokens and speculation can steer founders down the path of instant gratification in their search for product-market fit... It's arguably a marshmallow test.
There's nothing wrong with trading itself—it's an important market function—but it's not necessarily the final destination. Those who focus on the "product" part of product-market fit might end up being the bigger winners.
~Arianna Simpson, a16z crypto General Partner
Unlocking the Full Potential of Blockchains... When the Legal Architecture Finally Matches the Technical Architecture
The biggest obstacle to building blockchain networks in the US over the past decade has been legal uncertainty. Securities laws were broadly interpreted and selectively enforced, forcing founders into a regulatory framework built for companies, not networks. For years, mitigating legal risk replaced product strategy; engineers gave way to lawyers.
This dynamic led to many strange contortions: founders were told to avoid transparency. Token distributions were driven by legal constraints rather than by design. Governance became theater. Organizational structures were optimized for legal cover. Tokens were designed to avoid accruing economic value—and thus lacked a business model. Worse, crypto projects that gamed the rules often outperformed those building in good faith.
But crypto market structure legislation—which Congress is closer than ever to passing—has the potential to undo all these contortions next year. If passed, it would incentivize transparency, create clear standards, and replace "regulation by enforcement roulette" with clearer, structured paths for funding, token issuance, and decentralization. Following the GENIUS Act, the explosion in stablecoins has already happened; crypto market structure legislation would be an even more significant shift—this time for networks.
In other words, such regulation would enable blockchain networks to operate as networks—open, autonomous, composable, credibly neutral, and decentralized.