Bitfinex Announces Digital Asset Custody Partnership with Komainu

币界网 · Published 2024-08-15 · Last updated 2024-08-15

币界网 reports:

Bitfinex has announced plans to integrate Komainu Connect, a regulated digital asset custodian. The partnership aims to let institutions and investors trade on Bitfinex's deep-liquidity platform while keeping their assets in a regulated custody environment.

Powered by Ledger Tradelink

According to Bitfinex, the partnership is enabled by Ledger Tradelink, Ledger Enterprise's off-exchange trading and settlement technology. This technology will allow Bitfinex's institutional clients to trade without frequently moving assets on-chain.

By integrating with Komainu Connect, Bitfinex promises users access to liquid cryptocurrency markets while their assets remain stored with Komainu, the regulated custodian. According to the company, this setup minimizes the need for frequent on-chain transfers, which are often cumbersome and costly.

The arrangement is supported by Ledger Tradelink, a technology that facilitates off-exchange trading and settlement. Traders who hold assets on Komainu Connect and have a verified Bitfinex account can allocate a portion of their custodied assets for trading.

Once a lock is initiated at Komainu, the system notifies Bitfinex through an API connection, and the trader receives a balance equivalent to the locked amount for seamless trading on Bitfinex.
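The lock-and-credit flow described above can be sketched in a few lines of Python. This is a minimal illustration only: the class names, method signatures, and notification path are assumptions for the sketch, since neither Komainu nor Bitfinex publishes this exact API.

```python
# Hypothetical sketch of the custody lock-and-credit flow.
# All names and payload fields here are illustrative assumptions,
# not the actual Komainu Connect or Bitfinex API.

from dataclasses import dataclass


@dataclass
class CustodyLock:
    """An allocation of custodied assets locked for trading."""
    asset: str
    amount: float
    locked: bool = True


class ExchangeStub:
    """Stands in for Bitfinex: mirrors locked assets as a trading balance."""

    def __init__(self):
        self.balances: dict[tuple[str, str], float] = {}

    def credit_trading_balance(self, account_id: str, asset: str, amount: float):
        key = (account_id, asset)
        self.balances[key] = self.balances.get(key, 0.0) + amount


class CustodianStub:
    """Stands in for Komainu: locks assets, then notifies the exchange."""

    def __init__(self, exchange: ExchangeStub):
        self.exchange = exchange

    def lock_assets(self, account_id: str, asset: str, amount: float) -> CustodyLock:
        lock = CustodyLock(asset=asset, amount=amount)
        # In the described flow, this notification travels over an API
        # connection; the exchange then credits an equivalent balance.
        self.exchange.credit_trading_balance(account_id, asset, amount)
        return lock


exchange = ExchangeStub()
custodian = CustodianStub(exchange)
lock = custodian.lock_assets("trader-1", "BTC", 2.5)
print(exchange.balances[("trader-1", "BTC")])  # 2.5
```

The key property the sketch captures is that the assets never leave custody: only a mirrored balance appears on the exchange side, which is what removes the need for on-chain transfers.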

The integration is reportedly part of Bitfinex's broader strategy of offering off-exchange custody solutions, building on collaborations with other custody partners focused on institutional clients.

Other Bitfinex Partnerships

Earlier this year, Bitfinex partnered with Synonym to launch a feature enhancing Lightning Network trading. The collaboration aims to remove the challenges associated with establishing Lightning deposit, withdrawal, and payment channels.

In addition, Bitfinex Securities raised $5.2 million worth of USDT last year through its tokenized bond ALT2612. The issuance was facilitated by ALTERNATIVE, a Luxembourg securitization fund under Mikro Kapital.

