Breaking: CFTC Chair Reiterates "ETH Is Commodity" Stance Amid SEC Lawsuits

Coingape · Published 2023-06-07 · Last updated 2023-06-07


US Commodity Futures Trading Commission (CFTC) Chairman Rostin Behnam on Tuesday reiterated his stance that Ethereum is a commodity, not a security. This comes as the U.S. Securities and Exchange Commission (SEC) filed back-to-back complaints against two of the world's top crypto exchanges, Binance and Coinbase, over alleged violations of financial regulations. The SEC has been urging crypto businesses operating in the United States to come forward and register their digital-asset trading operations with it. However, there is still no clarity as to which laws crypto assets fall under.


Gary Gensler, the SEC Chair, recently stirred controversy in the crypto market by saying that everything other than Bitcoin is a security. He maintains that Bitcoin is not a security but a commodity under the CFTC's jurisdiction. CFTC Chair Behnam, on the contrary, has consistently said that Ethereum qualifies as a commodity. This uncertainty remains unresolved, although efforts are underway with a draft bill to categorize which assets are securities and which are commodities.
“Ethereum Is A Commodity”

Speaking at a US House Committee hearing on "The Future of Digital Assets: Providing Clarity for Digital Asset Spot Markets," Behnam reiterated his stance that Ethereum is a commodity. The question arose when Congressman Austin Scott argued that crypto assets were clearly not securities and should be regulated by the CFTC rather than the SEC. The CFTC Chair replied,
“I’ve argued in the past that Ether is a commodity.”





