Ocean Network Launches Beta for Affordable P2P GPU Orchestration

TheNewsCrypto — Published 2026-03-17, updated 2026-03-17

Summary

Ocean Network has launched the Beta of its decentralized peer-to-peer (P2P) GPU orchestration layer, creating a liquid on-demand compute market without centralized control. It enables data scientists and developers to run code directly, bypassing traditional cloud bottlenecks. The platform sources high-performance GPUs, including NVIDIA H200s and A100s, from partners like Aethir to ensure reliability. Its Orchestrator integrates with popular IDEs like VS Code, allowing custom hardware selection, one-click job deployment, and real-time monitoring. Ocean uses a Pay-Per-Use Escrow on Base L2, charging only for actual resource consumption, not idle time. It also employs Compute-to-Data (C2D) for secure processing of sensitive data without moving it. The network plans to soon allow node operators to monetize idle GPUs, expanding into a full P2P ecosystem.

Ocean Network today announced the official Beta launch of its decentralized peer-to-peer (P2P) compute orchestration layer. This marks a shift from fragmented hardware to a highly liquid market where compute is available on-demand, without the overhead of centralized gatekeepers. Powered by this architecture, Ocean Network allows modern data scientists and developers to bypass traditional cloud bottlenecks and move directly from code to execution.

Solving the “coordination problem” of decentralized compute

While the demand for high-performance GPUs has reached a fever pitch, decentralized compute has historically struggled with a usability gap. Most developers do not want to manage remote nodes, configure complex SSH keys, or gamble on unreliable uptime; they want to run code.

Ocean Network bridges this gap by focusing on the Orchestration Layer. To ensure top-tier reliability and performance from day one of Beta, Ocean Network is renting high-performance GPUs from Aethir under the partnership the two companies entered in 2025. This gives users immediate access to a massive fleet of industry-leading hardware, ranging from powerhouse NVIDIA H200s, H100s, and A100s to highly accessible 1060s and more.

“We aren’t just giving data scientists and developers access to GPUs; we are giving them an orchestration layer that makes decentralized compute feel like local execution,” says the Ocean Network team. “This is the transition from manual infrastructure management to pure automatiON.”

Moving forward, Ocean Network will start aggregating global, idle GPUs into a unified P2P network, allowing anyone to set up an Ocean Node and monetize their high-performing underutilized compute resources.

Central to the Beta launch is the Ocean Orchestrator (formerly the Ocean VS Code Extension). Recognizing that the modern user’s workflow lives within their editor, the Orchestrator integrates natively with VS Code, Cursor, Windsurf, and Antigravity.

Unlike traditional cloud monopolies that force developers into expensive, rigid hardware tiers, Ocean Network offers total flexibility in resource allocation with no preset bundles. The UX is designed for granular control and speed:

  1. Custom Selection: Filter and select specific hardware models (e.g., NVIDIA H200, A100, Tesla T4) and set the exact minimum requirements for CPU and RAM;
  2. One-Click Submission: Deploy containerized jobs (Python or JavaScript) with a single click once the precise environment is mapped;
  3. Real-Time Retrieval: Monitor the job live and automatically pull results back to the user’s local environment.
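The three steps above can be sketched in code. This is a minimal, purely illustrative Python model of the flow described; the names (`JobSpec`, `Orchestrator`, the node tuples) are hypothetical and not Ocean Network's actual API.

```python
from dataclasses import dataclass, field

@dataclass
class JobSpec:
    gpu_model: str      # e.g. "NVIDIA H200" (hypothetical identifier)
    min_cpu_cores: int  # exact minimum CPU requirement
    min_ram_gb: int     # exact minimum RAM requirement
    image: str          # containerized Python or JavaScript job

@dataclass
class Orchestrator:
    # Each node advertised as (gpu_model, cpu_cores, ram_gb)
    nodes: list = field(default_factory=list)

    def select_node(self, spec):
        # 1. Custom Selection: filter nodes by exact hardware requirements
        for gpu, cpu, ram in self.nodes:
            if gpu == spec.gpu_model and cpu >= spec.min_cpu_cores and ram >= spec.min_ram_gb:
                return (gpu, cpu, ram)
        return None

    def submit(self, spec):
        # 2. One-Click Submission: deploy the container to a matching node
        node = self.select_node(spec)
        if node is None:
            raise RuntimeError("no node satisfies the spec")
        return {"node": node, "status": "running"}

    def fetch_results(self, job):
        # 3. Real-Time Retrieval: monitor the job and pull results locally
        job["status"] = "done"
        return {"output": "results pulled to local environment"}
```

In this sketch, a user would construct a `JobSpec`, call `submit`, and poll `fetch_results`; the real Orchestrator surfaces the same steps through the IDE extension rather than a Python client.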

Pure AutomatiON: The Pay-Per-Use economics

Ocean Network challenges the “Reserved Instance” models of AWS and GCP. In traditional cloud environments, users pay for the time a machine is “ON,” regardless of whether it is actively computing or sitting idle.

Ocean Network introduces a Pay-Per-Use Escrow Mechanism deployed on Base (Ethereum L2) for low-fee, high-speed settlements. Funds are held in escrow and only released once the node successfully completes the job and returns the output. Users are charged strictly for the resources consumed by the specific job (time, hardware, and environment), effectively eliminating the cost of idle compute. All access and rewards are secured via wallet-based identity provided by Alchemy.
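The escrow logic described above can be modeled in a few lines. This is an illustrative sketch, not Ocean's on-chain contract: funds are locked up front, only the cost of resources the job actually consumed is released to the node, and the remainder (or, on failure, the full deposit) is refunded to the payer.

```python
class PayPerUseEscrow:
    """Toy model of a pay-per-use escrow settlement."""

    def __init__(self):
        self.locked = {}    # job_id -> (deposit, rate_per_second)
        self.balances = {}  # address -> released funds

    def lock(self, job_id, deposit, rate_per_second):
        # Funds are held in escrow when the job is submitted
        self.locked[job_id] = (deposit, rate_per_second)

    def settle(self, job_id, node_addr, payer_addr, seconds_used, output_ok):
        deposit, rate = self.locked.pop(job_id)
        if not output_ok:
            # Node failed to return output: full refund, nothing released
            self.balances[payer_addr] = self.balances.get(payer_addr, 0) + deposit
            return 0
        # Charge strictly for consumed time; never more than the deposit
        cost = min(deposit, seconds_used * rate)
        self.balances[node_addr] = self.balances.get(node_addr, 0) + cost
        self.balances[payer_addr] = self.balances.get(payer_addr, 0) + (deposit - cost)
        return cost
```

The key property, mirrored from the press release, is that idle time costs nothing: settlement is a function of consumed seconds, not of how long the machine was reserved.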

Security through Compute-to-Data (C2D)

For Web2 data scientists and AI agent aficionados handling sensitive data, Ocean utilizes Compute-to-Data (C2D). This architecture runs algorithms in isolated containers where the data resides. The raw data never leaves its perimeter; only the secure compute outputs are returned to the user.
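The C2D pattern can be sketched as follows: the algorithm travels to an isolated environment colocated with the data, and only the derived output crosses the trust boundary. The function and variable names here are hypothetical, for illustration only.

```python
def run_in_enclave(dataset, algorithm):
    """Simulates an isolated container running beside the data."""
    output = algorithm(dataset)
    # Only the derived result leaves; the raw dataset object never does
    assert output is not dataset
    return output

# Sensitive values stay inside their perimeter; only the mean is returned
sensitive_rows = [41.2, 39.8, 40.5, 42.1]
mean = run_in_enclave(sensitive_rows, lambda d: sum(d) / len(d))
```

In a real C2D deployment the isolation is enforced by the container runtime and network policy rather than by a Python function, but the data-flow direction is the same: code moves to data, results move to the user.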

Building the future of liquid compute

The Beta launch invites Web2 Data Scientists, Data Analysts, and Web3 Builders to experience a world where compute is a utility, not a bottleneck. While the initial Beta focuses on the demand side, empowering users to run jobs, the network will soon expand to allow node runners to monetize their idle high-power GPU and CPU capacity by joining the worker layer.

About Ocean Network

Ocean Network is a decentralized, peer-to-peer (P2P) compute network for pay-per-use compute jobs that turns idle or underutilized GPUs into usable distributed compute resources. It lets users choose a preferred Ocean Node with the resources they need, submit a containerized job, and get results back without managing servers or infrastructure.

CONTACT:

  • Name: Andreea Neagu
  • Job title: Marketing lead
  • Company: Ocean Network
  • Website: https://www.oncompute.ai/
  • Country: Singapore
  • Email: [email protected]

Disclaimer: TheNewsCrypto does not endorse any content on this page. The content depicted in this Press Release does not represent any investment advice. TheNewsCrypto recommends our readers to make decisions based on their own research. TheNewsCrypto is not accountable for any damage or loss related to content, products, or services stated in this Press Release.


Related Questions

Q: What is the primary innovation that Ocean Network's Beta launch introduces to decentralized compute?

A: Ocean Network's Beta launch introduces a decentralized peer-to-peer (P2P) compute orchestration layer, which solves the usability gap in decentralized computing by providing an interface that makes it feel like local execution, bypassing the need to manage remote nodes or complex configurations.

Q: How does Ocean Network ensure reliability and performance for its users during the Beta phase?

A: To ensure top-tier reliability and performance from day one of Beta, Ocean Network is renting high-performance GPUs from Aethir, providing users immediate access to a massive fleet of industry-leading hardware, including NVIDIA H200s, H100s, A100s, and more.

Q: What is the Ocean Orchestrator and which development environments does it integrate with?

A: The Ocean Orchestrator is a central tool for the Beta launch (formerly the Ocean VS Code Extension) that integrates natively with popular IDEs like VS Code, Cursor, Windsurf, and Antigravity, allowing users to deploy and manage compute jobs directly from their editors.

Q: How does Ocean Network's Pay-Per-Use Escrow Mechanism work and what problem does it solve?

A: Ocean Network's Pay-Per-Use Escrow Mechanism, deployed on Base (Ethereum L2), holds funds in escrow and only releases payment once a node successfully completes a job and returns the output. This eliminates the cost of idle compute, challenging the traditional cloud model where users pay for reserved instance time regardless of usage.

Q: What security measure does Ocean Network use for users handling sensitive data?

A: For users handling sensitive data, Ocean Network utilizes Compute-to-Data (C2D) architecture. This runs algorithms in isolated containers where the data resides, ensuring the raw data never leaves its secure perimeter; only the computed outputs are returned to the user.
