Ecosystem Components
The EdgeMob ecosystem is built from modular components that work together to transform smartphones into a distributed AI compute fabric. Each layer is designed to balance performance, scalability, and decentralization, while keeping the developer experience simple and accessible.
1. Mobile Nodes
Smartphones running the EdgeMob app become active compute nodes.
Capable of hosting local inference, exposing APIs, and contributing to distributed LLM execution.
Contribute CPU, RAM, and NPU capacity (where available) in exchange for EGMO token incentives.
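As an illustration, a node joining the network might advertise its resources in a capability report like the sketch below. The field names (`cpu_cores`, `ram_mb`, `has_npu`) and the `tier` label are hypothetical, not the actual EdgeMob wire format:

```python
from dataclasses import dataclass, asdict

# Hypothetical capability report a mobile node could advertise when it
# registers as a compute node. Field names are illustrative only.
@dataclass
class NodeCapabilities:
    node_id: str
    cpu_cores: int
    ram_mb: int
    has_npu: bool

def registration_payload(caps: NodeCapabilities) -> dict:
    """Serialize the node's advertised resources for registration."""
    payload = asdict(caps)
    # Nodes without an NPU can still serve CPU-only inference.
    payload["tier"] = "npu" if caps.has_npu else "cpu"
    return payload

caps = NodeCapabilities(node_id="node-1", cpu_cores=8, ram_mb=6144, has_npu=True)
payload = registration_payload(caps)
```

Devices with an NPU would be routed the heavier inference workloads; CPU-only devices still participate in the lighter tier.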
2. Orchestrator
A lightweight runtime inside the app, implemented in Rust and C++.
Manages model execution, memory allocation, and workload scheduling.
Handles communication with other nodes and ensures inference results are returned reliably and efficiently.
3. API Gateway
Acts as the bridge between applications and EdgeMob nodes.
Operates in three modes:
Local: localhost endpoint for on-device testing.
Centralized: Phase 1 gateway for routing requests reliably.
Decentralized: Phase 2+ routing layer using WebSockets and Solana smart contracts.
Enables seamless scaling from prototype to production workloads.
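The three modes above could be surfaced to client code as a single endpoint resolver, so an application switches from local testing to the decentralized mesh by changing one setting. The URLs below are placeholders, not real EdgeMob endpoints:

```python
def resolve_endpoint(mode: str) -> str:
    """Map a gateway mode to its base endpoint (placeholder URLs)."""
    endpoints = {
        "local": "http://localhost:8080",             # on-device testing
        "centralized": "https://gateway.example/v1",  # Phase 1 routing
        "decentralized": "wss://mesh.example/route",  # Phase 2+ WebSocket mesh
    }
    try:
        return endpoints[mode]
    except KeyError:
        raise ValueError(f"unknown gateway mode: {mode}")
```

A prototype built against the `local` endpoint keeps the same request shape when it later targets the centralized or decentralized gateway.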
4. Developer Tooling
SDKs, CLI, and libraries for Web2 and Web3 developers.
Model registry to publish, share, and access open-source or custom models.
Monitoring and logging tools to track performance, usage, and cost efficiency.
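To make the model-registry idea concrete, here is a minimal in-memory sketch of publishing and looking up models. The class and method names are hypothetical; the real registry API is not specified in this document:

```python
class ModelRegistry:
    """Toy in-memory model registry: publish a model under a name,
    then look it up. Illustrative only, not the EdgeMob SDK."""

    def __init__(self):
        self._models: dict[str, dict] = {}

    def publish(self, name: str, source: str, license: str = "open") -> None:
        """Register a model so other developers can discover it."""
        self._models[name] = {"source": source, "license": license}

    def get(self, name: str) -> dict:
        """Fetch a published model's metadata by name."""
        return self._models[name]

registry = ModelRegistry()
registry.publish("llama-3-8b", source="https://example.com/llama")
```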
5. EGMO Token Layer
Powers incentives, utility, and governance across the network.
Node operators earn EGMO for contributing compute cycles.
Developers and dApps spend EGMO to deploy models, run inference, and access distributed compute.
Staking mechanisms ensure reliable participation and long-term sustainability.
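The earn/spend loop can be sketched as a simple ledger: operators are credited per unit of compute served, and developers are debited per inference request. The reward and cost rates below are made-up numbers for illustration, not EdgeMob tokenomics:

```python
class TokenLedger:
    """Toy EGMO accounting. Rates are illustrative, not real tokenomics."""

    REWARD_PER_UNIT = 2   # EGMO credited per compute unit served
    COST_PER_REQUEST = 5  # EGMO debited per inference request

    def __init__(self):
        self.balances: dict[str, int] = {}

    def earn(self, account: str, compute_units: int) -> None:
        """Credit a node operator for compute cycles contributed."""
        self.balances[account] = (
            self.balances.get(account, 0) + compute_units * self.REWARD_PER_UNIT
        )

    def spend(self, account: str, requests: int) -> None:
        """Debit a developer for inference requests; reject overdrafts."""
        cost = requests * self.COST_PER_REQUEST
        if self.balances.get(account, 0) < cost:
            raise ValueError("insufficient EGMO balance")
        self.balances[account] -= cost

ledger = TokenLedger()
ledger.earn("operator-1", 10)   # 10 compute units served
```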
6. Solana Integration
Provides the backbone for coordination, payments, and governance.
Smart contracts manage request routing, settlement, and staking.
Wallet-based access enables Web3-native features such as token-gated APIs.
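The settlement flow a smart contract might enforce can be sketched as an escrow state machine: a requester locks EGMO up front, and the funds are released to the serving node only when the result is verified, otherwise refunded. This is a pure-Python stand-in for the idea, not Solana program code:

```python
class Escrow:
    """Sketch of on-chain settlement: lock funds, then pay or refund.
    Illustrative state machine, not an actual Solana program."""

    def __init__(self, amount: int):
        self.amount = amount
        self.state = "locked"

    def settle(self, result_verified: bool) -> str:
        """Release funds to the node on a verified result, else refund."""
        if self.state != "locked":
            raise RuntimeError("escrow already settled")
        self.state = "paid" if result_verified else "refunded"
        return self.state
```

Making settlement conditional on verification is what lets untrusted nodes serve requests: payment and correctness are coupled by the contract rather than by trust.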
7. Model Support & Extensions
Supports open-source models (LLaMA, Mistral, Qwen, etc.) as well as custom developer models.
Extensible via Model Context Protocol (MCP) servers and plugin-like integrations.
Future roadmap includes background compute, fine-tuning, and distributed training on mobile clusters.
Together, these components form the EdgeMob ecosystem: a mobile-native AI compute network that is decentralized, developer-friendly, and designed to unlock the latent power of billions of devices worldwide.