API Gateway
The EdgeMob API Gateway is the critical bridge between applications and the distributed network of mobile nodes. It allows developers to expose AI models as callable endpoints, making it possible to integrate mobile-native compute into web apps, dApps, and enterprise systems. The gateway is designed to evolve through three key stages, balancing simplicity, reliability, and decentralization.
1. Local Mode
Provides a localhost endpoint for developers running models directly on their own device.
Ideal for prototyping, debugging, and testing AI models without requiring internet connectivity.
Ensures developers can validate inference outputs quickly before scaling; a minimal request sketch is shown below.
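As a minimal sketch, a Local Mode call from TypeScript could look like the following. The localhost URL, route, and payload shape are illustrative assumptions, not the documented EdgeMob API.

```typescript
// Minimal sketch of a Local Mode inference call.
// The localhost URL, route, and payload shape are assumptions for
// illustration, not the documented EdgeMob API.
type InferenceRequest = { model: string; input: string };
type InferenceResponse = { output: string; latencyMs: number };

async function runLocalInference(req: InferenceRequest): Promise<InferenceResponse> {
  // Hypothetical endpoint exposed by the on-device runtime.
  const res = await fetch("http://localhost:8080/v1/infer", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(req),
  });
  if (!res.ok) {
    throw new Error(`Local inference failed with status ${res.status}`);
  }
  return (await res.json()) as InferenceResponse;
}

// Prototype usage: validate outputs locally before scaling to the network.
runLocalInference({ model: "example-model", input: "Hello from EdgeMob" })
  .then((r) => console.log(r.output))
  .catch((err) => console.error(err));
```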
2. Centralized Gateway (Phase 1)
A managed EdgeMob routing service connects applications to active devices.
Ensures stability and reliability for early deployments while the network matures.
Provides load balancing, logging, and usage metrics for developers.
Suitable for small- to medium-scale projects that require consistent availability; an example gateway request is sketched below.
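The sketch below shows the same kind of request pointed at a managed gateway instead of localhost. The gateway URL, authentication header, and response fields are assumptions for illustration only.

```typescript
// Sketch of an inference request routed through a managed gateway.
// The base URL, auth header, and response fields are hypothetical.
const GATEWAY_URL = "https://gateway.edgemob.example/v1/infer"; // placeholder URL
const API_KEY = process.env.EDGEMOB_API_KEY ?? "";

async function runGatewayInference(model: string, input: string): Promise<unknown> {
  const res = await fetch(GATEWAY_URL, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${API_KEY}`, // assumed auth scheme
    },
    body: JSON.stringify({ model, input }),
  });
  if (!res.ok) {
    throw new Error(`Gateway request failed with status ${res.status}`);
  }
  // The gateway handles load balancing across devices; the caller sees
  // a single stable endpoint plus whatever usage metadata it returns.
  return res.json();
}
```

Because only the base URL and credentials change in this sketch, code validated in Local Mode can move to the centralized gateway with minimal changes.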
3. Decentralized Gateway (Phase 2+)
Evolves into a fully decentralized routing layer.
Uses WebSockets for real-time communication and Solana smart contracts for coordination, payments, and staking.
Applications send requests, which are routed to available nodes based on capacity, reputation, and staking commitments (one possible scoring heuristic is sketched below).
Removes single points of failure and ensures transparent, censorship-resistant access.
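One way such a router could weigh capacity, reputation, and staking commitments is sketched below. The field names, weights, and WebSocket endpoint are illustrative assumptions; the actual coordination, payment, and staking logic would live in Solana smart contracts as described above.

```typescript
// Sketch of how a decentralized router might rank candidate nodes.
// Field names, weights, and the WebSocket endpoint are illustrative
// assumptions; the real coordination logic would live on-chain.
interface NodeInfo {
  id: string;
  freeCapacity: number; // share of compute currently idle, 0..1
  reputation: number;   // rolling score from past jobs, 0..1
  stakedTokens: number; // staking commitment backing reliability
}

function scoreNode(n: NodeInfo): number {
  // Weighted score with diminishing returns on stake size.
  const stakeSignal = Math.tanh(Math.log1p(n.stakedTokens));
  return 0.5 * n.freeCapacity + 0.3 * n.reputation + 0.2 * stakeSignal;
}

function selectNode(candidates: NodeInfo[]): NodeInfo | undefined {
  return [...candidates].sort((a, b) => scoreNode(b) - scoreNode(a))[0];
}

// Dispatch the job to the chosen node over a WebSocket (endpoint shape assumed).
function dispatch(node: NodeInfo, payload: object): void {
  const ws = new WebSocket(`wss://nodes.edgemob.example/${node.id}`); // placeholder
  ws.onopen = () => ws.send(JSON.stringify(payload));
  ws.onmessage = (event) => console.log("result:", event.data);
}
```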
Features Across All Modes
Web2 & Web3 Compatibility: REST, GraphQL, and RPC endpoints supported.
Monitoring & Metrics: Developers can track latency, throughput, and node reliability; a simple client-side measurement sketch follows below.
Security: Encryption secures communication between applications and nodes, while staking-backed incentives help ensure reliable execution.
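As a hedged illustration of client-side monitoring, the wrapper below records latency and success for any gateway call. The in-memory sample array is a stand-in, not an EdgeMob API; a real integration would forward these samples to whatever metrics endpoint the gateway exposes.

```typescript
// Client-side latency and reliability tracking around any gateway call.
// The in-memory sample array is a stand-in for a real metrics sink.
type Sample = { endpoint: string; latencyMs: number; ok: boolean };
const samples: Sample[] = [];

async function timed<T>(endpoint: string, call: () => Promise<T>): Promise<T> {
  const start = performance.now();
  try {
    const result = await call();
    samples.push({ endpoint, latencyMs: performance.now() - start, ok: true });
    return result;
  } catch (err) {
    samples.push({ endpoint, latencyMs: performance.now() - start, ok: false });
    throw err;
  }
}
```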
By providing a clear evolution path—local, centralized, then decentralized—the EdgeMob API Gateway lowers the barrier for developers to adopt mobile-native compute while paving the way for a fully decentralized, scalable AI infrastructure.