
Developer App & Workflow

The EdgeMob Developer App is the entry point for builders who want to run and expose AI models directly on mobile devices. It is designed to abstract away the complexity of hardware management, model integration, and networking, so developers can focus on building intelligent applications.

Key Functions

  1. Model Loading: Developers can load both open-source and custom models into the EdgeMob app. The app provides an intuitive interface to select models from a registry, sideload them, or import them from local storage. Once loaded, the orchestrator manages model execution efficiently within the device’s compute limits.
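As a sketch, a model-load request could be described by a small manifest like the one below. The field names (`model_id`, `source`, `quantization`, `max_memory_mb`) are illustrative assumptions, not a documented EdgeMob schema:

```json
{
  "model_id": "tiny-llm-1b",
  "source": "registry",
  "quantization": "int4",
  "max_memory_mb": 2048
}
```

A manifest of this shape would let the orchestrator check the declared memory budget against the device's compute limits before loading.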

  2. Local Inference: For testing and prototyping, models can be run directly on the device, with outputs exposed over a localhost API. This enables developers to quickly validate functionality without internet connectivity or reliance on cloud services.
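A minimal sketch of calling such a localhost endpoint from Python. The port, path (`/v1/infer`), and JSON payload shape are assumptions for illustration, not documented EdgeMob values:

```python
import json
import urllib.request

# Hypothetical on-device endpoint; the real port and path may differ.
LOCAL_ENDPOINT = "http://127.0.0.1:8080/v1/infer"

def build_request(model_id: str, prompt: str) -> urllib.request.Request:
    """Build a JSON POST request for a local inference call."""
    body = json.dumps({"model": model_id, "prompt": prompt}).encode("utf-8")
    return urllib.request.Request(
        LOCAL_ENDPOINT,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def run_local_inference(model_id: str, prompt: str) -> dict:
    """Send the request to the on-device model and parse the JSON reply."""
    with urllib.request.urlopen(build_request(model_id, prompt)) as resp:
        return json.loads(resp.read())
```

Because the endpoint is local, this loop works fully offline, which is what makes rapid prototyping possible before any gateway is involved.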

  3. API Exposure: Developers can expose their models as APIs via the EdgeMob Gateway. This makes them callable by web apps, dApps, or other services. In Phase 1, this routing is managed through a centralized gateway for stability. In later phases, the gateway becomes decentralized, connecting clients to active devices via WebSockets and Solana contracts.
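From a client's point of view, Phase 1 exposure mostly means swapping the localhost base URL for the gateway's and adding authentication. A small sketch; the gateway hostname, path layout, and API-key header name are all hypothetical placeholders:

```python
# Hypothetical gateway base URL; EdgeMob's real routes are not specified here.
GATEWAY_BASE = "https://gateway.edgemob.example/v1"

def gateway_url(model_id: str) -> str:
    """Public endpoint for a model exposed via the gateway (illustrative path)."""
    return f"{GATEWAY_BASE}/models/{model_id}/infer"

def auth_headers(api_key: str) -> dict:
    """Request headers; the key header name is an illustrative assumption."""
    return {"Content-Type": "application/json", "X-EdgeMob-Key": api_key}
```

Keeping the request shape identical between the local and gateway endpoints means a prototype validated on-device can be pointed at the gateway without code changes beyond the URL and credentials.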

  4. Developer Tooling: The app integrates with EdgeMob SDKs and CLI tools, allowing developers to configure endpoints, monitor usage, and integrate results directly into applications. Support for both REST and Web3-compatible RPC ensures broad accessibility.

Workflow Example

  1. Load a small LLM into the EdgeMob app.

  2. Run test inference locally using the localhost endpoint.

  3. Expose the model via the EdgeMob Gateway for external applications.

  4. Track performance and usage through developer dashboards and SDKs.

This workflow ensures developers can start small, iterate quickly, and scale their solutions as EdgeMob’s decentralized gateway and distributed compute capabilities mature.
