Technology Stack
The EdgeMob technology stack is purpose-built to combine mobile-native flexibility with the performance of low-level systems programming. It brings together proven open-source frameworks and optimized runtimes to deliver efficient AI compute directly on smartphones.
Core Layers
React Native (RN)
Provides the mobile application shell for both Android and iOS.
Gives developers a single codebase for cross-platform UI and interaction logic.
Integrates seamlessly with native modules for performance-critical tasks.
Rust Runtime
The backbone of EdgeMob’s orchestrator.
Offers memory safety, performance, and concurrency without the overhead of garbage collection.
Manages scheduling, memory allocation, and communication between model execution layers.
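The orchestration pattern described above can be sketched with Rust's standard threading and channel primitives. This is a minimal illustration of channel-based task dispatch, not EdgeMob's actual scheduler; the `InferenceTask` type and `run_tasks` function are hypothetical names.

```rust
use std::sync::mpsc;
use std::thread;

// Hypothetical task type: which model to run and the prompt to feed it.
pub struct InferenceTask {
    pub model: String,
    pub prompt: String,
}

// Dispatch tasks to a worker thread over a channel and collect results,
// standing in for the real hand-off to a model execution layer.
pub fn run_tasks(tasks: Vec<InferenceTask>) -> Vec<String> {
    let (tx, rx) = mpsc::channel::<InferenceTask>();
    let worker = thread::spawn(move || {
        rx.into_iter()
            .map(|t| format!("{} -> {}", t.model, t.prompt))
            .collect::<Vec<_>>()
    });
    for task in tasks {
        tx.send(task).unwrap();
    }
    drop(tx); // closing the channel ends the worker's receive loop
    worker.join().unwrap()
}

fn main() {
    let out = run_tasks(vec![InferenceTask {
        model: "llama-7b".into(),
        prompt: "hello".into(),
    }]);
    println!("{:?}", out);
}
```

Ownership moves into the worker thread and the channel closes deterministically, which is the kind of memory- and concurrency-safety the bullet points refer to, with no garbage collector involved.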
C++ Modules
Used for high-performance compute tasks where low-level optimization is critical.
Provides bindings to hardware acceleration libraries and supports optimized math operations.
Ensures interoperability with established AI libraries.
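Interoperability between the Rust runtime and C++ modules typically runs through the C ABI. Below is a sketch of the Rust side of such a boundary; the symbol name `edgemob_dot` is illustrative, not EdgeMob's real API, and here the function is called from Rust itself for demonstration rather than from C++.

```rust
// A Rust function exported with the C ABI, in the form a C++ compute
// module would call it (declared there as:
//   extern "C" float edgemob_dot(const float*, const float*, size_t);)
#[no_mangle]
pub extern "C" fn edgemob_dot(a: *const f32, b: *const f32, len: usize) -> f32 {
    // SAFETY: the caller must pass valid pointers to `len` floats each.
    let (a, b) = unsafe {
        (std::slice::from_raw_parts(a, len),
         std::slice::from_raw_parts(b, len))
    };
    a.iter().zip(b).map(|(x, y)| x * y).sum()
}

fn main() {
    let a = [1.0f32, 2.0, 3.0];
    let b = [4.0f32, 5.0, 6.0];
    // Exercising the C-ABI function directly from Rust.
    let d = edgemob_dot(a.as_ptr(), b.as_ptr(), a.len());
    println!("{d}"); // 1*4 + 2*5 + 3*6 = 32
}
```

`#[no_mangle]` plus `extern "C"` keeps the symbol name and calling convention stable, so hardware-acceleration libraries and established C/C++ AI code can link against it directly.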
Candle Framework
A Rust-based deep learning framework integrated into EdgeMob for running AI models efficiently on mobile.
Supports safetensors and quantized model execution for reduced memory footprint.
Optimized for inference in resource-constrained environments.
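The memory savings from quantized execution come from storing weights in narrow integer types plus a scale factor. The sketch below shows the idea with simple symmetric 8-bit quantization; it is a self-contained illustration of the technique, not Candle's actual kernels or API.

```rust
// Symmetric 8-bit quantization: store i8 values plus one f32 scale,
// cutting weight memory to roughly a quarter of f32.
pub fn quantize_q8(weights: &[f32]) -> (Vec<i8>, f32) {
    let max = weights.iter().fold(0.0f32, |m, w| m.max(w.abs()));
    let scale = if max == 0.0 { 1.0 } else { max / 127.0 };
    let q = weights.iter().map(|w| (w / scale).round() as i8).collect();
    (q, scale)
}

// Recover approximate f32 weights at execution time.
pub fn dequantize_q8(q: &[i8], scale: f32) -> Vec<f32> {
    q.iter().map(|&v| v as f32 * scale).collect()
}

fn main() {
    let w = [0.5f32, -1.0, 0.25, 0.0];
    let (q, scale) = quantize_q8(&w);
    let back = dequantize_q8(&q, scale);
    println!("{back:?} (scale {scale})");
}
```

The round trip is lossy but close, which is the trade-off that makes large models fit in a phone's memory budget.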
llama.cpp
A lightweight C++ implementation for running large language models efficiently on consumer hardware.
Used to support popular LLaMA-family and derivative models.
Offers quantization support (Q4, Q5, Q8) to make large models feasible on mobile devices.
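Why quantization makes large models feasible on mobile is easiest to see in the arithmetic. The sketch below estimates raw weight memory for a 7B-parameter model at different bit widths; it deliberately ignores the per-block scale overhead that real llama.cpp formats (Q4_0, Q5_K, Q8_0, and so on) add on top, so treat the numbers as rough lower bounds.

```rust
// Rough weight-memory estimate: parameters × bits-per-weight, in bytes.
pub fn weight_bytes(params: u64, bits_per_weight: u64) -> u64 {
    params * bits_per_weight / 8
}

fn main() {
    let params = 7_000_000_000u64; // a 7B-parameter model
    for (name, bits) in [("FP16", 16u64), ("Q8", 8), ("Q5", 5), ("Q4", 4)] {
        let gib = weight_bytes(params, bits) as f64 / (1u64 << 30) as f64;
        println!("{name}: ~{gib:.1} GiB");
    }
}
```

At FP16 a 7B model needs on the order of 13 GiB for weights alone, which exceeds most phones' RAM; at Q4 the same model drops to roughly 3.3 GiB, which fits on current flagship devices.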
Integration Model
React Native provides the interface layer, while Rust and C++ handle the heavy compute under the hood.
Models can be executed using Candle for Rust-native inference or llama.cpp for optimized LLaMA-family support.
This modular approach ensures flexibility, allowing EdgeMob to quickly integrate new frameworks as the AI ecosystem evolves.
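The modular backend idea above can be expressed as a trait boundary: the orchestrator programs against one interface, and each execution engine is just another implementation. The trait and type names below are illustrative, not EdgeMob's real code, and the `generate` bodies are stubs in place of actual model calls.

```rust
// Hypothetical abstraction letting the orchestrator treat Candle and
// llama.cpp (or any future framework) interchangeably.
pub trait InferenceBackend {
    fn name(&self) -> &'static str;
    fn generate(&self, prompt: &str) -> String;
}

struct CandleBackend;
struct LlamaCppBackend;

impl InferenceBackend for CandleBackend {
    fn name(&self) -> &'static str { "candle" }
    fn generate(&self, prompt: &str) -> String {
        format!("[candle] {prompt}") // stub; real impl runs a Candle model
    }
}

impl InferenceBackend for LlamaCppBackend {
    fn name(&self) -> &'static str { "llama.cpp" }
    fn generate(&self, prompt: &str) -> String {
        format!("[llama.cpp] {prompt}") // stub; real impl calls FFI bindings
    }
}

// Callers depend only on the trait, so adding a new framework means
// one more impl block and nothing else changes.
pub fn run(backend: &dyn InferenceBackend, prompt: &str) -> String {
    backend.generate(prompt)
}

fn main() {
    let backends: Vec<Box<dyn InferenceBackend>> =
        vec![Box::new(CandleBackend), Box::new(LlamaCppBackend)];
    for b in &backends {
        println!("{}: {}", b.name(), run(b.as_ref(), "hi"));
    }
}
```

This is the standard Rust pattern for the flexibility the section describes: the interface layer and the compute engines evolve independently on either side of the trait.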
By combining React Native, Rust, C++, Candle, and llama.cpp, EdgeMob delivers a technology stack that is cross-platform, efficient, and extensible—designed to bring high-performance AI compute into the hands of billions of mobile users.