# RuvLTRA Small

📱 Compact Model Optimized for Edge Devices

Quick Start • Use Cases • Integration
## Overview
RuvLTRA Small is a compact 0.5B-parameter model designed for edge deployment. It is well suited to mobile apps, IoT devices, and other resource-constrained environments.
## Model Card
| Property | Value |
|---|---|
| Parameters | 0.5 Billion |
| Quantization | Q4_K_M |
| Context | 4,096 tokens |
| Size | ~398 MB |
| Min RAM | 1 GB |
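As a rough sanity check on the 1 GB minimum, resident memory is approximately the weight file plus the KV cache plus runtime overhead. The sketch below is illustrative only: the layer count, KV-head count, and head dimension are hypothetical placeholders, not published architecture details of RuvLTRA Small.

```python
# Back-of-the-envelope RAM estimate: weights + KV cache + overhead.
# n_layers, n_kv_heads, and head_dim are ILLUSTRATIVE placeholders,
# not confirmed values for RuvLTRA Small.
weights_mb = 398                              # GGUF file size from the table above
n_ctx = 4096                                  # context length from the table above
n_layers, n_kv_heads, head_dim = 24, 2, 64    # hypothetical architecture
bytes_per_elem = 2                            # f16 KV cache entries

# Factor of 2 covers the K and V tensors, one entry per layer per position
kv_cache_mb = 2 * n_layers * n_ctx * n_kv_heads * head_dim * bytes_per_elem / 1e6

print(f"weights ~{weights_mb} MB, KV cache ~{kv_cache_mb:.0f} MB")
# Together with runtime overhead, this fits comfortably inside ~1 GB of RAM.
```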
## 🚀 Quick Start

```bash
# Download
wget https://huggingface.co/ruv/ruvltra-small/resolve/main/ruvltra-0.5b-q4_k_m.gguf

# Run with llama.cpp
./llama-cli -m ruvltra-0.5b-q4_k_m.gguf -p "Hello, I am" -n 64
```
## 💡 Use Cases
- Mobile Apps: On-device AI assistant
- IoT: Smart home device intelligence
- Edge Computing: Local inference without cloud
- Prototyping: Quick model experimentation
## 🔧 Integration

### Rust (RuvLLM)

```rust
use ruvllm::hub::ModelDownloader;

let path = ModelDownloader::new()
    .download("ruv/ruvltra-small", None)
    .await?;
```
### Python

```python
from huggingface_hub import hf_hub_download

# Returns the local path of the downloaded GGUF file
model_path = hf_hub_download("ruv/ruvltra-small", "ruvltra-0.5b-q4_k_m.gguf")
```
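To go from download to inference in one script, the GGUF file can be loaded with a local runner. The sketch below assumes the optional llama-cpp-python package (`pip install llama-cpp-python`); it is one possible way to run the file, not part of RuvLTRA's own tooling.

```python
from huggingface_hub import hf_hub_download
from llama_cpp import Llama  # assumes: pip install llama-cpp-python

# hf_hub_download returns the local path of the cached GGUF file
model_path = hf_hub_download("ruv/ruvltra-small", "ruvltra-0.5b-q4_k_m.gguf")

# Load with the 4,096-token context listed in the model card
llm = Llama(model_path=model_path, n_ctx=4096)

# Mirror the llama-cli quick start: 64 new tokens from a short prompt
out = llm("Hello, I am", max_tokens=64)
print(out["choices"][0]["text"])
```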
## Hardware Support

- ✅ Apple Silicon (M1/M2/M3)
- ✅ NVIDIA CUDA
- ✅ CPU (x86/ARM)
- ✅ Raspberry Pi 4/5
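On machines with a supported accelerator (Metal on Apple Silicon, CUDA on NVIDIA), layers can be offloaded to the GPU; on CPU-only boards such as the Raspberry Pi the same code runs unmodified. A minimal sketch, again assuming llama-cpp-python built with the matching backend:

```python
from llama_cpp import Llama

# n_gpu_layers=-1 offloads every layer to the GPU backend when one is
# available (Metal/CUDA); without a GPU backend it is simply ignored.
llm = Llama(
    model_path="ruvltra-0.5b-q4_k_m.gguf",
    n_ctx=4096,
    n_gpu_layers=-1,
)
print(llm("Edge devices are", max_tokens=32)["choices"][0]["text"])
```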
License: Apache 2.0 | GitHub: ruvnet/ruvector