Welcome to ModelFlux
Your private workspace for interacting with Large Language Models, both local and remote.
Getting Started
ModelFlux supports multiple providers: cloud APIs (OpenAI, Anthropic), local servers (Ollama), and on-device models (ExecuTorch, Llama.cpp). Choose what works best for your needs.
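A provider entry boils down to a kind plus either credentials, a server URL, or a model file. The sketch below is illustrative only; the type and field names are assumptions, not ModelFlux's actual configuration schema.

```typescript
// Hypothetical shape of a configured provider; names are illustrative.
type Provider =
  | { kind: "openai" | "anthropic"; apiKey: string }        // cloud API
  | { kind: "ollama" | "openai-compat"; baseUrl: string }   // local/custom server
  | { kind: "executorch" | "llamacpp"; modelPath: string }; // on-device model file

// Example: a local Ollama server (default port 11434) next to an on-device model.
const providers: Provider[] = [
  { kind: "ollama", baseUrl: "http://localhost:11434" },
  { kind: "llamacpp", modelPath: "/models/tiny-chat.gguf" },
];
```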
Core Features
LLM Providers
Connect to OpenAI, Anthropic, Ollama, or any OpenAI-compatible API endpoint.
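Every OpenAI-compatible server exposes the same chat completions route, so one request shape covers OpenAI itself, Ollama's compatibility endpoint, and custom servers. A minimal sketch, assuming placeholder values for baseUrl, apiKey, and the model id:

```typescript
// Minimal chat request against any OpenAI-compatible endpoint.
// For Ollama, baseUrl would be http://localhost:11434/v1 and apiKey can be empty.
async function chat(baseUrl: string, apiKey: string, prompt: string): Promise<string> {
  const res = await fetch(`${baseUrl}/chat/completions`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      ...(apiKey ? { Authorization: `Bearer ${apiKey}` } : {}),
    },
    body: JSON.stringify({
      model: "gpt-4", // any model id the endpoint serves
      messages: [{ role: "user", content: prompt }],
    }),
  });
  if (!res.ok) throw new Error(`HTTP ${res.status}`);
  const data = await res.json();
  return data.choices[0].message.content; // standard OpenAI response shape
}
```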
Local Models
Run AI models on your device with ExecuTorch (.pte) or Llama.cpp (.gguf) support.
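The file extension tells the app which runtime a model belongs to. A minimal sketch; runtimeForModel is a hypothetical helper, not an actual ModelFlux API:

```typescript
// Hypothetical helper: choose an on-device runtime by model file extension.
type LocalRuntime = "executorch" | "llamacpp";

function runtimeForModel(path: string): LocalRuntime {
  if (path.endsWith(".pte")) return "executorch"; // ExecuTorch export format
  if (path.endsWith(".gguf")) return "llamacpp";  // llama.cpp quantized format
  throw new Error(`Unsupported model format: ${path}`);
}
```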
RAG & Sources
Add documents to your conversations for context-aware AI responses.
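Context-aware answers come down to placing retrieved document text ahead of the user's question. A rough illustration, not ModelFlux's internal pipeline; SourceChunk and buildRagMessages are hypothetical names:

```typescript
// Hypothetical sketch: ground a question in retrieved document chunks.
interface SourceChunk {
  title: string;
  text: string;
}

function buildRagMessages(chunks: SourceChunk[], question: string) {
  const context = chunks.map((c) => `[${c.title}]\n${c.text}`).join("\n\n");
  return [
    { role: "system", content: `Answer using only these sources:\n\n${context}` },
    { role: "user", content: question },
  ];
}
```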
Personas
Create custom AI personalities with system prompts (Character Card V2 spec).
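A Character Card V2 persona is a JSON object with spec: "chara_card_v2" and a data block carrying the prompts. An abridged example; the field values are made up, and the full spec defines more fields (scenario, mes_example, and so on):

```typescript
// Abridged Character Card V2 persona; values are illustrative.
const persona = {
  spec: "chara_card_v2",
  spec_version: "2.0",
  data: {
    name: "Research Assistant",
    description: "A concise, citation-minded helper.",
    personality: "precise, friendly",
    first_mes: "Hi! What are we digging into today?",
    system_prompt: "You are a careful research assistant. Cite sources when you can.",
  },
};
```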
Supported Providers
- OpenAI - GPT-4, GPT-3.5
- Anthropic - Claude 3.5, Claude 3
- Ollama - Local server
- OpenAI Spec - Custom endpoints
- ExecuTorch - On-device (.pte)
- Llama.cpp - On-device (.gguf)
Platform Support
Android
Full support including local models with ExecuTorch and Llama.cpp.
Tablet
Optimized layout for larger screens with full feature support.
Web
Remote providers only (OpenAI, Anthropic). Local models require the native app.
Web Support Limitations
Important Note for Web Users
Due to browser security restrictions (CORS and Mixed Content), the Web version of ModelFlux only supports remote providers (like OpenAI and Anthropic).
To use local models or connect to a local Ollama server, please use our Android application.
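For illustration, this is the kind of call a browser refuses: a hosted web page fetching a plain-http local server. http://localhost:11434 is Ollama's default address and /api/tags is its model-list endpoint; the request is blocked client-side, which is why local servers need the native app:

```typescript
// Fails in the Web build: the browser blocks the cross-origin, plain-http
// request (CORS / Mixed Content) before it ever reaches Ollama.
async function listLocalModels(): Promise<void> {
  const res = await fetch("http://localhost:11434/api/tags"); // Ollama model list
  console.log(await res.json()); // reached only outside the browser sandbox
}
```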
Data & Privacy
- Local Storage - All conversations and settings are stored on your device using WatermelonDB (see the schema sketch after this list)
- No Tracking - We don't collect any analytics or usage data
- Export/Import - Back up your data anytime as JSON files
- Open Source - Full transparency with source code on GitHub
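For the curious, WatermelonDB schemas are declared in code with its appSchema and tableSchema helpers. The tables and columns below are a hypothetical sketch, not ModelFlux's actual schema:

```typescript
import { appSchema, tableSchema } from "@nozbe/watermelondb";

// Hypothetical on-device schema; table and column names are illustrative.
export const schema = appSchema({
  version: 1,
  tables: [
    tableSchema({
      name: "conversations",
      columns: [
        { name: "title", type: "string" },
        { name: "created_at", type: "number" },
      ],
    }),
    tableSchema({
      name: "messages",
      columns: [
        { name: "conversation_id", type: "string", isIndexed: true },
        { name: "role", type: "string" },
        { name: "content", type: "string" },
      ],
    }),
  ],
});
```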