Welcome to ModelFlux

Your private workspace for interacting with Large Language Models, both local and remote.

Getting Started

ModelFlux supports multiple providers: cloud APIs (OpenAI, Anthropic), local servers (Ollama), and on-device models (ExecuTorch, Llama.cpp). Choose what works best for your needs.


Core Features

Supported Providers

  • OpenAI - GPT-4, GPT-3.5
  • Anthropic - Claude 3.5, Claude 3
  • Ollama - Local server
  • OpenAI Spec - Custom endpoints (see the sketch after this list)
  • ExecuTorch - On-device (.pte)
  • Llama.cpp - On-device (.gguf)
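
For the OpenAI Spec provider, any server that speaks the OpenAI chat completions API can serve as a custom endpoint: all it needs is a base URL, an API key, and a model name. The TypeScript sketch below is illustrative only; the field names are hypothetical, not ModelFlux's actual settings schema:

    // Hypothetical sketch: what an OpenAI-compatible custom endpoint needs.
    // Field names here are illustrative, not ModelFlux's actual schema.
    const provider = {
      name: "My local gateway",
      baseUrl: "http://192.168.1.10:8080/v1", // any OpenAI-spec server
      apiKey: "sk-placeholder",               // some local servers ignore this
      model: "llama-3-8b-instruct",
    };

    // An OpenAI-compatible server exposes POST {baseUrl}/chat/completions.
    async function chat(prompt: string): Promise<string> {
      const res = await fetch(`${provider.baseUrl}/chat/completions`, {
        method: "POST",
        headers: {
          "Content-Type": "application/json",
          Authorization: `Bearer ${provider.apiKey}`,
        },
        body: JSON.stringify({
          model: provider.model,
          messages: [{ role: "user", content: prompt }],
        }),
      });
      const data = await res.json();
      return data.choices[0].message.content;
    }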

Platform Support

Android

Full support including local models with ExecuTorch and Llama.cpp.

Tablet

Optimized layout for larger screens with full feature support.

Web

Remote providers only (OpenAI, Anthropic). Local models require the native app.

Web Support Limitations

Important Note for Web Users

Due to browser security restrictions (CORS and Mixed Content), the Web version of ModelFlux only supports remote providers (like OpenAI and Anthropic).

To use local models or connect to a local Ollama server, please use our Android application.
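
To make the restriction concrete, here is a small TypeScript sketch of what happens when a hosted web page tries to call a local Ollama server directly (Ollama listens on port 11434 by default, and GET /api/tags is its model-listing endpoint):

    // Sketch: why the Web build cannot talk to a local Ollama server.
    // From a hosted HTTPS page, the browser stops this request under the
    // rules noted above (Mixed Content and/or CORS), so it never reaches
    // Ollama at all.
    async function probeLocalOllama(): Promise<void> {
      try {
        const res = await fetch("http://localhost:11434/api/tags");
        console.log(await res.json()); // not reached from the Web build
      } catch (err) {
        console.error("Blocked by the browser, not by Ollama:", err);
      }
    }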

Data & Privacy

  • Local Storage - All conversations and settings are stored on your device using WatermelonDB
  • No Tracking - We don't collect any analytics or usage data
  • Export/Import - Back up your data anytime as JSON files (see the sketch below)
  • Open Source - Full transparency with source code on GitHub
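
The export format itself isn't documented here; as a purely hypothetical illustration, a JSON backup of this kind of data might be described by a TypeScript shape like the following (field names are invented, not ModelFlux's real export schema):

    // Purely illustrative: a plausible shape for a JSON backup file.
    // Field names are invented, not ModelFlux's real export schema.
    interface BackupFile {
      version: number;
      exportedAt: string; // ISO 8601 timestamp
      settings: Record<string, unknown>;
      conversations: Array<{
        id: string;
        title: string;
        messages: Array<{
          role: "user" | "assistant";
          content: string;
          createdAt: string;
        }>;
      }>;
    }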

Need Help?