LLM Providers
Configure cloud and local server providers to power your conversations.
Cloud Providers
OpenAI
Access GPT-4, GPT-3.5, and other OpenAI models.
Setup Steps:
- Get your API key from the OpenAI Platform
- Go to Settings → Manage LLM Providers → Add Provider
- Select "OpenAI" as the provider type
- Enter your API key and save
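To sanity-check a key before adding it, here is a minimal sketch (separate from the app) that calls OpenAI's public chat completions endpoint directly; the model name and prompt are placeholders:

```python
import json
import urllib.request

API_KEY = "sk-..."  # placeholder: your OpenAI API key

# Minimal non-streaming request to OpenAI's chat completions endpoint.
req = urllib.request.Request(
    "https://api.openai.com/v1/chat/completions",
    data=json.dumps({
        "model": "gpt-4o-mini",  # placeholder: any model your key can access
        "messages": [{"role": "user", "content": "Say hello."}],
    }).encode(),
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
)
with urllib.request.urlopen(req) as resp:
    body = json.load(resp)
print(body["choices"][0]["message"]["content"])
```

A 401 response here means the key itself is wrong, which saves you from debugging the app configuration.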
Anthropic
Access Claude models including Claude 3.5 Sonnet, Claude 3 Opus, and more.
Setup Steps:
- Get your API key from the Anthropic Console
- Go to Settings → Manage LLM Providers → Add Provider
- Select "Anthropic" as the provider type
- Enter your API key and save
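The same kind of check works for Anthropic's Messages API; in this sketch the model name is a placeholder, and the `anthropic-version` header uses the documented `2023-06-01` value:

```python
import json
import urllib.request

API_KEY = "sk-ant-..."  # placeholder: your Anthropic API key

# Minimal request to Anthropic's Messages API.
req = urllib.request.Request(
    "https://api.anthropic.com/v1/messages",
    data=json.dumps({
        "model": "claude-3-5-sonnet-20241022",  # placeholder model name
        "max_tokens": 64,
        "messages": [{"role": "user", "content": "Say hello."}],
    }).encode(),
    headers={
        "x-api-key": API_KEY,
        "anthropic-version": "2023-06-01",
        "content-type": "application/json",
    },
)
with urllib.request.urlopen(req) as resp:
    body = json.load(resp)
print(body["content"][0]["text"])
```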
OpenAI-Compatible APIs
Connect to any service that implements the OpenAI API format, such as LM Studio, Together AI, or Groq.
Setup Steps:
- Go to Settings → Manage LLM Providers → Add Provider
- Select "OpenAI Spec" as the provider type
- Enter the base URL of your API endpoint
- Add your API key if required
- Specify a default model name
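Because these services share the OpenAI request shape, only the base URL (and possibly the key and model) changes between them. A sketch under that assumption, using Groq's base URL and a placeholder model as example values; omit the Authorization header if your server does not require a key:

```python
import json
import urllib.request

# Example values only; substitute your own service's base URL, key, and model.
BASE_URL = "https://api.groq.com/openai/v1"
API_KEY = "..."                  # placeholder; some local servers need no key
MODEL = "llama-3.1-8b-instant"   # placeholder default model name

req = urllib.request.Request(
    f"{BASE_URL}/chat/completions",  # same path as OpenAI proper
    data=json.dumps({
        "model": MODEL,
        "messages": [{"role": "user", "content": "Say hello."}],
    }).encode(),
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
)
with urllib.request.urlopen(req) as resp:
    print(json.load(resp)["choices"][0]["message"]["content"])
```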
Local Server Providers
Ollama
Connect to a locally running Ollama server for privacy and offline use.
Prerequisites:
- Install Ollama from ollama.ai
- Pull a model: ollama pull llama3.2
- Ensure Ollama is running (default: http://localhost:11434)
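Before configuring the app, you can confirm the server is reachable; this sketch queries Ollama's /api/tags endpoint, which lists the models you have pulled:

```python
import json
import urllib.request

# A successful response from /api/tags means the Ollama server is up.
with urllib.request.urlopen("http://localhost:11434/api/tags") as resp:
    models = json.load(resp)["models"]
print([m["name"] for m in models])  # e.g. ['llama3.2:latest']
```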
Setup Steps:
- Go to Settings → Manage LLM Providers → Add Provider
- Select "Ollama" as the provider type
- Enter the server URL (default: http://localhost:11434)
- Test the connection to verify it works
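The in-app connection test boils down to reaching the server over HTTP. An equivalent manual check sends one non-streaming request to Ollama's /api/generate endpoint (the model name assumes the pull command from the prerequisites above):

```python
import json
import urllib.request

# One-shot generation request; stream=False returns a single JSON object.
req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps({
        "model": "llama3.2",    # the model pulled in the prerequisites
        "prompt": "Say hello.",
        "stream": False,
    }).encode(),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.load(resp)["response"])
```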
Note: Ollama connection is not available in the web version due to browser CORS restrictions. Use the Android app for local Ollama support.
On-Device Providers
For completely local, on-device inference without any server, see the Local Models documentation.