Model providers

Connect Anthropic, OpenRouter, or an OpenAI-compatible endpoint for agent model calls.

Supported providers

dispatchmy.ai supports Anthropic, OpenRouter, and any OpenAI-compatible endpoint (Ollama, LM Studio, vLLM, or your own server).

Cloud provider keys are saved once in Settings. The daemon injects them server-side when it proxies a model call, so the key never reaches the agent container.
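The injection step can be pictured as follows. This is an illustrative sketch, not dispatchmy.ai's actual implementation: the function name and settings dict are assumptions, though the header names (`x-api-key` for Anthropic, `Authorization: Bearer` for OpenAI-compatible APIs) match those providers' real conventions.

```python
# Hypothetical sketch: the daemon holds provider keys from Settings and
# attaches one to each outgoing request, so agent containers never see it.
PROVIDER_KEYS = {"anthropic": "sk-ant-example"}  # illustrative placeholder key

def inject_provider_key(provider: str, headers: dict) -> dict:
    """Return a copy of the agent's request headers with the key added."""
    out = dict(headers)
    if provider == "anthropic":
        out["x-api-key"] = PROVIDER_KEYS[provider]   # Anthropic header
    else:
        # OpenAI-compatible providers expect a bearer token instead
        out["Authorization"] = f"Bearer {PROVIDER_KEYS.get(provider, '')}"
    return out

agent_headers = {"content-type": "application/json"}  # no key from the agent
proxied = inject_provider_key("anthropic", agent_headers)
print("x-api-key" in proxied)
```

The agent's original headers are left untouched; only the proxied copy carries the credential.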

Local endpoints

Use an OpenAI-compatible endpoint for local model servers. Configure the base URL, model name, and whether the model supports tool calling.
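As a rough sketch, a request to a local server looks like any OpenAI-style chat completion call. The base URL below assumes Ollama's default port, and the model name is a placeholder; LM Studio and vLLM expose the same `/v1/chat/completions` route on their own ports.

```python
import json
import urllib.request

BASE_URL = "http://localhost:11434/v1"  # assumption: Ollama's default local port

payload = {
    "model": "llama3.1",  # use whatever model name your server reports
    "messages": [{"role": "user", "content": "ping"}],
}

req = urllib.request.Request(
    f"{BASE_URL}/chat/completions",
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# urllib.request.urlopen(req) would send it; omitted so the sketch runs offline.
print(req.full_url)
```

The same shape works for any OpenAI-compatible server; only the base URL and model name change.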

Tool calling

Some local models cannot emit tool calls reliably. Leave tool calling disabled for such endpoints until you have verified that the model produces well-formed tool calls.
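Disabling the flag amounts to omitting tool definitions from the request entirely. A minimal sketch, assuming a `supports_tools` configuration flag (an illustration, not a documented dispatchmy.ai setting); the tool schema shown is the standard OpenAI function-calling format.

```python
# Hypothetical example tool in OpenAI function-calling format.
TOOLS = [{
    "type": "function",
    "function": {
        "name": "read_file",
        "description": "Read a file from the workspace",
        "parameters": {
            "type": "object",
            "properties": {"path": {"type": "string"}},
            "required": ["path"],
        },
    },
}]

def build_request(model: str, messages: list, supports_tools: bool) -> dict:
    """Omit the tools field entirely when the model can't use tools."""
    body = {"model": model, "messages": messages}
    if supports_tools:
        body["tools"] = TOOLS
    return body

req = build_request("my-local-model", [{"role": "user", "content": "hi"}],
                    supports_tools=False)
print("tools" in req)
```

With the flag off, the model never sees tool definitions and responds with plain text.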