LLM providers

DlxAI is LLM-neutral. Use cloud credits (the default, routed through OpenRouter), bring your own API keys for any major model, or connect an existing subscription plan.

Three access modes

  • Credits (default): Just sign in. The DlxAI backend pays for OpenRouter calls and deducts credits per token.
  • Self-hosted: Bring your own API keys. Requests go straight from your machine to the provider, with no DlxAI cloud dependency.
  • Subscription / coding plans: Connect an existing subscription account (Claude, Gemini, Kimi, and others). No API key needed.
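
The three modes differ only in where credentials come from. A rough sketch of that decision (the mode names, the `resolve_credentials` function, and the config shape are hypothetical, not DlxAI's actual API):

```python
# Hypothetical sketch of how an access mode resolves to a credential source.
# All names and the config shape are illustrative, not DlxAI's real schema.

def resolve_credentials(config: dict) -> str:
    mode = config.get("mode", "credits")
    if mode == "credits":
        # Default: requests go through the DlxAI backend, which pays for
        # OpenRouter calls and deducts credits per token.
        return "dlxai-backend"
    if mode == "byok":
        # Self-hosted: your own API key, sent directly to the provider.
        return f"api-key:{config['provider']}"
    if mode == "subscription":
        # Subscription plans: a linked account, no API key involved.
        return f"account:{config['account']}"
    raise ValueError(f"unknown mode: {mode}")
```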

Supported providers

Mainstream

  • OpenAI (GPT-4 / 4o / o1)
  • Anthropic (Claude Opus / Sonnet / Haiku)
  • Google (Gemini Pro / Flash)
  • xAI (Grok)

Open source / cost-effective

  • DeepSeek
  • Mistral
  • Groq (ultra-fast inference)
  • OpenRouter (aggregates everything)

Chinese providers

  • Zhipu AI / Z.ai (GLM family)
  • Moonshot / Kimi
  • Qwen (Alibaba)
  • Volcengine / Doubao (ByteDance)
  • MiniMax
  • Xiaomi / MiMo

Local

  • Ollama — run open-source models on your own machine, zero cloud
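
Ollama serves a plain HTTP API on localhost, so any provider-neutral client can target it. A minimal sketch against Ollama's documented /api/generate endpoint, assuming Ollama is running locally on its default port with a model such as llama3 already pulled:

```python
import json
import urllib.request

# Ollama's default local endpoint.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str) -> dict:
    # stream=False asks Ollama for a single JSON reply instead of a stream.
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model: str, prompt: str) -> str:
    data = json.dumps(build_request(model, prompt)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["response"]

# generate("llama3", "Say hello")  # requires a running Ollama instance
```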

Enterprise / cloud-native

  • Amazon Bedrock
  • NVIDIA NIM
  • Venice AI

Add an API key

  1. Click Providers in the sidebar
  2. Pick a provider and click "Add key"
  3. Paste the key (encrypted on save via the OS keystore: Keychain on macOS, DPAPI on Windows)
  4. Pick a default model
  5. Save — applies immediately
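
After the steps above, the saved entry roughly amounts to a record like the following. The field names here are illustrative only; DlxAI's real schema may differ:

```python
# Hypothetical shape of a saved provider entry. Field names are
# illustrative; the api_key value is a placeholder, never stored in
# plain text (it is encrypted via the OS keystore on save).
provider_entry = {
    "provider": "anthropic",
    "api_key": "<pasted key>",
    "default_model": "claude-sonnet",
    "enabled": True,  # takes effect immediately on save
}
```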

Per-provider proxy

In a restricted region? Configure HTTP or SOCKS5 proxies per provider or per API key. The DlxAI proxy router matches traffic by hostname, so only the configured provider's requests go through the proxy; all other traffic connects directly.

Hot reload: add a key, swap providers, or change a proxy, and the change applies instantly without restarting the gateway.