Ollama Provider

The Ollama provider connects Swixter to a local Ollama instance, letting you run models on your own hardware.

swixter claude create my-ollama-profile \
--provider ollama \
--base-url http://localhost:11434/v1 \
--model llama3
Provider defaults:

Field              Value
Provider ID        ollama
Wire API           chat
Default Base URL   http://localhost:11434/v1
Env Key            OLLAMA_API_KEY
  1. Install Ollama: https://ollama.com
  2. Pull a model: ollama pull llama3
  3. Create a Swixter profile:
swixter claude create local \
--provider ollama \
--base-url http://localhost:11434/v1 \
--model llama3
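
Before pointing a coder at this profile, you can sanity-check the endpoint directly. This is a plain curl call against Ollama's OpenAI-compatible chat endpoint, assuming the llama3 model pulled in step 2:

curl http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "llama3", "messages": [{"role": "user", "content": "Say hello"}]}'

A JSON response with an "object": "chat.completion" field means the endpoint is ready.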

Ollama listens on port 11434 by default, and its OpenAI-compatible endpoints are served under the /v1 path:

http://localhost:11434/v1
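
To confirm the path, the same endpoint also exposes a model listing that shows everything you have pulled locally:

curl http://localhost:11434/v1/models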

If you’re running Ollama on a different machine, update the base URL:

swixter claude create remote-ollama \
--provider ollama \
--base-url http://192.168.1.100:11434/v1
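
Note that Ollama binds only to localhost by default, so a remote instance won't be reachable until it listens on an external interface. On the remote machine, one way to do that is to set OLLAMA_HOST before starting the server:

OLLAMA_HOST=0.0.0.0 ollama serve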

Ollama doesn’t require an API key by default. Swixter uses OLLAMA_API_KEY as the environment variable name for compatibility, but you can leave it empty or set a placeholder.
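
For example, if a coder refuses to run with an empty key, any placeholder value works, since a default Ollama install ignores it:

export OLLAMA_API_KEY=ollama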

Compatibility notes:

  • Compatible with Claude Code, Codex, and Continue.dev
  • The wire_api: "chat" type works with all coders
  • For Codex, the env_key defaults to OLLAMA_API_KEY; override it with --env-key if needed:
swixter codex create ollama-codex \
--provider ollama \
--env-key MY_CUSTOM_KEY
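
If you use a custom key name like this, export the matching variable before launching Codex; the value is a placeholder, since a default Ollama install does not check it:

export MY_CUSTOM_KEY=placeholder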