# Ollama Provider
The Ollama provider connects Swixter to a local Ollama instance so you can run models on your own machine.
## Configuration

```sh
swixter claude create my-ollama-profile \
  --provider ollama \
  --base-url http://localhost:11434/v1 \
  --model llama3
```

## Provider Details

| Field | Value |
|---|---|
| Provider ID | `ollama` |
| Wire API | `chat` |
| Default Base URL | `http://localhost:11434/v1` |
| Env Key | `OLLAMA_API_KEY` |
- Install Ollama: https://ollama.com
- Pull a model:

  ```sh
  ollama pull llama3
  ```

- Create a Swixter profile:

  ```sh
  swixter claude create local \
    --provider ollama \
    --base-url http://localhost:11434/v1 \
    --model llama3
  ```

## Base URL

Ollama typically runs on port 11434. The API path is `/v1` for OpenAI-compatible endpoints:
```
http://localhost:11434/v1
```

If you're running Ollama on a different machine, update the base URL:
```sh
swixter claude create remote-ollama \
  --provider ollama \
  --base-url http://192.168.1.100:11434/v1
```

## API Key

Ollama doesn't require an API key by default. Swixter uses `OLLAMA_API_KEY` as the environment variable name for compatibility, but you can leave it empty or set a placeholder.
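Some OpenAI-compatible clients refuse to start when the key variable is empty, so a common workaround is to export a throwaway value. The variable name below is the one Swixter uses for Ollama; the value itself is an arbitrary placeholder, not a real credential:

```sh
# Ollama ignores the key, but a non-empty placeholder satisfies strict clients.
export OLLAMA_API_KEY="ollama-local"
echo "$OLLAMA_API_KEY"
```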
## Compatibility

- Compatible with Claude Code, Codex, and Continue.dev
- The `wire_api: "chat"` type works with all coders
- For Codex, the `env_key` defaults to `OLLAMA_API_KEY`
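A quick way to confirm that the endpoint a coder will talk to is actually up is to call Ollama's OpenAI-compatible chat route directly. This sketch assumes Ollama is running locally and that `llama3` has been pulled:

```sh
# Minimal chat completion request against the OpenAI-compatible endpoint.
# A working setup returns JSON containing a "choices" array.
curl -s http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "llama3",
    "messages": [{"role": "user", "content": "Say hello"}]
  }'
```

If the request hangs or the connection is refused, check that `ollama serve` is running and that the port in your profile's base URL matches.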
## Custom env_key for Codex

```sh
swixter codex create ollama-codex \
  --provider ollama \
  --env-key MY_CUSTOM_KEY
```
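When you override `env_key`, Codex reads whichever variable you named, so export it before launching. `MY_CUSTOM_KEY` is just the example name from the command above; since Ollama doesn't validate the key, any non-empty value works:

```sh
# Codex looks up the variable named by --env-key; Ollama ignores its value.
export MY_CUSTOM_KEY="ollama-local"
echo "$MY_CUSTOM_KEY"
```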