infinicode-ollama v1.0.3
# opencode-ollama

AI coding agent with native Ollama support. A fork of OpenCode, modified to work seamlessly with any Ollama server.
## Features

- 🚀 Native Ollama integration - no hacks or workarounds
- 🌐 Connect to any Ollama server
- 🤖 Auto-detects available models on your Ollama server
- 💻 Works with local or remote Ollama instances
- ⚡ Same great OpenCode TUI experience
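The "connect to any server" behavior above presumably follows the usual Ollama convention: honor `OLLAMA_HOST` when set, otherwise fall back to the local default endpoint. A minimal sketch of that resolution logic (the function name and exact fallback behavior are assumptions for illustration, not the actual implementation):

```python
import os

DEFAULT_HOST = "http://127.0.0.1:11434"  # Ollama's standard local endpoint

def resolve_ollama_host(env=None):
    """Pick the Ollama base URL: OLLAMA_HOST if set, else the local default.

    A bare host:port (no scheme) is assumed to mean HTTP, mirroring how
    OLLAMA_HOST values like "0.0.0.0:11434" are commonly written.
    """
    env = os.environ if env is None else env
    host = env.get("OLLAMA_HOST", "").strip()
    if not host:
        return DEFAULT_HOST
    if not host.startswith(("http://", "https://")):
        host = "http://" + host
    return host.rstrip("/")

print(resolve_ollama_host({}))                                      # → http://127.0.0.1:11434
print(resolve_ollama_host({"OLLAMA_HOST": "192.168.1.100:11434"}))  # → http://192.168.1.100:11434
```

Passing a plain dict instead of `os.environ` keeps the helper easy to test without touching the real environment.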
## Installation

```sh
npm install -g opencode-ollama
```

## Usage
### Connect to Local Ollama

```sh
opencode-ollama
```

### Connect to Remote Ollama Server

```sh
OLLAMA_HOST=http://192.168.1.100:11434 opencode-ollama
```

### Set Default Server

Add to your `.bashrc` or `.zshrc`:

```sh
export OLLAMA_HOST=http://your-ollama-server:11434
```

Then just run:

```sh
opencode-ollama
```

## Supported Models
Works with any model available on your Ollama server:
- `qwen3:32b`, `qwen3.5:35b`
- `deepseek-r1`
- `llama3`, `codellama`
- `mistral`, `mixtral`
- Any custom fine-tuned models
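Model auto-detection presumably works by querying the server's `/api/tags` endpoint, which returns the models installed on that Ollama instance. A sketch of extracting the model names (the helper name and the hard-coded sample response are illustrative; a live client would fetch the JSON from `OLLAMA_HOST` over HTTP):

```python
import json

# Illustrative sample of the JSON shape returned by GET /api/tags
# (fields trimmed; a real response also carries size, digest, etc.)
SAMPLE_TAGS_RESPONSE = json.dumps({
    "models": [
        {"name": "qwen3:32b"},
        {"name": "deepseek-r1:latest"},
        {"name": "llama3:latest"},
    ]
})

def detect_models(tags_json):
    """Extract model names from an /api/tags response body."""
    payload = json.loads(tags_json)
    return [m["name"] for m in payload.get("models", [])]

print(detect_models(SAMPLE_TAGS_RESPONSE))
# → ['qwen3:32b', 'deepseek-r1:latest', 'llama3:latest']
```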
## Ollama Setup

Make sure Ollama is configured to accept network connections:

```sh
# On your Ollama server
OLLAMA_HOST=0.0.0.0:11434 ollama serve
```

Or via systemd:

```sh
sudo systemctl edit ollama
# Add: Environment="OLLAMA_HOST=0.0.0.0:11434"
sudo systemctl restart ollama
```

## Original Project
This is a fork of OpenCode with added Ollama support.
## License
MIT
