@kud/mcp-opencode
MCP server for opencode — query github-copilot models via a persistent opencode server.
```
███╗ ███╗ ██████╗██████╗ ██████╗ ██████╗ ███████╗███╗ ██╗ ██████╗ ██████╗ ██████╗ ███████╗
████╗ ████║██╔════╝██╔══██╗ ██╔═══██╗██╔══██╗██╔════╝████╗ ██║██╔════╝██╔═══██╗██╔══██╗██╔════╝
██╔████╔██║██║ ██████╔╝ ██║ ██║██████╔╝█████╗ ██╔██╗ ██║██║ ██║ ██║██║ ██║█████╗
██║╚██╔╝██║██║ ██╔═══╝ ██║ ██║██╔═══╝ ██╔══╝ ██║╚██╗██║██║ ██║ ██║██║ ██║██╔══╝
██║ ╚═╝ ██║╚██████╗██║ ╚██████╔╝██║ ███████╗██║ ╚████║╚██████╗╚██████╔╝██████╔╝███████╗
╚═╝ ╚═╝ ╚═════╝╚═╝ ╚═════╝ ╚═╝ ╚══════╝╚═╝ ╚═══╝ ╚═════╝ ╚═════╝ ╚═════╝ ╚══════╝
```

Query any opencode model from your AI assistant — no API key required.
Features • Quick Start • Installation • Tools • Development
Features
- 🤖 Query any model — send prompts to anthropic, github-copilot, google-vertex, and more
- 🔍 Discover models — list all models your opencode is configured for, optionally filtered by provider
- 🚀 Zero config auth — no API tokens; delegates authentication entirely to opencode
- ⚡ Auto-start — spins up the opencode server automatically if it isn't running
- 🛡️ Allow/block filters — restrict which models are accessible via OPENCODE_MODEL_ALLOW / OPENCODE_MODEL_BLOCK
- 🔌 Works with any opencode provider — anthropic, github-copilot, google-vertex, and any others you've configured
Quick Start
Prerequisites
- opencode installed and at least one provider configured
- Node.js ≥ 20
Install
```
npm install -g @kud/mcp-opencode
```

Minimal Claude Code config
```yaml
opencode:
  transport: stdio
  command: npx
  args:
    - -y
    - "@kud/mcp-opencode"
```

Installation
Add to ~/.claude/claude_code_config.yml (or your profile config):
```yaml
opencode:
  transport: stdio
  command: npx
  args:
    - -y
    - "@kud/mcp-opencode"
```

Then run:
```
my ai client claude-code
```

Edit ~/Library/Application Support/Claude/claude_desktop_config.json:
```json
{
  "mcpServers": {
    "opencode": {
      "command": "npx",
      "args": ["-y", "@kud/mcp-opencode"]
    }
  }
}
```

Restart Claude Desktop.
Edit %APPDATA%\Claude\claude_desktop_config.json:
```json
{
  "mcpServers": {
    "opencode": {
      "command": "npx",
      "args": ["-y", "@kud/mcp-opencode"]
    }
  }
}
```

Restart Claude Desktop.
In Cursor settings → MCP → Add server:
```json
{
  "opencode": {
    "command": "npx",
    "args": ["-y", "@kud/mcp-opencode"]
  }
}
```

Edit ~/.codeium/windsurf/mcp_config.json:
```json
{
  "mcpServers": {
    "opencode": {
      "command": "npx",
      "args": ["-y", "@kud/mcp-opencode"]
    }
  }
}
```

Edit .vscode/mcp.json in your workspace:
```json
{
  "servers": {
    "opencode": {
      "type": "stdio",
      "command": "npx",
      "args": ["-y", "@kud/mcp-opencode"]
    }
  }
}
```

Available Tools
Querying
| Tool | Description |
| ------- | ---------------------------------------------------------- |
| query | Send a prompt to an opencode model and return the response |
Discovery
| Tool | Description |
| ------------- | ----------------------------------------------------------- |
| list_models | List all available models; pass a provider name to filter |
Total: 2 Tools
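
To make the tool interface concrete, here is what a standard MCP `tools/call` request for `query` looks like on the wire. The request envelope follows the MCP JSON-RPC format; the argument names (`model`, `prompt`) are an assumption based on the tool descriptions in this README, not a verified schema:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "query",
    "arguments": {
      "model": "github-copilot/gpt-4.1",
      "prompt": "Explain what a monad is in two sentences."
    }
  }
}
```

In practice your AI client builds this request for you; it is shown here only to clarify what the tool table above maps to.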
Model Filtering
Restrict which models are accessible using environment variables:
| Variable | Description | Example |
| ---------------------- | ----------------------------------------------------------- | ------------------------------ |
| OPENCODE_MODEL_ALLOW | Comma-separated allowlist (supports provider/* wildcards) | github-copilot/*,anthropic/* |
| OPENCODE_MODEL_BLOCK | Comma-separated blocklist | anthropic/claude-opus-4-6 |
Both exact model IDs (anthropic/claude-sonnet-4-6) and provider wildcards (github-copilot/*) are supported. If OPENCODE_MODEL_ALLOW is unset, all models are allowed.
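
To make the matching rules concrete, here is a minimal TypeScript sketch of how an allow/block filter with `provider/*` wildcards can be evaluated. The function names are illustrative, not the package's actual internals, and the assumption that the blocklist takes precedence over the allowlist is mine:

```typescript
function parseList(value: string | undefined): string[] {
  // "a, b ,c" -> ["a", "b", "c"]; unset or empty -> []
  return (value ?? "").split(",").map((s) => s.trim()).filter(Boolean);
}

function matches(pattern: string, modelId: string): boolean {
  if (pattern.endsWith("/*")) {
    // "github-copilot/*" matches any model under that provider
    return modelId.startsWith(pattern.slice(0, -1));
  }
  return pattern === modelId; // exact model ID match
}

export function isModelAllowed(
  modelId: string,
  allow = process.env.OPENCODE_MODEL_ALLOW,
  block = process.env.OPENCODE_MODEL_BLOCK,
): boolean {
  // Blocklist wins (assumption), then an unset allowlist permits everything.
  if (parseList(block).some((p) => matches(p, modelId))) return false;
  const allowList = parseList(allow);
  return allowList.length === 0 || allowList.some((p) => matches(p, modelId));
}
```

With the example config below, `github-copilot/gpt-4.1` would pass while `anthropic/claude-opus-4-6` would be rejected.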
Example config with filtering:
```json
{
  "mcpServers": {
    "opencode": {
      "command": "npx",
      "args": ["-y", "@kud/mcp-opencode"],
      "env": {
        "OPENCODE_MODEL_ALLOW": "github-copilot/*,anthropic/*",
        "OPENCODE_MODEL_BLOCK": "anthropic/claude-opus-4-6"
      }
    }
  }
}
```

Example Conversations
"What models do I have access to?" → Calls
list_models, returns all configured models grouped by provider.
"List only my Anthropic models." → Calls
list_modelswithprovider: "anthropic".
"Ask GPT-4.1 via Copilot to explain what a monad is." → Calls
querywithmodel: "github-copilot/gpt-4.1".
"Use Claude Sonnet to review this diff and suggest improvements." → Calls
querywithmodel: "anthropic/claude-sonnet-4-6"and the diff as the prompt.
"Get a second opinion from Gemini on this architecture decision." → Calls
querywithmodel: "google-vertex/gemini-2.5-pro".
"What's the default model being used?" →
github-copilot/gpt-4.1— shown in thequerytool description.
Development
Project structure
```
mcp-opencode/
├── src/
│   ├── index.ts              # Server entry point + all tool handlers
│   └── __tests__/
│       └── tools.test.ts     # Unit tests
├── dist/                     # Compiled output (generated)
├── package.json
├── tsconfig.json
└── vitest.config.ts
```

Scripts
| Script | Description |
| --------------------- | --------------------------------------- |
| npm run build | Compile TypeScript to dist/ |
| npm run build:watch | Watch mode |
| npm run dev | Run directly via tsx (no build needed) |
| npm test | Run tests |
| npm run test:watch | Watch mode tests |
| npm run coverage | Test coverage report |
| npm run inspect | Open MCP Inspector against built server |
| npm run inspect:dev | Open MCP Inspector via tsx |
| npm run typecheck | Type-check without emitting |
| npm run clean | Remove dist/ |
Dev workflow
```
git clone https://github.com/kud/mcp-opencode.git
cd mcp-opencode
npm install
npm run build
npm test
```

Use the local .mcp.json to connect Claude Code directly to your dev build:
```
# Already present in the repo root:
cat .mcp.json
# { "mcpServers": { "opencode": { "command": "node", "args": ["./dist/index.js"] } } }
```

MCP Inspector
```
npm run inspect
```

Opens the interactive inspector at http://localhost:5173 — useful for testing tools manually without a full AI client.
How it works
This MCP server acts as a bridge between your AI assistant and opencode:
- On first use, it checks whether the opencode HTTP server is running on 127.0.0.1:4096
- If not, it spawns `opencode serve` in the background and waits for it to be ready
- Each `query` call creates a temporary opencode session, sends the prompt, waits for the response, then deletes the session
- Authentication is handled entirely by opencode — configure your providers once in opencode and this MCP inherits them automatically
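
The startup check above can be sketched roughly as follows. This is an illustration of the flow, not the package's actual source; the port (4096) and the `opencode serve` command come from this README, while the timeout and polling interval are arbitrary choices:

```typescript
import { spawn } from "node:child_process";
import { setTimeout as sleep } from "node:timers/promises";

const OPENCODE_URL = "http://127.0.0.1:4096";

// True if something answers HTTP on the opencode port.
export async function isServerUp(baseUrl: string = OPENCODE_URL): Promise<boolean> {
  try {
    await fetch(baseUrl, { signal: AbortSignal.timeout(1_000) });
    return true;
  } catch {
    return false; // connection refused or timed out
  }
}

// Spawn `opencode serve` in the background, then poll until it responds.
export async function ensureServer(timeoutMs = 15_000): Promise<void> {
  if (await isServerUp()) return;
  const child = spawn("opencode", ["serve"], { detached: true, stdio: "ignore" });
  child.unref(); // let the MCP process exit without killing the server
  const deadline = Date.now() + timeoutMs;
  while (Date.now() < deadline) {
    if (await isServerUp()) return;
    await sleep(250);
  }
  throw new Error("opencode server did not become ready in time");
}
```

Detaching and unref-ing the child is what lets the opencode server persist across individual MCP tool calls.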
Troubleshooting
Server not showing in the MCP list
- Ensure opencode is installed: `which opencode`
- Check Node.js version: `node --version` (must be ≥ 20)
- Try running manually: `npx @kud/mcp-opencode`
"failed to create session" error
- Make sure opencode is installed and at least one provider is configured
- Run `opencode models` to verify your setup
"The requested model is not supported" error
- The model ID exists in opencode's registry but isn't supported by the provider's API
- Use `list_models` and pick a working model — e.g. `github-copilot/gpt-4.1` instead of `github-copilot/claude-sonnet-4`
Model not in the list
- The model list reflects your opencode configuration, not the full marketplace
- Run `opencode models` in your terminal to see the same list
MCP Inspector logs
```
npm run inspect
```

Security best practices
- No credentials are stored in or passed through this MCP server
- All authentication is delegated to opencode's own config
- Use `OPENCODE_MODEL_ALLOW` to restrict access to specific providers if needed
- Never commit `.mcp.json` or `.claude/` (both are gitignored)
Tech Stack
| | |
| ----------------- | -------------------------------- |
| Runtime | Node.js ≥ 20 |
| Language | TypeScript 5.x (ESM) |
| Protocol | Model Context Protocol (MCP) 1.0 |
| opencode SDK | @opencode-ai/sdk |
| Schema | Zod |
| Tests | Vitest |
| Module System | ESM ("type": "module") |
Contributing
- Fork the repo
- Create a branch: `git checkout -b feat/my-change`
- Make your changes and add tests
- Run `npm run build && npm test`
- Open a pull request
License
MIT — see LICENSE.
Acknowledgments
Built on top of opencode by the SST team and the Model Context Protocol by Anthropic.
Made with ❤️ for opencode users
⭐ Star this repo if it's useful to you
