@kud/mcp-opencode

v1.0.0

Published

MCP server for opencode — query github-copilot models via a persistent opencode server.

Readme

███╗   ███╗ ██████╗██████╗      ██████╗ ██████╗ ███████╗███╗   ██╗ ██████╗ ██████╗ ██████╗ ███████╗
████╗ ████║██╔════╝██╔══██╗    ██╔═══██╗██╔══██╗██╔════╝████╗  ██║██╔════╝██╔═══██╗██╔══██╗██╔════╝
██╔████╔██║██║     ██████╔╝    ██║   ██║██████╔╝█████╗  ██╔██╗ ██║██║     ██║   ██║██║  ██║█████╗
██║╚██╔╝██║██║     ██╔═══╝     ██║   ██║██╔═══╝ ██╔══╝  ██║╚██╗██║██║     ██║   ██║██║  ██║██╔══╝
██║ ╚═╝ ██║╚██████╗██║         ╚██████╔╝██║     ███████╗██║ ╚████║╚██████╗╚██████╔╝██████╔╝███████╗
╚═╝     ╚═╝ ╚═════╝╚═╝          ╚═════╝ ╚═╝     ╚══════╝╚═╝  ╚═══╝ ╚═════╝ ╚═════╝ ╚═════╝ ╚══════╝

TypeScript · Node.js · MCP · npm · License: MIT

Query any opencode model from your AI assistant — no API key required.

Features · Quick Start · Installation · Tools · Development


Features

  • 🤖 Query any model — send prompts to anthropic, github-copilot, google-vertex, and more
  • 🔍 Discover models — list all models configured in your opencode installation, optionally filtered by provider
  • 🚀 Zero config auth — no API tokens; delegates authentication entirely to opencode
  • Auto-start — spins up the opencode server automatically if it isn't running
  • 🛡️ Allow/block filters — restrict which models are accessible via OPENCODE_MODEL_ALLOW / OPENCODE_MODEL_BLOCK
  • 🔌 Works with any opencode provider — anthropic, github-copilot, google-vertex, and any others you've configured

Quick Start

Prerequisites

  • opencode installed and at least one provider configured
  • Node.js ≥ 20

Install

npm install -g @kud/mcp-opencode

Minimal Claude Code config

opencode:
  transport: stdio
  command: npx
  args:
    - -y
    - "@kud/mcp-opencode"

Installation

Add to ~/.claude/claude_code_config.yml (or your profile config):

opencode:
  transport: stdio
  command: npx
  args:
    - -y
    - "@kud/mcp-opencode"


Edit ~/Library/Application Support/Claude/claude_desktop_config.json:

{
  "mcpServers": {
    "opencode": {
      "command": "npx",
      "args": ["-y", "@kud/mcp-opencode"]
    }
  }
}

Restart Claude Desktop.

Edit %APPDATA%\Claude\claude_desktop_config.json:

{
  "mcpServers": {
    "opencode": {
      "command": "npx",
      "args": ["-y", "@kud/mcp-opencode"]
    }
  }
}

Restart Claude Desktop.

In Cursor settings → MCP → Add server:

{
  "opencode": {
    "command": "npx",
    "args": ["-y", "@kud/mcp-opencode"]
  }
}

Edit ~/.codeium/windsurf/mcp_config.json:

{
  "mcpServers": {
    "opencode": {
      "command": "npx",
      "args": ["-y", "@kud/mcp-opencode"]
    }
  }
}

Edit .vscode/mcp.json in your workspace:

{
  "servers": {
    "opencode": {
      "type": "stdio",
      "command": "npx",
      "args": ["-y", "@kud/mcp-opencode"]
    }
  }
}

Available Tools

Querying

| Tool | Description |
| ------- | ---------------------------------------------------------- |
| query | Send a prompt to an opencode model and return the response |

Discovery

| Tool | Description |
| ------------- | ----------------------------------------------------------- |
| list_models | List all available models; pass a provider name to filter |

Total: 2 Tools
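
As an illustration, an MCP tools/call request for the query tool might look like the following. The argument names (model, prompt) and values here are assumptions inferred from the tool descriptions above, not a verbatim schema from this package:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "query",
    "arguments": {
      "model": "github-copilot/gpt-4.1",
      "prompt": "Explain what a monad is in two sentences."
    }
  }
}
```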


Model Filtering

Restrict which models are accessible using environment variables:

| Variable | Description | Example |
| ---------------------- | ----------------------------------------------------------- | ------------------------------ |
| OPENCODE_MODEL_ALLOW | Comma-separated allowlist (supports provider/* wildcards) | github-copilot/*,anthropic/* |
| OPENCODE_MODEL_BLOCK | Comma-separated blocklist | anthropic/claude-opus-4-6 |

Both exact model IDs (anthropic/claude-sonnet-4-6) and provider wildcards (github-copilot/*) are supported. If OPENCODE_MODEL_ALLOW is unset, all models are allowed.
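
The allow/block semantics above can be sketched as a small predicate. This is a hypothetical re-implementation for illustration (function and helper names are invented, not the package's actual code), assuming block takes precedence over allow:

```typescript
// Sketch of the allow/block model filter described above.
// Assumptions: comma-separated lists, provider/* wildcards, exact IDs,
// block takes precedence over allow, unset allowlist means allow-all.

function parseList(value: string | undefined): string[] {
  return (value ?? "")
    .split(",")
    .map((entry) => entry.trim())
    .filter((entry) => entry.length > 0);
}

function matches(pattern: string, modelId: string): boolean {
  // "github-copilot/*" matches any model under that provider.
  if (pattern.endsWith("/*")) {
    return modelId.startsWith(pattern.slice(0, -1)); // keeps the trailing "/"
  }
  return pattern === modelId; // exact ID, e.g. "anthropic/claude-sonnet-4-6"
}

export function isModelAllowed(
  modelId: string,
  allow = process.env.OPENCODE_MODEL_ALLOW,
  block = process.env.OPENCODE_MODEL_BLOCK,
): boolean {
  const blockList = parseList(block);
  if (blockList.some((pattern) => matches(pattern, modelId))) return false;

  const allowList = parseList(allow);
  if (allowList.length === 0) return true; // unset allowlist: everything allowed
  return allowList.some((pattern) => matches(pattern, modelId));
}
```

With the example config below, github-copilot/gpt-4.1 passes the allowlist while anthropic/claude-opus-4-6 is rejected by the blocklist.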

Example config with filtering:

{
  "mcpServers": {
    "opencode": {
      "command": "npx",
      "args": ["-y", "@kud/mcp-opencode"],
      "env": {
        "OPENCODE_MODEL_ALLOW": "github-copilot/*,anthropic/*",
        "OPENCODE_MODEL_BLOCK": "anthropic/claude-opus-4-6"
      }
    }
  }
}

Example Conversations

"What models do I have access to?" → Calls list_models, returns all configured models grouped by provider.

"List only my Anthropic models." → Calls list_models with provider: "anthropic".

"Ask GPT-4.1 via Copilot to explain what a monad is." → Calls query with model: "github-copilot/gpt-4.1".

"Use Claude Sonnet to review this diff and suggest improvements." → Calls query with model: "anthropic/claude-sonnet-4-6" and the diff as the prompt.

"Get a second opinion from Gemini on this architecture decision." → Calls query with model: "google-vertex/gemini-2.5-pro".

"What's the default model being used?" → github-copilot/gpt-4.1 — shown in the query tool description.


Development

Project structure

mcp-opencode/
├── src/
│   ├── index.ts              # Server entry point + all tool handlers
│   └── __tests__/
│       └── tools.test.ts     # Unit tests
├── dist/                     # Compiled output (generated)
├── package.json
├── tsconfig.json
└── vitest.config.ts

Scripts

| Script | Description |
| --------------------- | ---------------------------------------- |
| npm run build | Compile TypeScript to dist/ |
| npm run build:watch | Watch mode |
| npm run dev | Run directly via tsx (no build needed) |
| npm test | Run tests |
| npm run test:watch | Watch mode tests |
| npm run coverage | Test coverage report |
| npm run inspect | Open MCP Inspector against built server |
| npm run inspect:dev | Open MCP Inspector via tsx |
| npm run typecheck | Type-check without emitting |
| npm run clean | Remove dist/ |

Dev workflow

git clone https://github.com/kud/mcp-opencode.git
cd mcp-opencode
npm install
npm run build
npm test

Use the local .mcp.json to connect Claude Code directly to your dev build:

# Already present in the repo root:
cat .mcp.json
# { "mcpServers": { "opencode": { "command": "node", "args": ["./dist/index.js"] } } }

MCP Inspector

npm run inspect

Opens the interactive inspector at http://localhost:5173 — useful for testing tools manually without a full AI client.


How it works

This MCP server acts as a bridge between your AI assistant and opencode:

  1. On first use, it checks whether the opencode HTTP server is running on 127.0.0.1:4096
  2. If not, it spawns opencode serve in the background and waits for it to be ready
  3. Each query call creates a temporary opencode session, sends the prompt, waits for the response, then deletes the session
  4. Authentication is handled entirely by opencode — configure your providers once in opencode and this MCP inherits them automatically
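
The auto-start in step 2 reduces to polling the server until it responds. Below is a minimal sketch of such a readiness loop; the probe is injected so the sketch is independent of opencode's actual endpoints, and the retry count and delay are assumptions, not values taken from this package:

```typescript
// Sketch of step 2 above: retry a probe (in practice, an HTTP health check
// against 127.0.0.1:4096) until it succeeds or the retry budget runs out.

type Probe = () => Promise<boolean>;

async function waitForReady(
  probe: Probe,
  { retries = 20, delayMs = 250 }: { retries?: number; delayMs?: number } = {},
): Promise<boolean> {
  for (let attempt = 0; attempt < retries; attempt++) {
    if (await probe()) return true; // server answered: ready
    await new Promise((resolve) => setTimeout(resolve, delayMs));
  }
  return false; // gave up: caller should surface an error to the client
}
```

In the real server, spawning opencode serve would happen before entering a loop like this, and a probe failure after all retries would be reported back to the MCP client.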

Troubleshooting

Server not showing in the MCP list

  • Ensure opencode is installed: which opencode
  • Check Node.js version: node --version (must be ≥ 20)
  • Try running manually: npx @kud/mcp-opencode

"failed to create session" error

  • Make sure opencode is installed and at least one provider is configured
  • Run opencode models to verify your setup

"The requested model is not supported" error

  • The model ID exists in opencode's registry but isn't supported by the provider's API
  • Use list_models and pick a working model — e.g. github-copilot/gpt-4.1 instead of github-copilot/claude-sonnet-4

Model not in the list

  • The model list reflects your opencode configuration, not the full marketplace
  • Run opencode models in your terminal to see the same list

MCP Inspector logs

npm run inspect

Security best practices

  • No credentials are stored in or passed through this MCP server
  • All authentication is delegated to opencode's own config
  • Use OPENCODE_MODEL_ALLOW to restrict access to specific providers if needed
  • Never commit .mcp.json or .claude/ (both are gitignored)

Tech Stack

| | |
| ----------------- | -------------------------------- |
| Runtime | Node.js ≥ 20 |
| Language | TypeScript 5.x (ESM) |
| Protocol | Model Context Protocol (MCP) 1.0 |
| opencode SDK | @opencode-ai/sdk |
| Schema | Zod |
| Tests | Vitest |
| Module System | ESM ("type": "module") |


Contributing

  1. Fork the repo
  2. Create a branch: git checkout -b feat/my-change
  3. Make your changes and add tests
  4. Run npm run build && npm test
  5. Open a pull request

License

MIT — see LICENSE.


Acknowledgments

Built on top of opencode by the SST team and the Model Context Protocol by Anthropic.


Made with ❤️ for opencode users

⭐ Star this repo if it's useful to you