
@igor-olikh/openspec-mcp-server

v2.2.0

An MCP server that connects OpenSpec to AI assistants like Codex, Claude, and Cursor.


OpenSpec MCP Server (AI Assistant Plugin)

Welcome! This is a simple bridge (plugin) that connects OpenSpec to your favorite AI coding assistant (like IBM Bob, Codex, or Claude Desktop).

What is this and why do I need it?

When you want your AI to build a new feature, you usually just type it into the chat. But as projects grow, the AI can forget things, get confused, or write messy code.

OpenSpec is a system that solves this. It forces the AI to create a clear "specification" (a plan) before it writes any code. It organizes your plan into neat folders (proposal, design, tasks) so you can review it.

However, your AI doesn't automatically know how to use OpenSpec. That is what this server does! It gives your AI the "tools" it needs to automatically create these folders, list tasks, and mark them as complete as it writes code for you.
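Concretely, the assistant discovers these tools through the standard MCP `tools/list` request once the server is connected. A simplified sketch of what an excerpt of that response could look like, using the tool names this package provides (the real response also includes full input schemas):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "result": {
    "tools": [
      { "name": "openspec_init", "description": "Starts OpenSpec in a project." },
      { "name": "openspec_status", "description": "Checks how much of the feature is done." },
      { "name": "openspec_validate", "description": "Checks if the code matches the plan." }
    ]
  }
}
```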


How to Connect Your AI

To use this, you need to tell your AI assistant where this server is located. The setup steps depend on which AI assistant you use.

Option 1: Connecting to IBM Bob

If you are using IBM Bob:

  1. In the IBM Bob IDE, click the three dots next to the gear icon in the upper right corner of the chat window and select MCP servers.
  2. Click Open next to "Global MCPs" to edit your settings file (usually saved at ~/.bob/settings/mcp_settings.json).
  3. Add the openspec server to the mcpServers object:
{
  "mcpServers": {
    "openspec": {
      "command": "npx",
      "args": [
        "-y",
        "@igor-olikh/openspec-mcp-server"
      ]
    }
  }
}
  4. Save the file and restart IBM Bob!

Option 2: Connecting to Codex

Codex has a built-in user interface to easily add these plugins. You can add it quickly by running this terminal command:

codex mcp add openspec-server npx -y @igor-olikh/openspec-mcp-server

Or, manually through the Codex User Interface:

  1. Open the "Connect to a custom MCP" box in Codex.
  2. Name: openspec
  3. Mode: Leave as STDIO
  4. Command to launch: npx
  5. Arguments: Click + Add argument twice and paste exactly:
    • First argument: -y
    • Second argument: @igor-olikh/openspec-mcp-server
  6. Working directory: Leave this blank! (This allows Codex to dynamically use OpenSpec inside whichever project you currently have open).
  7. Save it!
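If you prefer editing configuration by hand, Codex can also read MCP servers from its config file (typically `~/.codex/config.toml`). The equivalent entry would look roughly like this; the file location and table name are assumptions that may vary between Codex versions, so verify against your installation:

```toml
# Hypothetical hand-written equivalent of the UI steps above
[mcp_servers.openspec]
command = "npx"
args = ["-y", "@igor-olikh/openspec-mcp-server"]
```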

Option 3: Connecting to Claude Desktop App

If you prefer using the Claude Desktop application:

  1. Open your Claude configuration file (usually located at ~/Library/Application Support/Claude/claude_desktop_config.json on Mac).
  2. Add the openspec server to it:
{
  "mcpServers": {
    "openspec": {
      "command": "npx",
      "args": [
        "-y",
        "@igor-olikh/openspec-mcp-server"
      ]
    }
  }
}
  3. Save the file and restart Claude Desktop.

How do I use it?

Once connected, you don't need to do anything technical. You just talk to your AI like normal, but ask it to use OpenSpec!

Example Chat Prompts:

  • "Hey Bob, I want to add a dark mode feature to this application. Please use OpenSpec to propose and validate it."
  • "What is the OpenSpec status of our current project?"
  • "List all the OpenSpec changes we are currently working on."

The AI will automatically use the tools below to handle the rest!
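Under the hood, each of those requests becomes an MCP `tools/call` message over JSON-RPC. For example, asking for the project status might translate into something like the following (the exact argument schema is defined by the server; an empty `arguments` object here is an illustrative assumption):

```json
{
  "jsonrpc": "2.0",
  "id": 7,
  "method": "tools/call",
  "params": {
    "name": "openspec_status",
    "arguments": {}
  }
}
```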


For Developers (Under the Hood)

This server exposes the official @fission-ai/openspec CLI commands as Model Context Protocol (MCP) JSON-RPC tools.

Available AI Tools:

  • openspec_init: Starts OpenSpec in a project.
  • openspec_new_change: Creates a folder for a new feature proposal.
  • openspec_status: Checks how much of the feature is done.
  • openspec_validate: Checks if the code matches the plan.
  • openspec_archive: Marks the feature as 100% completed.
  • openspec_list: Shows all current tasks.
  • openspec_show: Reads a specific task.
  • openspec_update: Updates OpenSpec rules.
  • openspec_instructions: Reads AI instructions for building parts of the plan.
  • openspec_read_file: Reads any spec artifact directly by name and file type. Much faster than show, with no subprocess overhead.
  • openspec_refresh_cache: Force-refreshes the cached directory listing if files changed outside OpenSpec tools.

Built-in Prompts:

  • openspec_kickoff: A pre-made prompt that steers the AI into a strict spec-driven workflow from the first turn. Automatically injected when supported by the AI assistant.

Development Setup

If you want to modify this server's code:

  1. npm install (Installs dependencies)
  2. npm run build (Compiles the code)
  3. npm run start (Runs the server over standard input/output for testing)
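Because the server communicates over stdio using JSON-RPC, you can sanity-check a local build by sending it an initialize request by hand. A minimal request body per the MCP specification looks like this (the `clientInfo` values are arbitrary placeholders):

```json
{
  "jsonrpc": "2.0",
  "id": 0,
  "method": "initialize",
  "params": {
    "protocolVersion": "2024-11-05",
    "clientInfo": { "name": "manual-test", "version": "0.0.0" },
    "capabilities": {}
  }
}
```

A conforming server responds with its own `serverInfo` and capabilities, after which `tools/list` and `tools/call` requests become available.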

Recent Changes

  • Built-in MCP Prompts: The openspec_kickoff prompt automatically steers your AI into a strict spec-driven workflow from the first turn.
  • Direct File Readers with In-Memory Cache: New openspec_read_file tool reads spec artifacts directly via the filesystem, bypassing CLI subprocess overhead. An in-memory cache of the directory structure serves list queries in under 1ms. Cache auto-refreshes after any mutating operation.

Upcoming Features (Planned via OpenSpec)

  • Structured JSON Outputs: Replacing raw terminal output with parsed JavaScript objects, preventing the AI from misreading states and reducing hallucinations.
  • Smart Error Handling: Coaching the LLM when OpenSpec validations fail (e.g. intercepting terminal errors to output: "Hey, you forgot the 'Tasks' header in design.md").