# @dici1435/spec-review-mcp

v0.1.7

LLM-powered spec compliance review MCP server for dici-spec projects. Reviews code against spec documents and constraints, returning structured findings with category, severity, file location, and description.
## Installation

```bash
npm install -g @dici1435/spec-review-mcp

# or run directly
npx @dici1435/spec-review-mcp
```

Add it to a project via `dici-spec add spec-review`, which prompts for your LLM API key.
## Tools
| Tool | Description |
|---|---|
| review_code | Review code files against spec documents for compliance. Accepts spec paths, code paths, and optional constraints. Returns structured findings. |
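A call to the tool might look like the following sketch. The argument names (`specPaths`, `codePaths`, `constraints`) are illustrative placeholders, not taken from the server's actual input schema:

```typescript
// Hypothetical shape of a review_code tool invocation, as an MCP
// client might send it. Parameter names are illustrative only.
const reviewRequest = {
  name: "review_code",
  arguments: {
    specPaths: ["agents/specs/auth.md"],             // spec documents to review against
    codePaths: ["src/auth/login.ts"],                // code files under review
    constraints: ["agents/constraints/security.md"], // optional constraint files
  },
};
```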
## Finding Structure
Each finding includes:
| Field | Values |
|---|---|
| category | spec-compliance, missing-implementation, deviation, architecture, correctness |
| severity | blocker, major, minor, nit |
| file | Path to the file with the finding |
| line | Line number (when applicable) |
| description | Detailed explanation of the finding |
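The table above can be expressed as a TypeScript type. This is a sketch for illustration; the package does not export these types:

```typescript
// Finding shape as described in the table above (illustrative, not
// an exported type from the package).
type FindingCategory =
  | "spec-compliance"
  | "missing-implementation"
  | "deviation"
  | "architecture"
  | "correctness";

type FindingSeverity = "blocker" | "major" | "minor" | "nit";

interface Finding {
  category: FindingCategory;
  severity: FindingSeverity;
  file: string;        // path to the file with the finding
  line?: number;       // line number, when applicable
  description: string; // detailed explanation of the finding
}

// A hypothetical example finding:
const example: Finding = {
  category: "deviation",
  severity: "major",
  file: "src/auth/login.ts",
  line: 42,
  description: "Login flow skips the rate-limit check required by the auth spec.",
};
```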
## Environment Variables
| Variable | Required | Default | Description |
|---|---|---|---|
| LLM_API_KEY | Yes | — | API key for the LLM provider (OpenAI or Anthropic) |
| LLM_BASE_URL | No | https://api.openai.com/v1 | Base URL for the LLM API. Set to https://api.anthropic.com/v1 to use Anthropic. |
| LLM_MODEL | No | gpt-4o (OpenAI) / claude-3-5-sonnet-20241022 (Anthropic) | Model to use for review |
| PROJECT_ROOT | No | process.cwd() | Root directory of the project |
| SPECS_DIR | No | agents/specs | Directory containing spec files (relative to PROJECT_ROOT) |
| CONSTRAINTS_DIR | No | agents/constraints | Directory containing constraint files (relative to PROJECT_ROOT) |
The provider is auto-detected from LLM_BASE_URL — no separate provider flag needed. If the URL contains anthropic.com, the Anthropic Messages API is used. Everything else uses the OpenAI Chat Completions API.
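The detection rule is simple enough to express directly. A minimal sketch, with a hypothetical function name:

```typescript
// Sketch of the auto-detection rule described above: a URL containing
// anthropic.com selects the Anthropic Messages API; anything else
// falls back to the OpenAI Chat Completions API.
function detectProvider(baseUrl: string): "anthropic" | "openai" {
  return baseUrl.includes("anthropic.com") ? "anthropic" : "openai";
}
```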
## Provider Configuration

### OpenAI (default)
```json
"dici-spec-review": {
  "command": "npx",
  "args": ["@dici1435/spec-review-mcp"],
  "env": {
    "LLM_API_KEY": "sk-...",
    "LLM_MODEL": "gpt-4o"
  }
}
```

### Anthropic
```json
"dici-spec-review": {
  "command": "npx",
  "args": ["@dici1435/spec-review-mcp"],
  "env": {
    "LLM_API_KEY": "sk-ant-...",
    "LLM_BASE_URL": "https://api.anthropic.com/v1",
    "LLM_MODEL": "claude-sonnet-4-5"
  }
}
```

The provider is detected automatically from LLM_BASE_URL. When using Anthropic, the server uses the /v1/messages endpoint with x-api-key authentication and passes the system prompt as Anthropic's top-level system parameter.
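To make the Anthropic request shape concrete, here is a sketch of the request the text describes: the /v1/messages endpoint, x-api-key authentication, and a top-level `system` field. Variable names and the prompt strings are illustrative, not the server's actual code:

```typescript
// Illustrative Anthropic Messages API request shape. The endpoint,
// x-api-key header, and top-level `system` field match the behavior
// described above; the prompt content is a placeholder.
const baseUrl = process.env.LLM_BASE_URL ?? "https://api.anthropic.com/v1";

const request = {
  url: `${baseUrl}/messages`,
  headers: {
    "x-api-key": process.env.LLM_API_KEY ?? "sk-ant-...",
    "anthropic-version": "2023-06-01",
    "content-type": "application/json",
  },
  body: {
    model: process.env.LLM_MODEL ?? "claude-sonnet-4-5",
    max_tokens: 4096,
    system: "You are a spec compliance reviewer...", // top-level system prompt
    messages: [{ role: "user", content: "<spec + code + constraint context>" }],
  },
};
```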
### OpenAI-compatible providers (OpenRouter, Azure, etc.)

Any provider with an OpenAI-compatible /chat/completions endpoint works by setting LLM_BASE_URL:

```json
"LLM_BASE_URL": "https://openrouter.ai/api/v1",
"LLM_MODEL": "anthropic/claude-sonnet-4-5"
```

## How It Works
- Loads spec documents from SPECS_DIR and constraints from CONSTRAINTS_DIR
- Reads the specified code files
- Assembles a review prompt with system instructions, spec content, code content, and constraint context
- Sends the prompt to the configured LLM (auto-detecting OpenAI vs. Anthropic from LLM_BASE_URL)
- Parses the response into structured findings
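The prompt-assembly step could be sketched as follows. The function and the section layout are hypothetical, not the server's actual implementation:

```typescript
// Illustrative sketch of the prompt-assembly step: spec, code, and
// constraint files are concatenated into titled sections. Names and
// formatting are hypothetical.
function buildReviewPrompt(
  specs: Record<string, string>,       // spec filename -> content
  code: Record<string, string>,        // code filename -> content
  constraints: Record<string, string>, // constraint filename -> content
): string {
  const section = (title: string, files: Record<string, string>) =>
    Object.entries(files)
      .map(([name, body]) => `## ${title}: ${name}\n${body}`)
      .join("\n\n");

  return [
    section("Spec", specs),
    section("Code", code),
    section("Constraint", constraints),
  ].join("\n\n");
}
```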
## Transport
Stdio (stdin/stdout). Designed to be launched by AI agent IDEs as a child process.
## License
MIT
