# @dr-yaml/mcp — Dr. Yaml MCP server (v0.0.2)
Read-only MCP server that exposes Dr. Yaml's pipeline intelligence
(load, describe, simulate, lint, migrate) to AI coding
assistants over stdio. Thin wrapper over
@dr-yaml/core — every tool invokes the same
APIs the web app and CLI already ship on top of.
## Tools
| Tool | Input | Returns |
| -------------------- | -------------------------------------------------------------------------------------------- | ------------------------------------------------------------------------------- |
| load_pipeline | { path? } \| { url? } \| { content? } (exactly one) | { platform, workflows: [{id, name, jobCount}], warnings } |
| describe_workflow | { path, workflowId? } | { markdown, summary } — Markdown + PipelineSummary |
| simulate | { path, workflowId?, event?, branch?, availableSecrets? } | per-job { id, status, reason, durationMin, costUsd, … } + warnings + rollups |
| lint | { path } | { issues: LintIssue[], errorCount, warningCount, infoCount } |
| migrate | { path, targetPlatform: "gha" \| "circleci" \| "bitrise" } | { yaml, lossReport, unsupportedFeatures, reparseWarnings } (deterministic) |
All tools are read-only. Nothing writes to disk or calls out to the network (except `load_pipeline` with a `url`). LLM-assisted migration (Unit 18b) is intentionally not exposed here — it requires per-suggestion user confirmation and lives in the web UI / CLI.
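On the wire, each tool invocation is a standard MCP `tools/call` JSON-RPC request. A minimal sketch of building one for `simulate` follows; the argument keys come from the table above, while the request id and the `toolsCall` helper are illustrative, not part of the package API:

```typescript
// Shape of a JSON-RPC 2.0 request as MCP sends it over stdio.
type JsonRpcRequest = {
  jsonrpc: "2.0";
  id: number;
  method: string;
  params: { name: string; arguments: Record<string, unknown> };
};

// Hypothetical helper: wraps a tool name and its arguments in the envelope.
function toolsCall(id: number, name: string, args: Record<string, unknown>): JsonRpcRequest {
  return { jsonrpc: "2.0", id, method: "tools/call", params: { name, arguments: args } };
}

const req = toolsCall(1, "simulate", {
  path: "/repo/.github/workflows/ci.yml",
  event: "push",
  branch: "main",
});

// The stdio transport sends one JSON message per line.
console.log(JSON.stringify(req));
```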
## Resources
| URI template | Returns |
| ------------------------ | -------------------------------------------------- |
| pipeline://<abs-path> | Full Universal IR JSON for a local YAML config. |
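A resource is fetched with a standard MCP `resources/read` request. A sketch, assuming the absolute path is appended directly after the `pipeline://` scheme (the `pipelineUri` helper is illustrative):

```typescript
// Hypothetical helper: builds the pipeline:// URI from an absolute local path.
function pipelineUri(absPath: string): string {
  if (!absPath.startsWith("/")) throw new Error("expected an absolute path");
  return `pipeline://${absPath}`;
}

const read = {
  jsonrpc: "2.0",
  id: 2,
  method: "resources/read",
  params: { uri: pipelineUri("/repo/.github/workflows/ci.yml") },
};

// The result would carry the Universal IR JSON described in the table above.
console.log(JSON.stringify(read));
```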
## Transport
stdio only. No HTTP, no SSE — every MCP client that ships today (Claude Desktop, Cursor, Continue, the Dr. Yaml CLI) already speaks stdio, and stdio sidesteps the auth and port-management surface an HTTP transport would require.
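The stdio transport frames traffic as newline-delimited JSON-RPC: one message per line, with no embedded newlines inside a message. A minimal client-side reassembly sketch (illustrative, not Dr. Yaml code) showing how complete messages are recovered from arbitrary stream chunks:

```typescript
// Reassemble newline-delimited JSON messages from arbitrary stream chunks.
function parseFrames(chunks: string[]): object[] {
  const out: object[] = [];
  let buf = "";
  for (const chunk of chunks) {
    buf += chunk;
    let nl: number;
    // A frame is complete only once its terminating newline has arrived.
    while ((nl = buf.indexOf("\n")) !== -1) {
      const line = buf.slice(0, nl).trim();
      buf = buf.slice(nl + 1);
      if (line) out.push(JSON.parse(line));
    }
  }
  return out;
}

// Two messages split across three chunks still parse into exactly two frames.
const msgs = parseFrames([
  '{"jsonrpc":"2.0","id":1,',
  '"result":{}}\n{"jsonrpc"',
  ':"2.0","id":2,"result":{}}\n',
]);
console.log(msgs.length);
```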
## Install

```sh
npm install -g @dr-yaml/mcp
# or run on demand without installing:
npx -y @dr-yaml/mcp
```

The binary is `dr-yaml-mcp`; the @dr-yaml/mcp package is the only thing you need — its internal @dr-yaml/core / @dr-yaml/ir / @dr-yaml/simulator / @dr-yaml/adapter-* deps come in automatically.
## Claude Desktop

Add to `~/Library/Application Support/Claude/claude_desktop_config.json`
(macOS) or `%APPDATA%\Claude\claude_desktop_config.json` (Windows):

```json
{
  "mcpServers": {
    "dr-yaml": {
      "command": "npx",
      "args": ["-y", "@dr-yaml/mcp"]
    }
  }
}
```

Restart Claude Desktop. The five tools should appear in the tool picker; Claude can now call e.g. `simulate` on a workflow file path it was given in conversation.
## Cursor

Open Cursor Settings → Features → Model Context Protocol, click **+ Add new MCP server**, and provide:

- Name: `dr-yaml`
- Type: `stdio`
- Command: `npx`
- Arguments: `-y @dr-yaml/mcp`

Or as JSON (`~/.cursor/mcp.json`):

```json
{
  "mcpServers": {
    "dr-yaml": {
      "command": "npx",
      "args": ["-y", "@dr-yaml/mcp"]
    }
  }
}
```

Reload Cursor; the tools appear under the `dr-yaml` namespace.
## Continue (VS Code / JetBrains)

In `~/.continue/config.json`:

```json
{
  "experimental": {
    "modelContextProtocolServers": [
      {
        "transport": {
          "type": "stdio",
          "command": "npx",
          "args": ["-y", "@dr-yaml/mcp"]
        }
      }
    ]
  }
}
```

## Running from source (contributors)
```sh
bun install
# From the repo root:
bun run apps/mcp/src/index.ts
# Or via the CLI wrapper (identical behavior):
bun run apps/cli/src/index.ts mcp
```

Once installed globally via `bun link`, the binary is `dr-yaml-mcp`.
## Example exchange

```
Agent: I'll check what this workflow will do on a push to main.

→ simulate({ path: "/repo/.github/workflows/ci.yml", event: "push", branch: "main" })
← { jobs: [{ id: "build", status: "will-run", durationMin: 3.2, costUsd: 0.025 }, …],
    warnings: [], totalDurationMin: 3.2, totalCostUsd: 0.025 }

Agent: The `build` job runs unconditionally on push to main (3.2 min, ≈$0.025).
       No warnings, no skipped jobs. Anything else you want to check?
```

## Verification
The full stdio contract is gated by `scripts/gate/unit-19.sh`, which spawns the server, walks it through `initialize` → `tools/list` → `tools/call` for every tool → `resources/read` → shutdown, and asserts the shape of each response. Run it with `bash scripts/gate/unit-19.sh` from the repo root.
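For orientation, that walk amounts to a request sequence like the sketch below. The method names are standard MCP; the ids, params, and protocol version string are illustrative assumptions, not copied from the gate script:

```typescript
const tools = ["load_pipeline", "describe_workflow", "simulate", "lint", "migrate"];
let id = 0;
const sequence = [
  { jsonrpc: "2.0", id: ++id, method: "initialize",
    params: { protocolVersion: "2024-11-05", capabilities: {}, clientInfo: { name: "gate", version: "0" } } },
  { jsonrpc: "2.0", id: ++id, method: "tools/list", params: {} },
  // One tools/call per tool; a real walk would pass the args from the Tools table.
  ...tools.map((name) => ({ jsonrpc: "2.0", id: ++id, method: "tools/call",
    params: { name, arguments: {} } })),
  { jsonrpc: "2.0", id: ++id, method: "resources/read",
    params: { uri: "pipeline:///tmp/ci.yml" } },
];

// Shutdown is not an RPC: an stdio client simply closes the server's stdin.
console.log(sequence.map((m) => m.method).join(" -> "));
```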
