
@weibaohui/mcp2cli v0.4.1

CLI tool for interacting with MCP (Model Context Protocol) Servers

mcp2cli

A CLI-native MCP client that loads tool schemas on-demand and resolves discover-then-call into a single invocation — keeping tool definitions out of your context window.

One bash command → tool discovery + schema resolution + precise invocation. No persistent tool definitions, no token waste.


Why mcp2cli?

When an LLM uses MCP tools directly, every call carries heavy overhead:

  • Tool discovery: list tools → read schemas → understand parameters → construct call → parse response
  • Each MCP tool definition (schema, descriptions, types) stays in context, consuming tokens for the entire conversation
  • A single "search GitHub and summarize" task can burn thousands of tokens just on protocol overhead

mcp2cli compresses that into one bash command:

# Direct MCP: 3 rounds, ~2000+ tokens of context
1. list_tools(server)          → get all tool schemas
2. get_tool_details(server, tool) → read parameter definitions
3. call_tool(server, tool, args)  → get result

# mcp2cli: 1 bash call, ~200 tokens
mcp openDeepWiki get_repo_structure repoOwner=github repoName=vscode

Result: 80-90% fewer tokens per tool interaction.

Quick Start

# Install
npm install -g @weibaohui/mcp2cli

# List servers
mcp

# Explore tools on a server
mcp openDeepWiki

# View tool details + param examples
mcp openDeepWiki get_repo_structure

# Call a tool
mcp openDeepWiki get_repo_structure repoOwner=github repoName=vscode

How It Saves Tokens

Before: Direct MCP (verbose)

Every MCP interaction requires the LLM to maintain tool schemas in context:

// Tool schema alone = ~500 tokens, stays in EVERY message
{
  "name": "get_repo_structure",
  "description": "Retrieves the complete file and directory structure of a Git repository...",
  "inputSchema": {
    "type": "object",
    "properties": {
      "repoOwner": {"type": "string", "description": "..."},
      "repoName": {"type": "string", "description": "..."},
      ...
    },
    "required": ["repoOwner", "repoName"]
  }
}

Multiply this by every tool on every server — a server with 20 tools = ~10,000 tokens permanently in context.

After: mcp2cli (lean)

The LLM only needs a short bash command. No schemas, no tool definitions, no protocol overhead:

# Token cost: just the command string (~30 tokens)
mcp openDeepWiki get_repo_structure repoOwner=github repoName=vscode
// Output is concise JSON (~100 tokens)
{
  "success": true,
  "data": { "server": "openDeepWiki", "method": "get_repo_structure", "result": "..." },
  "meta": { "timestamp": "2026-03-28T10:00:00Z", "version": "v0.3.0" }
}

Token Comparison

| Scenario | Direct MCP | mcp2cli | Saving |
|----------|-----------|---------|--------|
| Discover 1 tool | ~500 tokens (schema in context) | ~100 tokens (one bash call) | 80% |
| Call 1 tool | ~300 tokens (schema + call overhead) | ~130 tokens (command + output) | 57% |
| 10-tool server in context | ~10,000 tokens (persistent) | 0 tokens (loaded on demand) | 100% |
| Full workflow (discover + call) | ~2,000 tokens | ~230 tokens | 89% |
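
The "Saving" column follows directly from the README's own token estimates (these are ballpark figures, not measurements). A quick sanity check of the arithmetic:

```python
# Rough arithmetic behind the "Saving" column, using the table's
# estimated token counts (illustrative, not measured values).
def saving(direct: int, cli: int) -> float:
    """Percent reduction when replacing direct MCP with one CLI call."""
    return (direct - cli) / direct * 100

print(saving(500, 100))   # discover 1 tool  -> 80.0
print(saving(300, 130))   # call 1 tool      -> ~56.7, table rounds to 57
print(saving(2000, 230))  # full workflow    -> 88.5, table rounds to 89
```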

Usage in Claude Code / Cursor / Windsurf

Add to your project's MCP config (e.g. .mcp/config.json or ~/.config/mcp/config.json):

{
  "mcpServers": {
    "openDeepWiki": {
      "url": "https://opendeepwiki.k8m.site/mcp/streamable"
    }
  }
}
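
The config is plain JSON keyed by server name, so any client can read it. A minimal sketch of loading this shape and listing the configured servers (illustrative only; mcp2cli's own loader is written in Go):

```python
import json

# Parse an MCP config of the shape shown above and list server names.
config_text = '''
{
  "mcpServers": {
    "openDeepWiki": {
      "url": "https://opendeepwiki.k8m.site/mcp/streamable"
    }
  }
}
'''

config = json.loads(config_text)
servers = sorted(config["mcpServers"])
print(servers)  # ['openDeepWiki']
for name, entry in config["mcpServers"].items():
    print(f"{name}: {entry['url']}")
```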

Then in your AI tool, just run bash commands:

# The LLM can explore and call tools in one step
$ mcp openDeepWiki list_repositories limit=3

# No need to load tool schemas — just call
$ mcp openDeepWiki get_repo_structure repoOwner=weibaohui repoName=mcp2cli

Argument Format

Simple Arguments (key=value)

# Simple key=value (string by default)
mcp server tool name=John age=30

# Typed key:type=value (for precision)
mcp server tool name:string=John age:number=30 enabled:bool=true

Supported types: string, number, int, float, bool
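
The `key[:type]=value` convention can be sketched as a small coercion table. This is an illustration of the documented behavior, not mcp2cli's actual parser (which is in Go), and it ignores edge cases like `:` inside key names:

```python
# Sketch of key[:type]=value coercion as described above.
CASTS = {
    "string": str,
    "number": float,
    "int": int,
    "float": float,
    "bool": lambda v: v.lower() in ("true", "1", "yes"),
}

def parse_arg(token: str):
    key, value = token.split("=", 1)  # split only on the first '='
    if ":" in key:
        key, type_name = key.split(":", 1)
        return key, CASTS[type_name](value)
    return key, value  # untyped values stay strings

args = dict(parse_arg(t) for t in
            ["name:string=John", "age:number=30", "enabled:bool=true", "city=NYC"])
print(args)  # {'name': 'John', 'age': 30.0, 'enabled': True, 'city': 'NYC'}
```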

YAML Input (for complex parameters)

For complex parameters (objects, arrays, multi-line text), use YAML format:

# Inline YAML with --yaml or -y
mcp server tool --yaml 'name: John details: {age: 30, city: NYC}'
mcp server tool -y 'tags: [dev, ops] enabled: true'

# Multi-line YAML
mcp server tool --yaml 'limit: 10
offset: 5
status: "ready"'

# Read from file (like kubectl apply -f)
mcp server tool -f params.yaml

# Pipe YAML to stdin
cat params.yaml | mcp server tool

Priority: -f > --yaml/-y > stdin pipe > key=value
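
That precedence order means only one input source is consulted per call. A hypothetical helper showing the documented ordering (not mcp2cli's Go implementation):

```python
# Sketch of the documented precedence: -f > --yaml/-y > stdin > key=value.
def pick_source(file_arg=None, yaml_arg=None, stdin_data=None, kv_pairs=None):
    if file_arg is not None:
        return ("file", file_arg)
    if yaml_arg is not None:
        return ("yaml", yaml_arg)
    if stdin_data is not None:
        return ("stdin", stdin_data)
    return ("kv", kv_pairs or {})

# -f wins even when inline YAML is also given:
print(pick_source(file_arg="params.yaml", yaml_arg="limit: 10"))
# ('file', 'params.yaml')
```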

Output Format

Control output format with --output / -o:

# Pretty JSON (default, human-readable with indentation)
mcp server tool

# YAML output
mcp --output yaml server tool

# Compact JSON (no indentation, good for piping)
mcp -o compact server tool

| Format | Description | Use Case |
|--------|-------------|----------|
| pretty | Pretty-printed JSON (default) | Default, debugging |
| compact | Compact JSON (no indentation) | Piping, scripts |
| yaml | YAML format | Config files, readability |
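
The pretty/compact distinction is standard JSON serialization. A sketch of what the two flags produce for the same payload (an illustration of the semantics, not mcp2cli's formatter):

```python
import json

# Same payload, serialized the "pretty" way and the "compact" way.
payload = {"success": True, "data": {"server": "openDeepWiki"}}

pretty = json.dumps(payload, indent=2)                # human-readable
compact = json.dumps(payload, separators=(",", ":"))  # one line, no spaces

print(pretty)
print(compact)  # {"success":true,"data":{"server":"openDeepWiki"}}
```

Compact output is what you want when piping to tools like jq, since it is one line per document and parses identically.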

Installation

npm (recommended)

npm install -g @weibaohui/mcp2cli

Supports Linux, macOS, and Windows on amd64/arm64. The mcp command is available immediately after install.

Binary download

Download from GitHub Releases:

# macOS / Linux
mv mcp2cli-darwin-arm64 mcp && chmod +x mcp && sudo mv mcp /usr/local/bin/

# Windows
ren mcp2cli-windows-amd64.exe mcp.exe

Command Reference

| Command | Description |
|---------|-------------|
| mcp | List configured servers |
| mcp <server> | List tools on a server |
| mcp <server> <tool> | Show tool details + param examples |
| mcp <server> <tool> key=value ... | Call a tool |

Global Flags

| Flag | Short | Description | Default |
|------|-------|-------------|---------|
| --output | -o | Output format (pretty / compact / yaml) | pretty |
| --yaml | -y | YAML parameters (inline) | |
| --file | -f | YAML file with parameters | |
| --stream | -s | Enable streaming output | false |

License

MIT License - see LICENSE for details.