@h1deya/mcp-client-cli
v0.4.2
# Simple MCP Client to Explore MCP Servers

Quickly test and explore MCP servers from the command line!
A simple, text-based CLI client for Model Context Protocol (MCP) servers built with LangChain and TypeScript.
This tool automatically adjusts the schema for LLM compatibility, which can help some failing MCP servers run successfully.
Suitable for testing MCP servers, exploring their capabilities, and prototyping integrations.
Internally, it uses a LangChain agent and the utility function `convertMcpToLangchainTools()` from
`@h1deya/langchain-mcp-tools`.
This function performs the aforementioned MCP tools schema transformations for LLM compatibility.
See this page for details.
A Python equivalent of this utility is also available.
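To give a feel for the kind of adjustment involved, here is a purely illustrative sketch (not the library's actual code) of one plausible transformation: recursively dropping JSON Schema keywords that a given LLM provider rejects. The function name and the banned-keyword set are made up for illustration.

```typescript
// Illustrative sketch only: recursively remove JSON Schema keywords
// that a hypothetical provider does not accept.
function stripKeywords(schema: unknown, banned: Set<string>): unknown {
  if (Array.isArray(schema)) {
    return schema.map((item) => stripKeywords(item, banned));
  }
  if (schema !== null && typeof schema === "object") {
    const result: Record<string, unknown> = {};
    for (const [key, value] of Object.entries(schema as Record<string, unknown>)) {
      if (!banned.has(key)) {
        result[key] = stripKeywords(value, banned); // keep and recurse
      }
    }
    return result;
  }
  return schema; // primitives pass through unchanged
}
```

The real transformations are provider-specific; see the linked page for what the library actually does.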
## Prerequisites

- Node.js 18+
- [optional] `uv` (`uvx`) installed to run Python-based local (stdio) MCP servers
- LLM API key(s) from OpenAI, Anthropic, Google AI Studio (for GenAI/Gemini), xAI, Cerebras, and/or Groq, as needed
## Quick Start

1. Install the `mcp-client-cli` tool. This can take up to a few minutes to complete:

   ```bash
   npm install -g @h1deya/mcp-client-cli
   ```

2. Configure the LLM and MCP server settings via the configuration file, `llm_mcp_config.json5`:

   ```bash
   code llm_mcp_config.json5
   ```

   The following is a simple configuration for quick testing:

   ```json5
   {
     "llm": {
       "provider": "openai", "model": "gpt-5-mini"
       // "provider": "anthropic", "model": "claude-haiku-4-5"
       // "provider": "google_genai", "model": "gemini-2.5-flash"
       // "provider": "xai", "model": "grok-4-1-fast-non-reasoning"
       // "provider": "cerebras", "model": "gpt-oss-120b"
       // "provider": "groq", "model": "openai/gpt-oss-20b"
     },

     "mcp_servers": {
       "us-weather": {  // US weather only
         "command": "npx",
         "args": ["-y", "@h1deya/mcp-server-weather"]
       },
     },

     "example_queries": [
       "Tell me how LLMs work in a few sentences",
       "Are there any weather alerts in California?",
     ],
   }
   ```

3. Set up the API keys in a `.env` file:

   ```bash
   code .env
   ```

   ```
   ANTHROPIC_API_KEY=sk-ant-...
   OPENAI_API_KEY=sk-proj-...
   GOOGLE_API_KEY=AI...
   XAI_API_KEY=xai-...
   CEREBRAS_API_KEY=csk-...
   GROQ_API_KEY=gsk_...
   ```

4. Run the tool:

   ```bash
   mcp-client-cli
   ```

   By default, it reads the configuration file, `llm_mcp_config.json5`, from the current directory.
   Then, it applies the environment variables specified in the `.env` file, as well as the ones that are already defined.
## Features

- Easy setup: Works out of the box with popular MCP servers
- Flexible configuration: JSON5 config with environment variable support
- Multiple LLM providers: OpenAI, Anthropic, Google (GenAI), xAI, Cerebras, and Groq
- Schema compatibility support: Automatically adjusts the tools schema for LLM compatibility, which can help some failing MCP servers run successfully.
  See this page for details.
  To disable the schema transformations, add `"schema_transformations": false,` to the top level of the config file.
- Command & URL servers: Support for both local and remote MCP servers.
  Use `mcp-remote` to connect to remote servers that require OAuth (see the end of the configuration example below).
- Real-time logging: Live stdio MCP server logs with a customizable log directory
- Interactive testing: Example queries for the convenience of repeated testing
## Limitations

- Tool return types: Currently, only text results of tool calls are supported.
  Internally, it uses LangChain's `response_format: 'content'` (the default), which only supports text strings.
  While MCP tools can return multiple content types (text, images, etc.), this library currently filters and uses only text content.
- MCP features: Only MCP Tools are supported. Other MCP features such as Resources, Prompts, and Sampling are not implemented.
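The text-only filtering described above could look roughly like the following sketch; the `ContentPart` shape and the function name are illustrative, not the library's actual types.

```typescript
// Illustrative shape of an MCP tool-result content part (simplified).
interface ContentPart {
  type: string;
  text?: string;
}

// Keep only the text parts of a tool result and join them into one string.
function extractText(content: ContentPart[]): string {
  return content
    .filter(
      (part): part is ContentPart & { text: string } =>
        part.type === "text" && typeof part.text === "string"
    )
    .map((part) => part.text)
    .join("\n");
}
```

Non-text parts (images, resources, etc.) are simply dropped, which matches the limitation described above.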
## Usage

### Basic Usage

```bash
mcp-client-cli
```

By default, it reads the configuration file, `llm_mcp_config.json5`, from the current directory.
Then, it applies the environment variables specified in the `.env` file,
as well as the ones that are already defined.
It outputs local MCP server logs to the current directory.
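Assuming dotenv-style semantics (variables already defined in the environment take precedence over values from the `.env` file), the merge could be sketched as follows; `mergeEnv` is a hypothetical name, not part of the tool's API.

```typescript
// Sketch of dotenv-style precedence: values already present in the
// environment win over values parsed from the .env file.
function mergeEnv(
  fromDotenvFile: Record<string, string>,
  alreadyDefined: Record<string, string>
): Record<string, string> {
  // Spread order makes already-defined variables overwrite .env values.
  return { ...fromDotenvFile, ...alreadyDefined };
}
```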
With Options
# Specify the config file to use
mcp-client-cli --config my-config.json5
# Store local (stdio) MCP server logs in specific directory
mcp-client-cli --log-dir ./logs
# Enable verbose logging
mcp-client-cli --verbose
# Show help
mcp-client-cli --helpSupported LLM Providers
- OpenAI:
gpt-5-mini,gpt-5.2, etc. - Anthropic:
claude-haiku-4-5,claude-3-5-haiku-latest, etc. - Google (GenAI):
gemini-2.5-flash,gemini-3-flash-preview, etc. - xAI:
grok-3-mini,grok-4-1-fast-non-reasoning, etc. - Cerebras:
gpt-oss-120b, etc. - Groq:
openai/gpt-oss-20b,openai/gpt-oss-120b, etc.
## Configuration

Create a `llm_mcp_config.json5` file:

- The configuration file format for MCP servers follows the same structure as Claude for Desktop,
  with one difference: the key name `mcpServers` has been changed to `mcp_servers`
  to follow the snake_case convention commonly used in JSON configuration files.
- The file format is JSON5, where comments and trailing commas are allowed.
- The format is further extended to replace `${...}` notations with the values of the corresponding environment variables.
- Keep all credentials and private info in the `.env` file and refer to them with the `${...}` notation as needed.
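The `${...}` expansion could be implemented along these lines; `expandEnvVars` is a hypothetical name for illustration, not part of the tool's API.

```typescript
// Sketch: replace ${VAR_NAME} placeholders in a config value with the
// corresponding environment variable, leaving unknown placeholders intact.
function expandEnvVars(
  value: string,
  env: Record<string, string | undefined> = process.env
): string {
  return value.replace(
    /\$\{([A-Za-z_][A-Za-z0-9_]*)\}/g,
    (placeholder, name) => env[name] ?? placeholder
  );
}
```

For example, with `BRAVE_API_KEY=BSA...` in the environment, a config value like `"${BRAVE_API_KEY}"` would expand to the key itself.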
```json5
{
  "llm": {
    "provider": "openai", "model": "gpt-5-mini"
    // "provider": "anthropic", "model": "claude-haiku-4-5"
    // "provider": "google_genai", "model": "gemini-2.5-flash"
    // "provider": "xai", "model": "grok-4-1-fast-non-reasoning"
    // "provider": "cerebras", "model": "gpt-oss-120b"
    // "provider": "groq", "model": "openai/gpt-oss-20b"
  },

  // To disable the automatic schema transformations, uncomment the following line.
  // See this for details about the schema transformations:
  // https://github.com/hideya/langchain-mcp-tools-ts/blob/main/README.md#llm-provider-schema-compatibility
  //
  // "schema_transformations": false,

  "example_queries": [
    "Read and briefly summarize the LICENSE file in the current directory",
    "Fetch the raw HTML content from bbc.com and tell me the title",
    // "Search for 'news in California' and show the first hit",
    // "Tell me about my default GitHub profile",
    // "Tell me about my default Notion account",
  ],

  "mcp_servers": {
    // Local MCP server that uses `npx`
    // https://www.npmjs.com/package/@modelcontextprotocol/server-filesystem
    "filesystem": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-filesystem",
        "." // path to a directory to allow access to
      ]
    },

    // Local MCP server that uses `uvx`
    // https://pypi.org/project/mcp-server-fetch/
    "fetch": {
      "command": "uvx",
      "args": [
        "mcp-server-fetch"
      ]
    },

    // Embedding the value of an environment variable
    // https://www.npmjs.com/package/@modelcontextprotocol/server-brave-search
    "brave-search": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-brave-search"
      ],
      "env": {
        "BRAVE_API_KEY": "${BRAVE_API_KEY}"
      }
    },

    // Example of remote MCP server authentication via Authorization header
    // https://github.com/github/github-mcp-server?tab=readme-ov-file#remote-github-mcp-server
    "github": {
      // To avoid auto protocol fallback, specify the protocol explicitly when using authentication
      "type": "http",
      "url": "https://api.githubcopilot.com/mcp/",
      "headers": {
        "Authorization": "Bearer ${GITHUB_PERSONAL_ACCESS_TOKEN}"
      }
    },

    // For remote MCP servers that require OAuth, consider using "mcp-remote"
    "notion": {
      "command": "npx",
      "args": ["-y", "mcp-remote", "https://mcp.notion.com/mcp"],
    },
  }
}
```

## Environment Variables
Create a `.env` file for API keys:

```bash
OPENAI_API_KEY=sk-proj-...
ANTHROPIC_API_KEY=sk-ant-...
GOOGLE_API_KEY=AI...
XAI_API_KEY=xai-...
CEREBRAS_API_KEY=csk-...
GROQ_API_KEY=gsk_...

# Other services as needed
GITHUB_PERSONAL_ACCESS_TOKEN=github_pat_...
BRAVE_API_KEY=BSA...
```

## Popular MCP Servers to Try
There are quite a few useful MCP servers already available.
## Troubleshooting

- Make sure your configuration and `.env` files are correct, especially the spelling of the API keys
- Check the local MCP server logs
- Use the `--verbose` flag to view detailed logs
- Refer to the Debugging section in the MCP documentation
## Building from Source

See README_DEV.md for details.

## Change Log

Can be found here.

## License

MIT License - see the LICENSE file for details.

## Contributing

Issues and pull requests welcome! This tool aims to make MCP server testing as simple as possible.
