ai-cli-mcp
v2.9.0
MCP server for AI CLI tools (Claude, Codex, and Gemini) with background process management
AI CLI MCP Server
📦 Package Migration Notice: This package was formerly `@mkxultra/claude-code-mcp` and has been renamed to `ai-cli-mcp` to reflect its expanded support for multiple AI CLI tools.
An MCP (Model Context Protocol) server that allows running AI CLI tools (Claude, Codex, and Gemini) in background processes with automatic permission handling.
Have you noticed that Cursor sometimes struggles with complex, multi-step edits or operations? This server, with its unified `run` tool, lets multiple AI agents handle your coding tasks more effectively.
Overview
This MCP server provides tools that can be used by LLMs to interact with AI CLI tools. When integrated with MCP clients, it allows LLMs to:
- Run Claude CLI with all permissions bypassed (using `--dangerously-skip-permissions`)
- Execute Codex CLI with automatic approval mode (using `--full-auto`)
- Execute Gemini CLI with automatic approval mode (using `-y`)
- Support multiple AI models: Claude (`sonnet`, `sonnet[1m]`, `opus`, `opusplan`, `haiku`), Codex (`gpt-5.3-codex`, `gpt-5.2-codex`, `gpt-5.1-codex-mini`, `gpt-5.1-codex-max`, `gpt-5.2`, `gpt-5.1`, `gpt-5.1-codex`, `gpt-5-codex`, `gpt-5-codex-mini`, `gpt-5`), and Gemini (`gemini-2.5-pro`, `gemini-2.5-flash`, `gemini-3.1-pro-preview`, `gemini-3-pro-preview`, `gemini-3-flash-preview`)
- Manage background processes with PID tracking
- Parse and return structured outputs from all three tools
Usage Example (Advanced Parallel Processing)
You can instruct your main agent to run multiple tasks in parallel like this:
Launch agents for the following 3 tasks using acm mcp run:
- Refactor `src/backend` code using `sonnet`
- Create unit tests for `src/frontend` using `gpt-5.2-codex`
- Update docs in `docs/` using `gemini-2.5-pro`
While they run, please update the TODO list. Once done, use the `wait` tool to wait for all completions and report the results together.
Usage Example (Context Caching & Sharing)
You can reuse heavy context (like large codebases) using session IDs to save costs while running multiple tasks.
- First, use `acm mcp run` with `opus` to read all files in `src/` and understand the project structure.
- Use the `wait` tool to wait for completion and retrieve the `session_id` from the result.
- Using that `session_id`, run the following two tasks in parallel with `acm mcp run`:
  - Create refactoring proposals for `src/utils` using `sonnet`
  - Add architecture documentation to `README.md` using `gpt-5.2-codex`
- Finally, `wait` again to combine both results.
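In raw MCP tool-call terms, the flow above might look like the following sketch. The `model` argument name, the project path, and the returned PID are assumptions for illustration; the actual `session_id` comes from the `wait` result:

```jsonc
// Step 1: warm up the context (run returns a PID, e.g. 1234)
{ "name": "run", "arguments": { "prompt": "Read all files in src/ and summarize the project structure.", "workFolder": "/abs/path/to/project", "model": "opus" } }

// Step 2: wait for completion and read session_id from the result
{ "name": "wait", "arguments": { "pids": [1234] } }

// Step 3: launch a follow-up task that reuses the cached context
{ "name": "run", "arguments": { "prompt": "Create refactoring proposals for src/utils.", "workFolder": "/abs/path/to/project", "model": "sonnet", "session_id": "<session_id from step 2>" } }
```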
Benefits
- True Async Multitasking: Agent execution happens in the background, returning control immediately. The calling AI can proceed with the next task or invoke another agent without waiting for completion.
- CLI in CLI (Agent in Agent): Directly invoke powerful CLI tools like Claude Code or Codex from any MCP-supported IDE or CLI. This enables broader, more complex system operations and automation beyond host environment limitations.
- Freedom from Model/Provider Constraints: Freely select and combine the "strongest" or "most cost-effective" models from Claude, Codex (GPT), and Gemini without being tied to a specific ecosystem.
Prerequisites
The only prerequisite is that the AI CLI tools you want to use are locally installed and correctly configured.
- Claude Code: `claude doctor` passes, and execution with `--dangerously-skip-permissions` is approved (you must run it manually once to log in and accept the terms).
- Codex CLI (Optional): Installed and initial setup (login, etc.) completed.
- Gemini CLI (Optional): Installed and initial setup (login, etc.) completed.
Installation & Usage
The recommended way to use this server is to run it via npx.
Using npx in your MCP configuration:
"ai-cli-mcp": {
"command": "npx",
"args": [
"-y",
"ai-cli-mcp@latest"
]
},
Using the Claude CLI mcp add command:
claude mcp add ai-cli '{"name":"ai-cli","command":"npx","args":["-y","ai-cli-mcp@latest"]}'
Important First-Time Setup
For Claude CLI:
Before the MCP server can use Claude, you must first run the Claude CLI manually once with the --dangerously-skip-permissions flag, log in, and accept the terms.
npm install -g @anthropic-ai/claude-code
claude --dangerously-skip-permissions
Follow the prompts to accept. Once this is done, the MCP server will be able to use the flag non-interactively.
For Codex CLI:
For Codex, ensure you're logged in and have accepted any necessary terms:
codex login
For Gemini CLI:
For Gemini, ensure you're logged in and have configured your credentials:
gemini auth login
macOS might ask for folder permissions the first time any of these tools run. If the first run fails, subsequent runs should work.
Connecting to Your MCP Client
After setting up the server, add the configuration to your MCP client's settings file (e.g., mcp.json for Cursor, mcp_config.json for Windsurf).
If the file doesn't exist, create it and add the ai-cli-mcp configuration.
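For Cursor-style clients, the ai-cli-mcp entry goes under a top-level mcpServers key; a minimal complete mcp.json might look like this sketch (the wrapper key name follows Cursor's convention and may differ in other clients):

```json
{
  "mcpServers": {
    "ai-cli-mcp": {
      "command": "npx",
      "args": ["-y", "ai-cli-mcp@latest"]
    }
  }
}
```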
Tools Provided
This server exposes the following tools:
run
Executes a prompt using Claude CLI, Codex CLI, or Gemini CLI. The appropriate CLI is automatically selected based on the model name.
Arguments:
- `prompt` (string, optional): The prompt to send to the AI agent. Either `prompt` or `prompt_file` is required.
- `prompt_file` (string, optional): Path to a file containing the prompt. Either `prompt` or `prompt_file` is required. Can be an absolute path or relative to `workFolder`.
- `workFolder` (string, required): The working directory for the CLI execution. Must be an absolute path.
Models:
- Ultra Aliases: `claude-ultra`, `codex-ultra` (defaults to high reasoning), `gemini-ultra`
- Claude: `sonnet`, `sonnet[1m]`, `opus`, `opusplan`, `haiku`
- Codex: `gpt-5.3-codex`, `gpt-5.2-codex`, `gpt-5.1-codex-mini`, `gpt-5.1-codex-max`, `gpt-5.2`, `gpt-5.1`, `gpt-5`
- Gemini: `gemini-2.5-pro`, `gemini-2.5-flash`, `gemini-3.1-pro-preview`, `gemini-3-pro-preview`, `gemini-3-flash-preview`
- `reasoning_effort` (string, optional): Codex only. Sets `model_reasoning_effort` (allowed: "low", "medium", "high", "xhigh").
- `session_id` (string, optional): Optional session ID to resume a previous session. Supported for: `haiku`, `sonnet`, `opus`, `gemini-2.5-pro`, `gemini-2.5-flash`, `gemini-3.1-pro-preview`, `gemini-3-pro-preview`, `gemini-3-flash-preview`.
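For example, a single run invocation as it would appear in an MCP tools/call request; the `model` argument name is inferred from the model list above, and the prompt and path are illustrative:

```json
{
  "name": "run",
  "arguments": {
    "prompt": "Add unit tests for the date helpers in src/utils.",
    "workFolder": "/abs/path/to/project",
    "model": "gpt-5.2-codex",
    "reasoning_effort": "high"
  }
}
```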
wait
Waits for multiple AI agent processes to complete and returns their combined results. Blocks until all specified PIDs finish or a timeout occurs.
Arguments:
- `pids` (array of numbers, required): List of process IDs to wait for (returned by the `run` tool).
- `timeout` (number, optional): Maximum wait time in seconds. Defaults to 180 (3 minutes).
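For example, if three earlier `run` calls returned PIDs 101, 102, and 103 (illustrative values), one wait call blocks for up to 10 minutes and returns their combined results:

```json
{
  "name": "wait",
  "arguments": {
    "pids": [101, 102, 103],
    "timeout": 600
  }
}
```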
list_processes
Lists all running and completed AI agent processes with their status, PID, and basic info.
get_result
Gets the current output and status of an AI agent process by PID.
Arguments:
- `pid` (number, required): The process ID returned by the `run` tool.
kill_process
Terminates a running AI agent process by PID.
Arguments:
- `pid` (number, required): The process ID to terminate.
Troubleshooting
- "Command not found" (ai-cli-mcp): If installed globally, ensure the npm global bin directory is in your system's PATH. If using `npx`, ensure `npx` itself is working.
- "Command not found" (`claude` or `~/.claude/local/claude`): Ensure the Claude CLI is installed correctly. Run `claude doctor` or check its documentation.
- Permissions Issues: Make sure you've completed the "Important First-Time Setup" steps.
- JSON Errors from Server: If `MCP_CLAUDE_DEBUG` is `true`, error messages or logs might interfere with MCP's JSON parsing. Set it to `false` for normal operation.
- ESM/Import Errors: Ensure you are using Node.js v20 or later.
Contributing
For development setup, testing, and contribution guidelines, see the Development Guide.
Advanced Configuration (Optional)
Normally not required, but useful for customizing CLI paths or debugging.
- `CLAUDE_CLI_NAME`: Override the Claude CLI binary name or provide an absolute path (default: `claude`)
- `CODEX_CLI_NAME`: Override the Codex CLI binary name or provide an absolute path (default: `codex`)
- `GEMINI_CLI_NAME`: Override the Gemini CLI binary name or provide an absolute path (default: `gemini`)
- `MCP_CLAUDE_DEBUG`: Enable debug logging (set to `true` for verbose output)
CLI Name Specification:
- Command name only: `CLAUDE_CLI_NAME=claude-custom`
- Absolute path: `CLAUDE_CLI_NAME=/path/to/custom/claude`
Relative paths are not supported.
Example with custom CLI binaries:
"ai-cli-mcp": {
"command": "npx",
"args": [
"-y",
"ai-cli-mcp@latest"
],
"env": {
"CLAUDE_CLI_NAME": "claude-custom",
"CODEX_CLI_NAME": "codex-custom"
}
},
License
MIT
