# @nayagamez/codex-cli-mcp
Bridge OpenAI Codex CLI to any MCP client
## Overview
An MCP (Model Context Protocol) server that wraps OpenAI Codex CLI as tools. It enables MCP clients like Claude Desktop, Cursor, and Windsurf to run Codex CLI sessions in headless mode.
## Prerequisites

### 1. Install Codex CLI
Install Codex CLI (docs) and make sure it is available in your PATH:
```shell
# npm
npm install -g @openai/codex

# Homebrew (macOS)
brew install --cask codex
```

Or download the binary from GitHub Releases.
### 2. Authenticate

**Option A: ChatGPT Login (Recommended)**
Run `codex` and select "Sign in with ChatGPT". Requires a Plus, Pro, Team, Edu, or Enterprise plan.
**Option B: API Key**
For headless / CI environments:

```shell
export OPENAI_API_KEY="your-api-key"
```

See the Codex Authentication docs for more details.
## Tools
See Codex Models for available models.
### codex
Start a new Codex CLI session.
| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `prompt` | string | Yes | The prompt to send to Codex |
| `model` | string | No | Model name override |
| `effort` | enum | No | Reasoning effort: `medium`, `high`, or `xhigh` (auto-selected by task complexity when omitted) |
| `sandbox` | enum | No | `read-only`, `workspace-write`, or `danger-full-access` |
| `cwd` | string | No | Working directory for the session |
| `profile` | string | No | Configuration profile from `config.toml` |
| `config` | object | No | Config overrides as key-value pairs |
| `timeout` | number | No | Idle timeout in ms (default: 600000 = 10 min) |
### codex-reply
Continue an existing Codex CLI session.
| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `prompt` | string | Yes | The follow-up prompt |
| `threadId` | string | Yes | Thread ID from a previous `codex` call |
| `model` | string | No | Model name override |
| `effort` | enum | No | Reasoning effort: `medium`, `high`, or `xhigh` (auto-selected by task complexity when omitted) |
| `config` | object | No | Config overrides as key-value pairs |
| `timeout` | number | No | Idle timeout in ms (default: 600000 = 10 min) |
## Setup

### For Humans
Copy the prompt below and paste it into your LLM agent — it will install and configure everything automatically:
```
Install and configure @nayagamez/codex-cli-mcp by following: https://raw.githubusercontent.com/nayagamez/codex-cli-mcp/main/docs/guide/installation.md
```

Or set it up manually (see Manual Setup below).
### For LLM Agents

```shell
curl -s https://raw.githubusercontent.com/nayagamez/codex-cli-mcp/main/docs/guide/installation.md
```

### Manual Setup
Add to your `claude_desktop_config.json`:
```json
{
  "mcpServers": {
    "codex-cli-mcp": {
      "command": "npx",
      "args": ["-y", "@nayagamez/codex-cli-mcp"]
    }
  }
}
```

Add to your MCP settings:
```json
{
  "mcpServers": {
    "codex-cli-mcp": {
      "command": "npx",
      "args": ["-y", "@nayagamez/codex-cli-mcp"]
    }
  }
}
```

Or register it with the `claude` CLI:

```shell
claude mcp add codex-cli-mcp -- npx -y @nayagamez/codex-cli-mcp
```

## Progress Notifications
The server sends MCP progress notifications in real time as Codex processes your request, so MCP clients can tell that the server is alive and working rather than hung.
Progress messages include:

- `[5s] Session started (thread: ...)`: session initialized
- `[12s] Command executed: npm test`: a command was run
- `[18s] Message: Refactoring the auth module...`: agent reasoning
- `[25s] Turn completed`: turn finished
## Idle-based Timeout
The timeout is idle-based, not absolute. The timer resets every time the server receives an event from Codex. This means long-running tasks with continuous activity never time out, while truly stuck processes are killed after the configured idle period.
- Default idle timeout: 10 minutes
- Override per call via the `timeout` parameter, or globally via `CODEX_TIMEOUT_MS`
## Environment Variables
| Variable | Default | Description |
|----------|---------|-------------|
| `CODEX_CLI_PATH` | `codex` | Path to the Codex CLI binary |
| `CODEX_TIMEOUT_MS` | `600000` (10 min) | Idle timeout for the Codex process |
| `CODEX_MCP_DEBUG` | (unset) | Set to enable debug logging to stderr |
## How It Works
```
MCP Client → Tool Call (codex / codex-reply)
           → Spawn `codex exec --json --full-auto` as subprocess
           → Stream JSONL events from stdout
           → Send progress notifications back to client
           → Return formatted results when done
```

1. The MCP client sends a tool call (`codex` or `codex-reply`)
2. The server spawns Codex CLI with the `--json` and `--full-auto` flags
3. The prompt is passed via stdin
4. JSONL events are streamed and parsed in real time
5. A progress notification is sent to the client on each event (and the idle timer resets)
6. Results (messages, commands, errors, token usage) are formatted as markdown and returned
## License
MIT
