
@nayagamez/codex-cli-mcp

v0.4.1

MCP server wrapping Codex CLI (codex exec) as tools

Downloads: 1,127

Bridge OpenAI Codex CLI to any MCP client

English | 한국어

npm · GitHub · Issues



Overview

An MCP (Model Context Protocol) server that wraps OpenAI Codex CLI as tools. It enables MCP clients like Claude Desktop, Cursor, and Windsurf to run Codex CLI sessions in headless mode.

Prerequisites

1. Install Codex CLI

Install Codex CLI (docs) and make sure it is available in your PATH:

# npm
npm install -g @openai/codex

# Homebrew (macOS)
brew install --cask codex

Or download the binary from GitHub Releases.

2. Authenticate

Option A — ChatGPT Login (Recommended)

Run codex and select "Sign in with ChatGPT". Requires a Plus, Pro, Team, Edu, or Enterprise plan.

Option B — API Key

For headless / CI environments:

export OPENAI_API_KEY="your-api-key"

See the Codex Authentication docs for more details.

Tools

See Codex Models for available models.

codex

Start a new Codex CLI session.

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| prompt | string | Yes | The prompt to send to Codex |
| model | string | No | Model name override |
| effort | enum | No | Reasoning effort: medium, high, xhigh (auto-selected by task complexity) |
| sandbox | enum | No | read-only, workspace-write, or danger-full-access |
| cwd | string | No | Working directory for the session |
| profile | string | No | Configuration profile from config.toml |
| config | object | No | Config overrides as key-value pairs |
| timeout | number | No | Timeout in ms (default: 600000 = 10 min) |
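For illustration, a codex tool call from an MCP client might carry arguments like the following (all values here are hypothetical, including the model name):

```json
{
  "prompt": "Add unit tests for the date-parsing helpers",
  "effort": "high",
  "sandbox": "workspace-write",
  "cwd": "/path/to/project",
  "timeout": 300000
}
```

Only prompt is required; everything else falls back to the Codex CLI defaults or your config.toml profile.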

codex-reply

Continue an existing Codex CLI session.

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| prompt | string | Yes | The follow-up prompt |
| threadId | string | Yes | Thread ID from a previous codex call |
| model | string | No | Model name override |
| effort | enum | No | Reasoning effort: medium, high, xhigh (auto-selected by task complexity) |
| config | object | No | Config overrides as key-value pairs |
| timeout | number | No | Timeout in ms (default: 600000 = 10 min) |
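A follow-up call reuses the thread ID returned by the initial codex call; a sketch (the threadId value is hypothetical):

```json
{
  "prompt": "Now run the tests and fix any failures",
  "threadId": "thread_abc123"
}
```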

Setup

For Humans

Copy the prompt below and paste it into your LLM agent — it will install and configure everything automatically:

Install and configure @nayagamez/codex-cli-mcp by following: https://raw.githubusercontent.com/nayagamez/codex-cli-mcp/main/docs/guide/installation.md

Or set it up manually — see Manual Setup below.

For LLM Agents

curl -s https://raw.githubusercontent.com/nayagamez/codex-cli-mcp/main/docs/guide/installation.md

Manual Setup

Add to your claude_desktop_config.json:

{
  "mcpServers": {
    "codex-cli-mcp": {
      "command": "npx",
      "args": ["-y", "@nayagamez/codex-cli-mcp"]
    }
  }
}

Add to your MCP settings:

{
  "mcpServers": {
    "codex-cli-mcp": {
      "command": "npx",
      "args": ["-y", "@nayagamez/codex-cli-mcp"]
    }
  }
}
Or register it with the Claude Code CLI:

claude mcp add codex-cli-mcp -- npx -y @nayagamez/codex-cli-mcp

Progress Notifications

The server sends MCP progress notifications in real time as Codex processes your request, letting MCP clients know the server is alive and working rather than hung.

Progress messages include:

  • [5s] Session started (thread: ...) — session initialized
  • [12s] Command executed: npm test — a command was run
  • [18s] Message: Refactoring the auth module... — agent reasoning
  • [25s] Turn completed — turn finished
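On the wire these are standard MCP notifications/progress messages; one for the second bullet might look roughly like this (the token and field values are illustrative, not this server's exact payload):

```json
{
  "jsonrpc": "2.0",
  "method": "notifications/progress",
  "params": {
    "progressToken": "abc123",
    "progress": 2,
    "message": "[12s] Command executed: npm test"
  }
}
```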

Idle-based Timeout

The timeout is idle-based, not absolute: the timer resets every time the server receives an event from Codex. Long-running tasks with continuous activity never time out, while truly stuck processes are killed after the configured idle period.

  • Default idle timeout: 10 minutes
  • Override per-call via timeout parameter, or globally via CODEX_TIMEOUT_MS
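The idle-reset behavior amounts to a deadline that moves forward on every event. A minimal Node.js sketch of the idea (not the server's actual implementation):

```javascript
// Sketch of an idle-based timeout: the deadline advances on every event,
// so only a gap of idleMs with no events at all counts as "stuck".
class IdleTimer {
  constructor(idleMs, now = Date.now()) {
    this.idleMs = idleMs;
    this.lastEventAt = now; // timer starts when the subprocess is spawned
  }
  // Call whenever an event arrives from Codex.
  onEvent(now = Date.now()) {
    this.lastEventAt = now;
  }
  // True once no event has arrived for a full idle period.
  isExpired(now = Date.now()) {
    return now - this.lastEventAt >= this.idleMs;
  }
}

// A 15-minute task that emits events every 5 minutes never expires
// under a 10-minute idle timeout.
const t = new IdleTimer(600000, 0);
t.onEvent(300000);  // event at 5 min
t.onEvent(600000);  // event at 10 min
console.log(t.isExpired(900000));  // 15 min mark: only 5 min idle -> false
console.log(t.isExpired(1200001)); // >10 min after last event -> true
```

An absolute timeout would have killed this task at the 10-minute mark regardless of activity; the idle timer only fires on genuine silence.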

Environment Variables

| Variable | Default | Description |
|----------|---------|-------------|
| CODEX_CLI_PATH | codex | Path to the Codex CLI binary |
| CODEX_TIMEOUT_MS | 600000 (10 min) | Idle timeout for the Codex process |
| CODEX_MCP_DEBUG | (unset) | Set to enable debug logging to stderr |
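For example, to point the server at a specific binary and shorten the idle timeout to 5 minutes when launching it directly (the binary path is illustrative):

```shell
CODEX_CLI_PATH=/usr/local/bin/codex \
CODEX_TIMEOUT_MS=300000 \
CODEX_MCP_DEBUG=1 \
npx -y @nayagamez/codex-cli-mcp
```

When the server is launched by an MCP client instead, set these in the client config's env block for the server entry.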

How It Works

MCP Client  →  Tool Call (codex / codex-reply)
            →  Spawn `codex exec --json --full-auto` as subprocess
            →  Stream JSONL events from stdout
            →  Send progress notifications back to client
            →  Return formatted results when done

  1. The MCP client sends a tool call (codex or codex-reply)
  2. The server spawns Codex CLI with --json and --full-auto flags
  3. The prompt is passed via stdin
  4. JSONL events are streamed and parsed in real-time
  5. Progress notifications are sent to the client on each event (idle timer resets)
  6. Results (messages, commands, errors, token usage) are formatted as markdown and returned
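Steps 4–6 boil down to splitting stdout into lines, JSON-parsing each one, and rendering the events as markdown. A simplified sketch (the event shapes here are assumed for illustration, not Codex CLI's real JSONL schema):

```javascript
// Parse a chunk of JSONL output into events, skipping blank lines.
// Event shapes below are illustrative, not Codex CLI's actual schema.
function parseJsonlChunk(chunk) {
  const events = [];
  for (const line of chunk.split("\n")) {
    const trimmed = line.trim();
    if (!trimmed) continue;
    try {
      events.push(JSON.parse(trimmed));
    } catch {
      // Incomplete line (chunk split mid-object); a real parser would buffer it.
    }
  }
  return events;
}

// Format parsed events as the markdown summary returned to the client.
function formatEvents(events) {
  return events
    .map((e) => (e.type === "command" ? `Ran: \`${e.cmd}\`` : e.text))
    .join("\n");
}

const chunk =
  '{"type":"message","text":"Refactoring the auth module"}\n' +
  '{"type":"command","cmd":"npm test"}\n';
console.log(formatEvents(parseJsonlChunk(chunk)));
```

A real implementation would hook this up to the subprocess's stdout stream and fire a progress notification (resetting the idle timer) for each parsed event.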

License

MIT