
n8n-nodes-codex-cli-lm

Community node package for n8n that exposes the OpenAI Codex CLI as a LangChain Chat Model. It runs codex exec --experimental-json using the bundled @openai/codex-sdk binary or a custom codex binary.
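The spawn-and-parse loop can be sketched as follows. The `codex exec --experimental-json` invocation is from this README; the JSONL event shape (`type`, `item.text`) and the helper names are assumptions for illustration, not the package's actual internals:

```typescript
import { spawn } from "node:child_process";

// Assumed event payload shape; the real --experimental-json schema may differ.
interface CodexEvent {
  type: string;              // e.g. "item.updated" | "item.completed"
  item?: { text?: string };
}

// Parse a chunk of JSONL into events, skipping blank or malformed lines.
export function parseJsonlEvents(chunk: string): CodexEvent[] {
  const events: CodexEvent[] = [];
  for (const line of chunk.split("\n")) {
    const trimmed = line.trim();
    if (!trimmed) continue;
    try {
      events.push(JSON.parse(trimmed) as CodexEvent);
    } catch {
      // Ignore partial lines; a real implementation would buffer them.
    }
  }
  return events;
}

// Hypothetical driver: spawn the CLI and print completed items.
export function runCodex(prompt: string, codexPath = "codex"): void {
  const child = spawn(codexPath, ["exec", "--experimental-json", prompt]);
  child.stdout.on("data", (data: Buffer) => {
    for (const ev of parseJsonlEvents(data.toString())) {
      if (ev.type === "item.completed" && ev.item?.text) {
        console.log(ev.item.text);
      }
    }
  });
}
```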

Nodes included

  • Codex Chat Model (SDK): LangChain chat model for the AI Agent node
  • Codex Auth (Device Login): device login helper that writes CODEX_HOME/auth.json on the n8n host

Features

  • Codex CLI integration: spawns codex exec --experimental-json and parses JSONL events
  • Auth: device login via CLI or UI-based device flow
  • Model selection: static list in the node (Codex + GPT-5.x entries) or default model
  • Security controls: sandbox mode, approval policy, optional web search + network access
  • Advanced config: --config key=value overrides and --enable/--disable feature flags
  • Multimodal: attach local images or data: images from LangChain content
  • Structured output: optional JSON Schema via --output-schema
  • Streaming: optional response streaming from JSONL item.updated / item.completed
  • Context controls: cap message count or prompt size in stateless mode
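In stateless mode, the context controls amount to a trimming pass over the conversation before it is turned into a prompt. A minimal sketch, assuming a simple "drop oldest first" policy (the function and option names are illustrative, not the node's actual API):

```typescript
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Keep at most `maxMessages` of the newest messages, then keep dropping
// the oldest until the combined prompt fits in `maxPromptChars`.
export function trimContext(
  messages: ChatMessage[],
  maxMessages: number,
  maxPromptChars: number
): ChatMessage[] {
  let kept = messages.slice(-maxMessages);
  const size = (msgs: ChatMessage[]) =>
    msgs.reduce((n, m) => n + m.content.length, 0);
  while (kept.length > 1 && size(kept) > maxPromptChars) {
    kept = kept.slice(1); // drop the oldest message
  }
  return kept;
}
```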

Requirements

  • Self-hosted n8n only (uses child_process and filesystem access)
  • n8n >= 1.0.0
  • Node.js >= 18
  • Codex CLI available via:
    • Bundled binary from @openai/codex-sdk, or
    • Custom Path to a codex binary on the n8n host

Installation

Option 1: Install from npm

npm install @chrishdx/n8n-nodes-codex-cli-lm

Option 2: Install locally for development

git clone <your-repo-url>
cd n8n-nodes-codex-cli-lm
npm install
npm run build
npm link

Then in your n8n installation directory:

npm link @chrishdx/n8n-nodes-codex-cli-lm
n8n start

Credentials setup

Create Codex (SDK) API credentials in n8n:

  • Codex Binary: Bundled (via @openai/codex-sdk) or Custom Path
  • Codex Path: path to codex if using Custom Path
  • Base URL (OPENAI_BASE_URL): optional proxy or enterprise gateway
  • Codex Home (CODEX_HOME): optional; where Codex stores sessions/config (default: ~/.codex)

The credential test pings the OpenAI auth discovery endpoint to verify connectivity.
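The credential fields map onto environment variables for the spawned CLI process. One plausible wiring, as a sketch (the field-to-variable mapping follows the list above; the helper name is an assumption):

```typescript
import * as os from "node:os";
import * as path from "node:path";

interface CodexCredentials {
  baseUrl?: string;    // maps to OPENAI_BASE_URL
  codexHome?: string;  // maps to CODEX_HOME, default ~/.codex
}

// Build the environment for the spawned codex process.
export function buildCodexEnv(
  creds: CodexCredentials,
  base: NodeJS.ProcessEnv = process.env
): NodeJS.ProcessEnv {
  const env: NodeJS.ProcessEnv = { ...base };
  if (creds.baseUrl) env.OPENAI_BASE_URL = creds.baseUrl;
  env.CODEX_HOME = creds.codexHome ?? path.join(os.homedir(), ".codex");
  return env;
}
```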

Login options (device flow + web)

You must complete Codex login on the same host/container where n8n runs so CODEX_HOME/auth.json is available.

Option A: CLI device login (recommended)

Run on the n8n host:

codex login --device-auth

Then open the provided URL in your browser and enter the device code. Make sure CODEX_HOME in n8n points to the same directory used by the CLI.

If you use the Bundled binary, codex might not be on your PATH. You can either:

  • Use Custom Path and install codex globally, or
  • Run the bundled binary directly from node_modules/@openai/codex-sdk/vendor/.../codex/codex

Option B: Web login from the n8n UI

Use the Codex Auth (Device Login) node:

  1. Start Device Login: run the node and copy verificationUrl and userCode
  2. Open the URL in your browser, sign in, and enter the code
  3. Complete Device Login: provide deviceAuthId + userCode from step 1
  4. (Optional) Login Status to check the current session
  5. (Optional) Logout to remove auth.json

This writes tokens to CODEX_HOME/auth.json on the n8n host.
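Because everything hinges on `auth.json` being where the node will look for it, a small check helps when debugging login problems. The path resolution below follows the CODEX_HOME default described above; the helper names are illustrative:

```typescript
import * as fs from "node:fs";
import * as os from "node:os";
import * as path from "node:path";

// Resolve where Codex stores auth.json: $CODEX_HOME/auth.json,
// falling back to ~/.codex/auth.json.
export function authJsonPath(codexHome?: string): string {
  const home =
    codexHome ?? process.env.CODEX_HOME ?? path.join(os.homedir(), ".codex");
  return path.join(home, "auth.json");
}

// True when a login has produced credentials on this host.
export function isLoggedIn(codexHome?: string): boolean {
  return fs.existsSync(authJsonPath(codexHome));
}
```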

Using the Codex Chat Model

  1. Add an AI Agent node
  2. Select Codex Chat Model (SDK) as the chat model
  3. Choose your Codex (SDK) API credentials
  4. Configure Model and Options as needed

Options quick reference

  • Working Directory and Additional Directories for filesystem access
  • Sandbox Mode and Approval Policy for command execution safety
  • Network Access Enabled and Web Search Enabled for external access
  • Use Open Source Provider and Local Provider for --oss flows
  • Output Schema (JSON) for structured results
  • Stream Response for token streaming
  • Max Messages and Max Prompt Characters for context control
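Structured output works by handing the CLI a JSON Schema via `--output-schema` (the flag named in the features list). A sketch of how the extra arguments might be assembled; the temp-file handling and helper name are assumptions:

```typescript
import * as fs from "node:fs";
import * as os from "node:os";
import * as path from "node:path";

// Write the schema to a temp file and return the extra CLI args.
// --output-schema is the flag this README describes; everything else
// here (temp file location, helper name) is illustrative.
export function outputSchemaArgs(schema: object): string[] {
  const file = path.join(os.tmpdir(), `codex-schema-${Date.now()}.json`);
  fs.writeFileSync(file, JSON.stringify(schema));
  return ["--output-schema", file];
}
```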

Example workflow: Telegram chatbot

A simple flow that turns Telegram messages into Codex responses:

  1. Telegram Trigger (incoming message)
  2. AI Agent using Codex Chat Model (SDK)
  3. Telegram node (Send Message)

Typical mappings:

  • AI Agent input: use Telegram message.text (for example, {{$json.message.text}})
  • Telegram response: map the AI Agent output text (inspect the AI Agent output in n8n and map its response field)

Notes / limitations

  • This package runs a local binary and uses filesystem access. It is intended for self-hosted n8n.
  • Tool calling is supported via a prompt-based JSON adapter. It is best-effort and depends on the model output.
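The prompt-based adapter mentioned above can be sketched as: instruct the model to emit a JSON object such as `{"tool": ..., "args": ...}`, then best-effort parse it out of the reply. The object shape and helper below are illustrative, matching the "best-effort" caveat:

```typescript
interface ToolCall {
  tool: string;
  args: Record<string, unknown>;
}

// Best-effort: find the first {...} span in the model's text and try to
// read it as a tool call. Returns null when the output is plain prose.
export function parseToolCall(text: string): ToolCall | null {
  const match = text.match(/\{[\s\S]*\}/);
  if (!match) return null;
  try {
    const obj = JSON.parse(match[0]);
    if (typeof obj.tool === "string" && obj.args && typeof obj.args === "object") {
      return { tool: obj.tool, args: obj.args };
    }
  } catch {
    // Fall through: the model produced malformed JSON.
  }
  return null;
}
```

This is why tool calling depends on the model's output: if the reply is prose, or the JSON is malformed, no tool call is recovered.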

Development

Project structure

n8n-nodes-codex-cli-lm/
├── credentials/
│   └── CodexCliApi.credentials.ts
├── nodes/
│   ├── CodexAuth/
│   └── LmChatCodexCli/
├── package.json
└── tsconfig.json

Scripts

  • npm run dev - Start n8n with hot reload for development
  • npm run build - Build the TypeScript code
  • npm run lint - Check code for errors
  • npm run lint:fix - Auto-fix linting issues

Troubleshooting

Node does not appear in n8n

  • Verify install: npm list @chrishdx/n8n-nodes-codex-cli-lm
  • Restart n8n
  • Check n8n logs for load errors

CLI execution fails

  • Verify the Codex binary source (Bundled vs Custom Path)
  • Check the binary manually: codex --version
  • Ensure CODEX_HOME contains auth.json from device login

Login issues

  • Confirm the login happened on the same host/container as n8n
  • Use the Codex Auth (Device Login) node to re-run device login

License

MIT

Support

  • GitHub Issues: https://github.com/chrishdx/n8n-nodes-codex-cli-lm/issues
  • n8n Community Forum: https://community.n8n.io

Acknowledgments

  • Built on the n8n community nodes starter
  • Powered by LangChain
  • Uses the official @openai/codex-sdk