@kakugo-ch/codex-batch-cli

v1.1.1

A CLI tool for batch processing questions using the OpenAI Codex API

Codex Batch CLI

A robust CLI tool for batch processing questions using OpenAI's CLI tools. Process multiple questions at once, handle failures gracefully, and generate comprehensive reports.

Features

  • Batch Processing: Handle multiple questions in a single run
  • Smart Retry: Automatically retry failed queries with the --retry flag
  • Continuous Processing: Automatically continues from where it left off if interrupted
  • Parallel Processing: Process multiple questions concurrently with configurable concurrency
  • Detailed Reports: Generate structured JSON reports with answers and errors
  • Template Support: Define reusable templates with variables for common question patterns
  • Error Handling: Robust error handling with configurable timeouts and retries
  • Automatic Security: Sensitive data like API keys and tokens are automatically masked in logs and error messages
  • Easy to Use: Simple CLI interface with clear commands
  • Flexible CLI Support: Works with different OpenAI CLI tools (codex, open-codex, etc.)

Quick Start

  1. Install your preferred OpenAI CLI tool:

    # For example, install OpenAI's Codex CLI:
    npm install -g @openai/codex
    # Or install open-codex:
    npm install -g @openai/open-codex
  2. Install this CLI:

    npm install -g @kakugo-ch/codex-batch-cli
  3. Set your OpenAI API key:

    export OPENAI_API_KEY="your_api_key_here"
  4. Create a questions.json file. You can use direct questions or templates:

    {
      "templates": [
        {
          "id": "code-review",
          "content": "Please review the following {{ language }} code:\n\n```{{ language }}\n{{ code }}\n```\n\nFocus on: {{ focus }}"
        }
      ],
      "questions": [
        {
          "id": "q1",
          "template": "code-review",
          "variables": {
            "language": "javascript",
            "code": "function add(a,b) { return a+b }",
            "focus": "code style and type safety"
          }
        },
        {
          "id": "q2",
          "text": "What is the language of this repository?"
        }
      ]
    }
  5. Run the CLI:

    codex-batch -i questions.json -o report.json

Prerequisites

  • Node.js >= 14.0.0
  • OpenAI API key (set as OPENAI_API_KEY environment variable)
  • OpenAI CLI tool (npm install -g @openai/codex or npm install -g @openai/open-codex)
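
A quick sanity check before the first run might look like this (a sketch; adjust the executable name to whichever CLI tool you installed):

# Verify Node.js, the CLI tool, and the API key are available
node --version
command -v codex || command -v open-codex
test -n "$OPENAI_API_KEY" && echo "OPENAI_API_KEY is set" || echo "OPENAI_API_KEY is missing"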

Commands

Process New Questions

codex-batch -i questions.json -o report.json

Retry Failed Questions

codex-batch --retry report.json

Options

  • -i, --input <file> - Input questions file
  • -o, --output <file> - Output report file (default: report.json)
  • -r, --repo <path> - Path to the repository to analyze (default: current directory)
  • --retry <file> - Retry failed questions from a report
  • --max-retries <number> - Maximum retries for failed queries (default: 3)
  • --timeout <number> - Timeout in milliseconds (default: 600000)
  • --concurrency <number> - Number of questions to process in parallel (default: 1)
  • --cli-args <args> - Additional arguments to pass to the CLI (e.g. "--temperature 0.7")
  • --executable-name <name> - Name of the CLI executable to use (default: "codex")
  • -h, --help - Show help
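
These options can be combined in a single invocation. For example, the following illustrative run analyzes another repository with higher concurrency, a longer timeout, more retries, and extra arguments forwarded to the underlying CLI (values are placeholders, not recommendations):

# Illustrative combination of the options above; adjust values to your workload
codex-batch -i questions.json -o report.json -r /path/to/repo --concurrency 5 --timeout 900000 --max-retries 5 --cli-args "--temperature 0.7"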

Input Format

The input file can contain both templates and direct questions:

Templates

Templates use Liquid syntax and can be defined in the templates array:

{
  "templates": [
    {
      "id": "template-id",
      "content": "Template content with {{ variable }} placeholders"
    }
  ]
}

Questions

Questions can either use a template or provide direct text:

{
  "questions": [
    {
      "id": "question-1",
      "template": "template-id",
      "variables": {
        "variable": "value"
      }
    },
    {
      "id": "question-2",
      "text": "Direct question without template"
    }
  ]
}

Examples

Basic Usage

# Using default codex CLI sequentially
codex-batch -i questions.json -o report.json

# Using codex CLI with parallel processing (10 questions at a time)
codex-batch -i questions.json -o report.json --concurrency 10

# High concurrency for large batches (50 questions at a time)
codex-batch -i questions.json -o report.json --concurrency 50

# Using open-codex CLI
codex-batch -i questions.json -o report.json --executable-name open-codex

If the command is interrupted (e.g., by Ctrl+C), running it again with the same output file will:

  1. Skip questions that were already successfully processed
  2. Retry questions that failed
  3. Process remaining questions that weren't attempted yet
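
In practice, resuming just means re-running the same command with the same output file, for example:

# First run, interrupted partway through (Ctrl+C)
codex-batch -i questions.json -o report.json --concurrency 10

# Re-run with the same output file to resume; already answered questions are skipped
codex-batch -i questions.json -o report.json --concurrency 10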

Analyze Different Repository

codex-batch -i questions.json -r /path/to/repo

With Custom CLI Parameters

# Using codex with custom parameters
codex-batch -i questions.json --cli-args "--temperature 0.7 --max-tokens 100"

# Using open-codex with custom parameters
codex-batch -i questions.json --executable-name open-codex --cli-args "--temperature 0.7 --max-tokens 100"

Retry Failed Questions with Different Parameters

codex-batch --retry report.json --cli-args "--temperature 0.9"

Report Format

{
  "timestamp": "2025-04-27T19:00:00.000Z",
  "totalQuestions": 2,
  "successfulQueries": 1,
  "failedQueries": 1,
  "answers": [
    {
      "questionId": "q1",
      "question": "What is the language of this repository?",
      "answer": "This repository is written in TypeScript.",
      "startTime": "2025-04-27T18:59:58.123Z",
      "endTime": "2025-04-27T19:00:00.000Z",
      "durationMs": 1877
    },
    {
      "questionId": "q2",
      "question": "Is this repository using SQL databases?",
      "error": "Timeout error",
      "startTime": "2025-04-27T19:00:00.001Z",
      "endTime": "2025-04-27T19:00:30.001Z",
      "durationMs": 30000
    }
  ],
  "startTime": "2025-04-27T18:59:58.123Z",
  "endTime": "2025-04-27T19:00:30.001Z",
  "durationMs": 31878,
  "fastestAnswerMs": 1877,
  "slowestAnswerMs": 30000,
  "averageAnswerMs": 15938
}
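
Because the report is plain JSON, it is easy to post-process with standard tools. As a sketch based on the fields shown above, and assuming jq is installed, the failed questions could be listed like this:

# List the id and error of each failed question (requires jq)
jq '[.answers[] | select(.error != null) | {questionId, error}]' report.json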

Development

After making changes:

npm run build && npm test && npm link
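
After linking, the local build can be exercised directly, for example by printing the help text:

# Smoke-test the linked build
codex-batch --help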

Release

npm run build && npm test && npm pack && npm publish