`@howaboua/pi-codex-conversion` v1.0.28
# pi-codex-conversion
Codex-oriented adapter for Pi.
This package replaces Pi's default Codex/GPT experience with a narrower Codex-like surface while staying close to Pi's own runtime and prompt construction:
- swaps the active tools to `exec_command`, `write_stdin`, `apply_patch`, and `view_image`, plus the native OpenAI Codex Responses `web_search` and `image_generation` tools on `openai-codex`
- saves native Codex image-generation outputs into `.pi/openai-codex-images/` at the workspace/repo root and mirrors the newest image to `.pi/openai-codex-images/latest.png`
- preserves Pi's composed system prompt and applies a narrow Codex-oriented delta on top
- renders exec activity with Codex-style command and background-terminal labels
- renders `apply_patch` calls with Codex-style `Added`/`Edited`/`Deleted` diff blocks and Pi-style colored diff lines
- targets modern Pi tool/rendering APIs and is aligned with Pi `0.70.x`

> [!NOTE]
> Native OpenAI Codex Responses web search activity is surfaced as merged foldable status messages. Pi still does not expose native web-search usage as true tool-call rows.
## Active tools in adapter mode
When the adapter is active, the LLM sees these tools:
- `exec_command` — shell execution with Codex-style `cmd` parameters and resumable sessions
- `write_stdin` — continue or poll a running exec session
- `apply_patch` — patch tool
- `web_search` — native OpenAI Codex Responses web search, enabled only on the `openai-codex` provider
- `image_generation` — native OpenAI Codex Responses image generation, enabled only on image-capable `openai-codex` models
- `view_image` — image-only wrapper around Pi's native image reading, enabled only for image-capable models
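The provider and capability gating above can be sketched roughly as follows. This is a minimal illustration of the rules listed in this README, not the adapter's actual code; the `activeTools` function and `BASE_TOOLS` constant are invented names.

```typescript
// Illustrative only: encodes the gating rules from the tool list above.
const BASE_TOOLS = ["exec_command", "write_stdin", "apply_patch"];

function activeTools(provider: string, imageCapable: boolean): string[] {
  const tools = [...BASE_TOOLS];
  if (imageCapable) tools.push("view_image"); // image-only wrapper, image-capable models
  if (provider === "openai-codex") {
    tools.push("web_search"); // native Responses web search, openai-codex only
    if (imageCapable) tools.push("image_generation"); // image-capable openai-codex models only
  }
  return tools;
}
```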
Notably:

- there is no dedicated `read`, `edit`, or `write` tool in adapter mode
- local text-file inspection should happen through `exec_command`
- file creation and edits should default to `apply_patch`
- Pi may still expose additional runtime tools such as `parallel`; the prompt is written to tolerate that instead of assuming a fixed four-tool universe
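Since there is no `read` or `write` tool, typical calls look like the hypothetical payloads below. The `cmd` and `patch` shapes are illustrative assumptions, not the adapter's real schema; the patch envelope shown follows the common Codex `apply_patch` format, which may differ in detail here.

```typescript
// Hypothetical tool-call payloads; shapes are illustrative, not the adapter's schema.
const inspectCall = {
  tool: "exec_command",
  args: { cmd: "sed -n '1,80p' src/index.ts" }, // read a file slice via the shell
};

const editCall = {
  tool: "apply_patch",
  args: {
    patch: [
      "*** Begin Patch",
      "*** Update File: src/index.ts",
      "@@",
      "-const before = 1;",
      "+const after = 2;",
      "*** End Patch",
    ].join("\n"),
  },
};
```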
## Layout

- `src/index.ts` — extension entrypoint, model gating, tool-set swapping, prompt transformation
- `src/adapter/` — model detection and active-tool constants
- `src/tools/` — Pi tool wrappers, exec session management, and execution rendering
- `src/shell/` — shell tokenization, parsing, and exploration summaries
- `src/patch/` — patch parsing, path policy, and execution
- `src/prompt/` — Codex delta transformer over Pi's composed prompt
- `tests/` — deterministic unit tests
## Checks

```
npm run typecheck
npm test
npm run check
```

## Examples
- `rg -n foo src` -> `Explored / Search foo in src`
- `rg --files src | head -n 50` -> `Explored / List src`
- `cat README.md` -> `Explored / Read README.md`
- `exec_command({ cmd: "npm test", yield_time_ms: 1000 })` may return a `session_id`; continue with `write_stdin`
- for short or non-interactive commands, omitting `yield_time_ms` is preferred; tiny non-interactive waits are clamped upward to avoid unnecessary follow-up calls
- `write_stdin({ session_id, chars: "" })` renders like `Waited for background terminal` and is meant for occasional polling, not tight repoll loops
- `write_stdin({ session_id, chars: "y\\n" })` renders like `Interacted with background terminal`
- `view_image({ path: "/absolute/path/to/screenshot.png" })` is available on image-capable models
- `web_search` is surfaced only on `openai-codex`, and the adapter rewrites it into the native OpenAI Responses `type: "web_search"` payload instead of executing a local function tool
- `image_generation` is surfaced only on image-capable `openai-codex` models, and the adapter rewrites it into the native OpenAI Responses `type: "image_generation", output_format: "png"` payload instead of executing a local function tool
- native `image_generation` outputs are saved under `.pi/openai-codex-images/` at the workspace/repo root, with the newest image mirrored to `.pi/openai-codex-images/latest.png`
- when native web search is available, the adapter shows a one-time session notice and merged foldable search-activity messages instead of native tool-call rows
- `apply_patch` partial failures stay inline in the patch row so successful and failed file entries can be seen together
Raw command output is still available by expanding the tool result.
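The wait-clamping behavior described above might look roughly like this. The threshold and function name are assumptions for illustration; the adapter's real values are not documented in this README.

```typescript
// Illustrative clamp: tiny non-interactive waits are raised, absent waits run to completion.
const MIN_NONINTERACTIVE_WAIT_MS = 5_000; // hypothetical floor, not the adapter's real value

function clampYield(yieldTimeMs: number | undefined, interactive: boolean): number | undefined {
  if (yieldTimeMs === undefined) return undefined; // no yield requested: run to completion
  if (!interactive && yieldTimeMs < MIN_NONINTERACTIVE_WAIT_MS) {
    return MIN_NONINTERACTIVE_WAIT_MS; // avoid burning a follow-up tool call on a short run
  }
  return yieldTimeMs;
}
```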
## Install

```
pi install npm:@howaboua/pi-codex-conversion
```

Local development:

```
pi install ./pi-codex-conversion
```

Alternative Git install:

```
pi install git:github.com/IgorWarzocha/pi-codex-conversion
```

## Publishing

This package is already configured for public npm publishes via:

- `publishConfig.access = "public"`
- `prepublishOnly`/`prepack` checks
Useful commands:

```
npm run publish:dry-run
npm run publish:dev
npm run release:dev
```

What they do:

- `npm run publish:dry-run` — inspect what would be published
- `npm run publish:dev` — publish the current version under the `dev` dist-tag
- `npm run release:dev` — bump the package to the next `-dev.N` prerelease and publish it under the `dev` dist-tag
Typical flow:

```
npm login
npm run publish:dry-run
npm run release:dev
```

For modern npm auth, just run `npm login` and complete the browser flow when prompted.

After publishing, install the dev build with:

```
pi install npm:@howaboua/pi-codex-conversion@dev
```

## Prompt behavior
The adapter does not build a standalone replacement prompt anymore. Instead it:
- keeps Pi's tool descriptions, Pi docs section, AGENTS/project context, skills inventory, and date/cwd when Pi already surfaced them
- adds the current shell to the transformed prompt so quoting and escaping can match the runtime environment
- rewrites the top-level role framing to Codex-style wording
- adds a small Codex delta to the existing `Guidelines` section
That keeps the prompt much closer to pi-mono while still steering the model toward Codex-style tool use.
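A minimal sketch of such a delta transform over an already-composed prompt is shown below. The regexes, wording, and `applyCodexDelta` name are invented for illustration; the real transformer lives in `src/prompt/` and is not reproduced here.

```typescript
// Minimal sketch of a delta transform over an already-composed prompt.
// The regexes and wording here are assumptions, not the real implementation.
function applyCodexDelta(composed: string): string {
  // 1. Rewrite the top-level role framing to Codex-style wording.
  const reframed = composed.replace(
    /^You are [^\n]*/,
    "You are Codex, a coding agent running inside Pi.",
  );
  // 2. Append a small delta to the existing Guidelines section, if present.
  const delta = "- Prefer exec_command for inspection and apply_patch for edits.";
  if (/## Guidelines/.test(reframed)) {
    return reframed.replace(/(## Guidelines[^\n]*\n)/, `$1${delta}\n`);
  }
  return `${reframed}\n\n## Guidelines\n${delta}\n`;
}
```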
## Notes
- Adapter mode activates automatically for OpenAI `gpt*` and `codex*` models.
- When you switch away from those models, Pi restores the previous active tool set.
- `view_image` resolves paths against the active session cwd and only exposes `detail: "original"` for Codex-family image-capable models.
- `web_search` is exposed only for the `openai-codex` provider and is forwarded as the native OpenAI Codex Responses web search tool.
- `image_generation` is exposed only for image-capable `openai-codex` models and is forwarded as the native OpenAI Codex Responses image-generation tool.
- Generated images are written under `.pi/openai-codex-images/` at the workspace/repo root, and the latest image is mirrored to `.pi/openai-codex-images/latest.png`.
- `apply_patch` paths stay restricted to the current working directory.
- Partial `apply_patch` failures stay in the original patch block and highlight the failed entry instead of adding a second warning row.
- `exec_command`/`write_stdin` use a custom PTY-backed session manager via `node-pty` for interactive sessions.
- Tiny `exec_command` waits are clamped for non-interactive commands so short runs do not burn an avoidable follow-up tool call.
- Empty `write_stdin` polls are clamped to a meaningful minimum wait so long-running processes are not repolled too aggressively.
- PTY output handling applies basic terminal rewrite semantics (`\r`, `\b`, erase-in-line, and common escape cleanup) so interactive redraws replay sensibly.
- Skills inventory is reintroduced in a Codex-style section when Pi's composed prompt already exposed the underlying Pi skills inventory.
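The terminal rewrite semantics mentioned in the notes can be illustrated with a toy single-line replay. This is a simplified sketch under the assumption of a one-line buffer; the real handling also covers erase-in-line and other escape cleanup.

```typescript
// Toy replay of \r and \b within one line: later characters overwrite in place,
// so progress-bar style redraws collapse to their final state.
function replayLine(raw: string): string {
  const cells: string[] = [];
  let cursor = 0;
  for (const ch of raw) {
    if (ch === "\r") {
      cursor = 0; // carriage return: back to column 0, next chars overwrite
    } else if (ch === "\b") {
      cursor = Math.max(0, cursor - 1); // backspace: move left one cell
    } else {
      cells[cursor++] = ch;
    }
  }
  return cells.join("");
}
```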
## License
MIT
