@ryan_nookpi/pi-extension-codex-fast-mode
v0.2.1
Codex fast mode extension for pi.
This extension helps pi use OpenAI Codex in a faster, lower-verbosity mode.
It is intended mainly for openai-codex with gpt-5.4 or gpt-5.5, when you want quick execution and shorter responses.
Install
pi install npm:@ryan_nookpi/pi-extension-codex-fast-mode
Great for
- prioritizing speed over long explanations
- keeping Codex responses concise
- toggling a faster Codex setup per session
Usage
/codex-fast on
/codex-fast off
/codex-fast status
Notes
- Target models: openai-codex / gpt-5.4 and openai-codex / gpt-5.5
- It always applies text.verbosity=low.
- When fast mode is enabled, it also injects service_tier=priority.
- The setting is stored locally and persists across sessions.
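As a rough sketch, the extra fields merged into a Codex request when fast mode is enabled could look like the fragment below. Only the parameter names (text.verbosity, service_tier) come from the notes above; the exact request shape is an assumption.

```json
{
  "text": { "verbosity": "low" },
  "service_tier": "priority"
}
```

With fast mode off, only `"text": { "verbosity": "low" }` would still be applied.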
