# prompt-shell-cli
A terminal interface for working with prompt-shell context stacks.
## What is it?
prompt-shell-cli is a local-first, keyboard-driven terminal UI for managing the modular prompt files defined by prompt-shell.
It allows you to:
- Browse, preview, and edit context files
- Add or delete files with one keystroke
- Reorder prompt sections visually
- Build the final prompt with live token count
No mouse, no boilerplate — just pure CLI clarity.
## Why use it?
If you’re using local LLMs, your prompts need to be:
- Modular
- Inspectable
- Reusable
- Quick to evolve
prompt-shell-cli gives you a structured interface to manipulate that context stack without switching between editors, folders, and JSON configs.
## Features
- 🧾 View the current context stack (`shell.json`)
- ✍️ Edit any section with your preferred `$EDITOR`
- ➕ Create new `.md` context files and auto-insert them into the build
- ❌ Delete sections with confirmation
- 🔀 Reorder prompt parts interactively
- ⚙️ Build the final prompt (uses `prompt-shell` under the hood)
- 🔢 Display a live token count after each build
## How it works
On first run, it creates a local copy of your context and config from the prompt-shell package:
```
prompt-shell-cli/
├── user-context/
│   ├── context/
│   │   ├── rules.md
│   │   ├── identity.md
│   │   └── ...
│   └── config/
│       └── shell.json
```

You interact with this safe copy — edits, deletions, reorderings, and builds all apply to your local state.
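The README doesn't show the schema of `shell.json`, so the following is only a hypothetical sketch of what `user-context/config/shell.json` might look like (the `context` and `output` keys are invented for illustration, not taken from prompt-shell itself):

```json
{
  "context": [
    "context/rules.md",
    "context/identity.md"
  ],
  "output": "output/final_prompt.txt"
}
```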
## Getting started
1. Install dependencies:

   ```shell
   npm install
   ```

2. Start the interface:

   ```shell
   npm start
   ```

You'll see a list of the context files defined in `shell.json`, with one file selected and previewed.
## Keyboard shortcuts
| Key | Action |
|------------|---------------------------------------------|
| `↑` / `↓`  | Navigate files |
| `e`        | Edit selected file in `$EDITOR` |
| `b`        | Build final prompt (with token count) |
| `n`        | Create new file and append to config |
| `x`        | Delete selected file (with confirmation) |
| `r`        | Reorder context stack (enter reorder mode) |
| `Enter`    | Save reordered order |
| `Esc`      | Cancel reorder |
| `q`        | Quit |
## Build process
Press `b` to run the bundled builder from `prompt-shell`.
The final prompt is saved to:

```
user-context/output/final_prompt.txt
```

The token count is printed on success.
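The README doesn't say how the token count is computed. As a rough stand-in, you can approximate it from the character count; the ~4 characters/token heuristic and the sample file below are assumptions for illustration, not how prompt-shell actually counts tokens:

```shell
# Rough token estimate (assumes ~4 chars per token; illustration only).
# A stand-in file is used so the snippet runs on its own; point it at
# user-context/output/final_prompt.txt after a real build.
printf 'You are a helpful assistant.\n' > sample_prompt.txt
chars=$(wc -c < sample_prompt.txt)
echo "approx tokens: $((chars / 4))"   # → approx tokens: 7
```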
You can pipe this into any local LLM interface:
```shell
ollama run mistral < user-context/output/final_prompt.txt
```

## Editor support
prompt-shell-cli respects the `$EDITOR` environment variable.
If none is set, it defaults to `nano`.
Examples:

```shell
export EDITOR=vim
export EDITOR="code --wait"
```

## Who is it for?
- Power users of local LLMs
- Developers building agent frameworks
- Anyone tired of hand-editing Markdown + JSON
- Those who want to compose prompts like code
## Author
Michal Roth
💛 If this tool helps you think or build better:
Buy me a coffee →
## License
MIT — terminal-first, hackable, and yours.
