# @sisu-ai/cli
v0.5.1
CLI for discovering Sisu packages, scaffolding maintained starter projects, and running an interactive automation chat.
## Usage

```sh
npx @sisu-ai/cli list tools
npx @sisu-ai/cli info vector-vectra
npx @sisu-ai/cli create chat-agent my-app
npx @sisu-ai/cli install skill
npx @sisu-ai/cli chat
```

After a global install, you can also run:

```sh
sisu list tools
sisu chat
```

## Commands

```sh
sisu list <category>
sisu info <name>
sisu create <template> <project-name>
sisu install skill [installer-options]
sisu chat [--session <session-id>] [--prompt <text>]
sisu --version
sisu --json list <category>
```
Categories:

- `libraries`
- `middleware`
- `tools`
- `adapters`
- `vector`
- `skills`
- `templates`
## Chat command
This version introduces a first-class interactive chat mode for daily CLI workflows.
### Core flow

- Start interactive mode (Ink UI by default in a TTY): `sisu chat`
- Run a one-shot prompt: `sisu chat --prompt "run: git status"`
- Pipe a prompt from stdin: `echo "hello" | sisu chat`
- Resume a known session: `sisu chat --session <session-id>`
- Startup now uses the cached provider/model immediately and runs provider health checks in the background.
### In-chat commands

- `/help`: show command help
- `/new`: start a brand-new chat session
- `/provider [ollama|openai|anthropic|mock]`: set the provider (interactive picker if omitted)
- `/model [name]`: set the model (interactive picker if omitted)
- `/cancel`: cancel the active run or tool execution
- `/sessions`: list persisted sessions and choose a resume/delete action
- `/delete-session <session-id>`: delete a saved session directly
- `/search <query>`: search conversation history
- `/resume <session-id>`: switch to a prior session
- `/branch <message-id>`: create a new branch session from a prior message
- `/exit`: close chat
- `/options`: open the interactive options menu
- `/settings`: open the interactive settings menu
### Tool safety model

Tool executions are policy-gated before they run:

- `allow`: the command runs immediately
- `confirm`: explicit user approval is required
- `deny`: the command is blocked with a reason

High-impact commands require confirmation by default. Denied and completed actions are persisted in session records with status and metadata.
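The gate can be pictured as a small decision function. This is an illustrative sketch only: the names (`ToolPolicy`, `decide`, `denyCommandPrefixes`) are hypothetical and not the actual `@sisu-ai/cli` internals, though `allowCommandPrefixes` mirrors the profile field shown later in this README.

```typescript
// Hypothetical sketch of the allow/confirm/deny gate described above.
// Not the real @sisu-ai/cli policy engine; field and function names are
// illustrative assumptions.
type Decision =
  | { action: "allow" }
  | { action: "confirm" }
  | { action: "deny"; reason: string };

interface ToolPolicy {
  allowCommandPrefixes: string[];
  denyCommandPrefixes: string[]; // hypothetical deny-list for the sketch
}

function decide(command: string, policy: ToolPolicy): Decision {
  if (policy.denyCommandPrefixes.some((p) => command.startsWith(p))) {
    return { action: "deny", reason: "deny-listed command prefix" };
  }
  if (policy.allowCommandPrefixes.some((p) => command.startsWith(p))) {
    return { action: "allow" }; // allow-listed prefixes run immediately
  }
  return { action: "confirm" }; // anything unrecognized needs approval
}
```

Defaulting unrecognized commands to `confirm` matches the safety posture described above: only explicitly allow-listed prefixes bypass the user.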
### Ink shortcuts and menus

- `Ctrl+O` opens the options menu (new session, switch session, branch, help, exit).
- `Shift+S` opens settings (provider/model/session switching).
- `Shift+Enter` inserts a newline in the input box for multiline messages.
- `Ctrl+J` is supported as a fallback in terminals that don't expose Shift+Enter distinctly.
- Menus support `↑`/`↓` to navigate, `Enter` to select, and `Esc` to close.
- Assistant output is markdown-aware in terminal rendering (headers, lists, and code blocks are formatted for readability).
### Profiles and configuration

Chat profile resolution uses deterministic precedence:

- Built-in defaults
- Global profile: `~/.sisu/chat-profile.json`
- Project profile: `<project>/.sisu/chat-profile.json` (overrides the global profile)
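The precedence amounts to a deterministic "later sources win" merge. A minimal sketch, assuming a shallow merge over a few of the fields from the example profile below (`resolveProfile` and the built-in values here are illustrative, not the CLI's actual internals):

```typescript
// Illustrative sketch of deterministic profile precedence:
// built-in defaults < global profile < project profile.
interface ChatProfile {
  provider?: string;
  model?: string;
  theme?: string;
}

// Assumed built-in defaults for the sketch.
const BUILT_IN: ChatProfile = { provider: "mock", theme: "auto" };

function resolveProfile(globalProfile: ChatProfile, projectProfile: ChatProfile): ChatProfile {
  // A shallow spread keeps the merge deterministic: later sources win.
  return { ...BUILT_IN, ...globalProfile, ...projectProfile };
}
```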
Example profile:

```json
{
  "name": "default",
  "provider": "ollama",
  "model": "qwen3.5:9b",
  "theme": "auto",
  "storageDir": "/Users/you/.sisu/chat-sessions/my-project",
  "toolPolicy": {
    "mode": "balanced",
    "requireConfirmationForHighImpact": true,
    "allowCommandPrefixes": ["echo", "ls", "git status", "pnpm test"]
  }
}
```

Provider notes:
- `mock`: local fallback with no external API calls.
- `openai`: set `OPENAI_API_KEY` (or `API_KEY`) and choose a valid OpenAI model.
- `anthropic`: set `ANTHROPIC_API_KEY` (or `API_KEY`) and choose a valid Claude model.
- `ollama`: ensure `ollama serve` is running and use a locally available model.
Default provider behavior:

- If no provider is configured, chat auto-detects local Ollama models (via `ollama list`) and defaults to `ollama`.
- Preferred Ollama defaults are selected in this order when available: `qwen3.5:9b`, `llama3.1`, `llama4`, `qwen3.5:0.8b`.
- If no local Ollama models are found, chat falls back to `mock`.
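The selection rules above can be sketched as a pure function over the list of locally installed Ollama models (`pickDefault` is a hypothetical name; only the preference order comes from this README):

```typescript
// Hypothetical sketch of default-provider selection; mirrors the documented
// order but is not the CLI's actual implementation.
const PREFERRED = ["qwen3.5:9b", "llama3.1", "llama4", "qwen3.5:0.8b"];

function pickDefault(localOllamaModels: string[]): { provider: string; model?: string } {
  if (localOllamaModels.length === 0) {
    return { provider: "mock" }; // no local models: fall back to mock
  }
  // First preferred model that is locally available, else any local model.
  const model =
    PREFERRED.find((m) => localOllamaModels.includes(m)) ?? localOllamaModels[0];
  return { provider: "ollama", model };
}
```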
### Session persistence
Chat sessions are persisted locally (messages, run state, tool lifecycle records, events). This enables:
- deterministic restart/resume behavior
- session search and retrieval
- branch-from-message lineage workflows
## Templates

- `chat-agent`: minimal conversational starter
- `cli-agent`: single-shot CLI prompt starter
- `rag-agent`: local Vectra-backed RAG starter
## Why This Exists
Sisu already has a lot of maintained middleware, tools, adapters, and examples. This CLI gives humans and agents a direct way to discover them before inventing custom framework code.
It also provides a built-in path to install the sisu-framework skill:
```sh
npx @sisu-ai/cli install skill
```

## Contributing
We build Sisu in the open. Contributions welcome.
Contributing Guide · Report a Bug · Request a Feature · Code of Conduct
Related skill packages:

- @sisu-ai/skill-code-review
- @sisu-ai/skill-debug
- @sisu-ai/skill-deploy
- @sisu-ai/skill-explain
- @sisu-ai/skill-repo-search
- @sisu-ai/skill-test-gen
- @sisu-ai/skill-install
Star on GitHub if Sisu helps you build better agents.
Quiet, determined, relentlessly useful.
