# @kognai/run
Run a Kognai agent from your sovereign vault. Reads the bootstrap files
(`SOUL.md`, `IDENTITY.md`, `USER.md`, `MEMORY.md`) plus the agent's `prompt.md`,
assembles a system prompt, and executes the task against the model declared in
the agent's `agent.yaml`.
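Conceptually, the assembled system prompt is close to a concatenation of those files, as in this sketch (the vault layout shown is an assumption: bootstrap files at the vault root, each agent's `prompt.md` under `agents/<name>/`; the real assembly may use different paths or insert separators):

```sh
# rough equivalent of the prompt assembly (illustrative paths)
cat .kognai/SOUL.md .kognai/IDENTITY.md .kognai/USER.md .kognai/MEMORY.md \
    .kognai/agents/coder/prompt.md
```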
```sh
# scaffold a vault first if you haven't
npx @kognai/init

# then run the included coder agent
echo "write a python hello world" | npx @kognai/run coder
```

## Usage
```sh
npx @kognai/run <agent>                        # task from stdin
npx @kognai/run <agent> --task "<text>"        # task from CLI arg
npx @kognai/run <agent> --vault <path>         # use a non-default vault dir
npx @kognai/run <agent> --llm <provider/model> # override agent.yaml's llm
```

| Flag | Default | Notes |
|------|---------|-------|
| `--vault <path>` | `./.kognai` | Vault directory |
| `--task <text>` | stdin | Task description |
| `--llm <p/m>` | from yaml | e.g. `ollama/qwen3:14b`, `anthropic/claude-haiku-4-5-20251001` |
| `--help`, `-h` | — | Show usage |
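Flags can be combined; for example (the vault path and task text here are illustrative):

```sh
# run the coder agent from a non-default vault, overriding the model
npx @kognai/run coder \
  --vault ~/vaults/personal \
  --llm anthropic/claude-haiku-4-5-20251001 \
  --task "refactor utils.py into smaller modules"
```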
## Routing
`agent.yaml`'s `llm:` field decides where the call goes:
- `ollama/<model>` → POST `http://127.0.0.1:11434/api/chat` (override the host with `OLLAMA_HOST`)
- `anthropic/<model>` → POST to the Anthropic Messages API (requires `ANTHROPIC_API_KEY`)
If `agent.yaml` is missing or has no `llm:` field, the CLI defaults to `ollama/qwen3:4b`.
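For reference, the two routes boil down to raw HTTP requests along these lines (a sketch using the providers' documented endpoints; the CLI's exact payloads are an assumption):

```sh
# ollama/<model>: POST to Ollama's chat endpoint
curl -s "${OLLAMA_HOST:-http://127.0.0.1:11434}/api/chat" -d '{
  "model": "qwen3:4b",
  "messages": [{"role": "user", "content": "write a python hello world"}],
  "stream": false
}'

# anthropic/<model>: POST to the Anthropic Messages API
curl -s https://api.anthropic.com/v1/messages \
  -H "x-api-key: $ANTHROPIC_API_KEY" \
  -H "anthropic-version: 2023-06-01" \
  -H "content-type: application/json" \
  -d '{
    "model": "claude-haiku-4-5-20251001",
    "max_tokens": 1024,
    "messages": [{"role": "user", "content": "write a python hello world"}]
  }'
```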
## Environment
- `OLLAMA_HOST` — Ollama base URL (default `http://127.0.0.1:11434`)
- `ANTHROPIC_API_KEY` — required when the llm provider is `anthropic`
The CLI auto-loads a `.env` file from your vault directory and your current
working directory (in that order). No dotenv dependency required.
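A minimal `.env` might look like this (values are placeholders):

```sh
# .env, picked up from the vault dir or the CWD
OLLAMA_HOST=http://127.0.0.1:11434
ANTHROPIC_API_KEY=sk-ant-your-key-here
```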
## Zero npm dependencies
Raw HTTPS to Ollama / Anthropic. No `@anthropic-ai/sdk`, no `axios`, no
`js-yaml`. The `agent.yaml` parser is a single-line `llm:` lookup; full YAML
parsing isn't needed at runtime.
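That lookup is equivalent in spirit to a one-line text match, as in this sketch (not the actual implementation):

```sh
# pull the llm: value out of agent.yaml without a YAML parser
grep -m1 '^llm:' agent.yaml | sed 's/^llm:[[:space:]]*//'
```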
## License
MIT
