# @kognai/sdk

v0.7.0
Programmatic SDK for the Kognai sovereign runtime. Embed agents directly in
your Node app — no shelling out to npx. Same primitives as @kognai/run,
exposed as functions.
```bash
npm install @kognai/sdk
```

```ts
import { runAgent } from '@kognai/sdk';

const result = await runAgent({
  agent: 'coder',
  task: 'write a python hello world',
});

console.log(result.reply);         // "print(\"Hello, World!\")"
console.log(result.llmUsed);       // "ollama/qwen3:4b" (or fallback)
console.log(result.charterSource); // "vault" | "fallback" | "skipped"
console.log(result.durationMs);    // 1234
```

## What you get
- `runAgent(opts)` — high-level: load vault + agent, build system prompt (Charter first), call LLM, optionally append to MEMORY.md, return result.
- `loadVault(dir)` / `loadAgent(dir, name)` / `listAgents(dir)` — pure file-system readers.
- `buildSystemPrompt(boot, agent)` — assemble the system prompt the way `@kognai/run` does it.
- `callLLM(llm, messages)` — provider router with auto-fallback.
- `callOllama` / `callAnthropic` — raw provider calls.
- `appendMemoryTurn(...)` — append a structured turn record to MEMORY.md.
- `stripCoderFormatting(text)` — strip `FILE: <path>` markers + code fences from a coder agent's reply.
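To make the last item concrete, here is a minimal sketch of the behavior described for `stripCoderFormatting`, assuming simple line-based filtering. This is an illustration of the documented behavior, not the package's actual implementation:

```ts
// Sketch only: drop "FILE: <path>" marker lines and Markdown code fences
// from a coder agent's reply, as the SDK's stripCoderFormatting is
// documented to do. Not the package's real code.
function stripCoderFormattingSketch(text: string): string {
  return text
    .split('\n')
    .filter((line) => {
      const t = line.trim();
      return !/^FILE:\s*\S+/.test(t) && !/^```/.test(t);
    })
    .join('\n')
    .trim();
}

const reply = 'FILE: hello.py\n```python\nprint("Hello, World!")\n```';
console.log(stripCoderFormattingSketch(reply)); // print("Hello, World!")
```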
## Options

```ts
interface RunAgentOptions {
  vault?: string;            // default: './.kognai'
  agent: string;             // required — "coder", "agents/coder", etc.
  task: string;              // required — what the user is asking
  llm?: string;              // override agent.yaml's llm
  remember?: boolean;        // append turn to MEMORY.md (default: false)
  stripFormatting?: boolean; // strip FILE:/fences (default: true)
  injectCharter?: boolean;   // inject Five Laws as first prompt block (default: true)
  signal?: AbortSignal;      // for cancellation
}
```

## Auto-fallback
If the agent declares `llm: ollama/qwen3:4b` but Ollama isn't reachable AND
`ANTHROPIC_API_KEY` is set, the call transparently falls back to
`anthropic/claude-haiku-4-5-20251001`. The result tells you what was actually
used:
```ts
const r = await runAgent({ agent: 'coder', task: 'hi' });
if (r.llmUsed.startsWith('anthropic/')) console.log('cloud fallback fired');
```

## Lower-level use
```ts
import {
  loadVault, loadAgent, buildSystemPrompt, callLLM, stripCoderFormatting
} from '@kognai/sdk';

const boot = loadVault('./.kognai');
const agent = loadAgent('./.kognai', 'analyst');
const sys = buildSystemPrompt(boot, agent, { includeCharter: true });

const { reply, llmUsed } = await callLLM(agent.llm, [
  { role: 'system', content: sys },
  { role: 'user', content: 'analyse this dataset...' },
]);

console.log(stripCoderFormatting(reply));
```

## Errors
All errors thrown by the SDK carry a `code` field you can switch on:
```ts
try {
  await runAgent({ agent: 'nonexistent', task: 'hi' });
} catch (e) {
  if ((e as any).code === 'AGENT_NOT_FOUND') { /* show agent list */ }
}
```

Codes: `VAULT_NOT_FOUND`, `AGENT_NOT_FOUND`, `AGENT_PROMPT_MISSING`,
`NO_LLM_KEY`, `OLLAMA_UNREACHABLE`, `OLLAMA_HTTP`, `ANTHROPIC_HTTP`,
`UNKNOWN_PROVIDER`.
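For example, a hypothetical helper that turns these codes into user-facing hints. The codes are from the list above; the messages and the helper itself are illustrative:

```ts
// Hypothetical helper: map SDK error codes to actionable hints.
// The codes come from the docs above; the messages are made up for illustration.
function hintFor(code: string): string {
  switch (code) {
    case 'VAULT_NOT_FOUND':
      return 'Run `npx @kognai/init` to scaffold a vault.';
    case 'AGENT_NOT_FOUND':
      return 'Check the agent name, or scaffold one with `npx @kognai/new <name>`.';
    case 'OLLAMA_UNREACHABLE':
      return 'Start Ollama, or set ANTHROPIC_API_KEY to allow cloud fallback.';
    case 'NO_LLM_KEY':
      return 'No provider available: start Ollama or set ANTHROPIC_API_KEY.';
    default:
      return 'Unexpected SDK error: ' + code;
  }
}

// Simulate catching an SDK error:
const err = { code: 'AGENT_NOT_FOUND', message: 'agent not found' };
console.log(hintFor(err.code));
```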
## Zero dependencies

Raw `fs` + `http`/`https`. No `axios`, no `@anthropic-ai/sdk`, no `js-yaml`,
no `dotenv`. The SDK is ~600 lines of TypeScript total.
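As an illustration of that design choice, a dependency-free JSON POST over raw `node:https` can look like the sketch below. This is what "raw http/https, no axios" generally implies, not the SDK's actual code:

```ts
// Illustrative only: a zero-dependency JSON POST helper using raw node:https,
// the style of provider call the SDK is described as using. Not the SDK's code.
import { request } from 'node:https';

// Build request options separately so they are easy to inspect and test.
function jsonPostOptions(body: string) {
  return {
    method: 'POST',
    headers: {
      'content-type': 'application/json',
      'content-length': Buffer.byteLength(body),
    },
  };
}

function postJson(url: string, body: unknown): Promise<string> {
  const payload = JSON.stringify(body);
  return new Promise((resolve, reject) => {
    const req = request(url, jsonPostOptions(payload), (res) => {
      let data = '';
      res.on('data', (chunk) => (data += chunk));
      res.on('end', () => resolve(data));
    });
    req.on('error', reject);
    req.end(payload);
  });
}

console.log(jsonPostOptions('{"task":"hi"}').headers['content-length']); // 13
```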
## Relationship to other @kognai packages

| Package | Use |
|---|---|
| `@kognai/init` | `npx @kognai/init` — scaffold a vault |
| `@kognai/new` | `npx @kognai/new <name>` — scaffold an agent |
| `@kognai/run` | `npx @kognai/run <agent>` — CLI to run an agent |
| `@kognai/sdk` | `import { runAgent } from '@kognai/sdk'` — programmatic API |
The CLI tools (init, new, run) are for end users on their own machines.
The SDK is for developers embedding Kognai into their own apps.
## License

MIT
