# @arki-moe/agent-ts

v6.0.1
Minimal Agent library, zero dependencies.

## Usage Example
```ts
import { Agent, Role, type Tool } from "@arki-moe/agent-ts";

const getTimeTool: Tool = {
  name: "get_time",
  description: "Get the current time in ISO format",
  parameters: { type: "object", properties: {} },
  execute: (_args, _agent) => new Date().toISOString(),
};

const agent = new Agent("openai", {
  apiKey: "sk-...",
  model: "gpt-5-nano",
  system: "You are a helpful assistant. Reply concisely.",
  onStream: (textDelta) => {
    process.stdout.write(textDelta);
  },
  onToolCall: (message, args, agent) => {
    console.log("tool call:", message);
    console.log("tool args:", args);
    console.log("agent context length:", agent.context.length);
  },
  onToolResult: (msg, agent) => console.log("tool result:", msg, agent.context.length),
});

agent.registerTool(getTimeTool);

// run: executes the tool chain automatically, returns new messages and usage; context is maintained automatically
const result = await agent.run("What time is it?");
console.log(result.messages);
console.log(result.usage, result.usageSum);

// run also accepts multiple user messages or Message objects
await agent.run(["Hello", "What's next?"]);
await agent.run({ role: Role.System, content: "You are brief." });

// context is a public property that can be read directly
console.log(agent.context);
```

## Supported Adapters
| Adapter | Required | Optional |
|---------|----------|----------|
| openai | apiKey (or OPENAI_API_KEY env), model | system, baseUrl |
| openrouter | apiKey (or OPENROUTER_API_KEY env), model | system, baseUrl, httpReferer, title |
| selfhost_chat_completions | apiKey (or SELFHOST_API_KEY env), model | system, baseUrl |
When apiKey is not provided in config, adapters read from the corresponding environment variable. An error is thrown only when both are missing.
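That fallback order can be pictured as the following sketch. `resolveApiKey` is an illustrative helper, not an exported function of the library, and the exact error message is an assumption:

```ts
// Sketch of the documented apiKey fallback; not the library's actual code.
// Order: config.apiKey first, then the adapter's env var, then throw.
function resolveApiKey(configKey: string | undefined, envVar: string): string {
  const key = configKey ?? process.env[envVar];
  if (!key) {
    throw new Error(`No API key: pass apiKey in config or set ${envVar}`);
  }
  return key;
}
```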
## API

- `Agent(adapterName, config)` - Create an Agent
- `agent.context` - Public property, complete conversation history
- `agent.registerTool(tool)` - Register a tool
- `agent.run(message)` - Execute the tool chain automatically, returns `{ messages, usage, usageSum }`; `message` can be `string | string[] | Message | Message[]`
## Config
| Field | Type | Description |
|-------|------|-------------|
| apiKey | string | API key (or use env var) |
| model | string | Model name |
| system | string | Optional system prompt |
| endCondition | (context, last) => boolean | Stop condition for run. Defaults to last.role === Role.Ai |
| onStream | (textDelta: string) => void \| Promise<void> | Stream hook for AI text only. When provided, adapters use SSE streaming and still return the final Message[]. |
| isAbort | () => boolean \| Promise<boolean> | Abort hook polled during run; return true to stop early and return partial results. |
| toolChoice | "auto" \| "required" \| "none" \| object | Tool choice passed to adapters. Defaults to auto when tools are provided. |
| onToolCall | (message, args, agent) => boolean \| void \| Promise<boolean \| void> | Called before each tool execution; return false to skip tool execution and onToolResult |
| onToolResult | (message, agent) => void \| Promise<void> | Called after each tool execution (message.role === Role.ToolResult) |
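As an example, `endCondition` could keep the loop running until the model emits a sentinel token. A minimal sketch, where `Msg` stands in for the library's `Message` type and the `"ai"` role string is an assumption (check the actual `Role` values):

```ts
// Hypothetical custom endCondition: stop only once an AI reply contains "DONE".
// Msg is a stand-in for the library's Message type (an assumption for this sketch).
type Msg = { role: string; content: string };

const endCondition = (_context: Msg[], last: Msg): boolean =>
  last.role === "ai" && last.content.includes("DONE");
```

Passed via `config.endCondition`, this would replace the default check that the last message is an AI message.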
`agent.run` always appends new messages to `agent.context`. Multiple tool calls in a single model response are executed in parallel.

`onToolCall` receives parsed JSON args and can mutate them before execution. Returning `false` skips the tool call and does not emit a ToolResult message.
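A sketch of such a hook. The `name` field on the ToolCall message and the `limit` parameter are assumptions for illustration; check the real `Message` shape:

```ts
// Hypothetical onToolCall hook: blocks one tool outright and clamps an argument.
const onToolCall = (
  message: { name?: string },
  args: Record<string, unknown>,
): boolean | void => {
  if (message.name === "drop_database") return false; // skipped; no ToolResult emitted
  if (typeof args.limit === "number" && args.limit > 100) {
    args.limit = 100; // args are mutated in place before the tool executes
  }
};
```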
`Role.ToolCall` messages support an optional `data` field as a local JSON metadata bag for your own use; it is not sent to adapters.
`isAbort` is polled on the hot path (including streaming). When it returns `true`, `agent.run` stops and returns whatever messages it has so far. A streaming abort returns a partial AI message with `isPartial: true`.
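One way to wire `isAbort` is a small flag factory. `makeAbort` below is a hypothetical helper for this sketch, not part of the library:

```ts
// Hypothetical helper pairing an isAbort hook with a trigger function.
function makeAbort(): { isAbort: () => boolean; abort: () => void } {
  let aborted = false;
  return {
    isAbort: () => aborted,            // pass this as config.isAbort
    abort: () => { aborted = true; },  // call from a timeout or UI handler
  };
}
```

The agent would then be constructed with `isAbort: controls.isAbort`, and calling `controls.abort()` (say, from a hard timeout) ends `run` early with the messages gathered so far.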
## Scripts
| Command | Description |
|---------|--------------|
| pnpm run build | Compile TypeScript to dist/ |
| pnpm run check | Type check (no emit) |
| pnpm test | Run tests (API key read from env var for real API tests) |
