@augment-adk/augment-adk v0.1.12
# Augment ADK
A lightweight TypeScript SDK for building multi-agent workflows over the Responses API via LlamaStack. Inspired by the OpenAI Agents JS SDK, with zero external runtime dependencies.
## Core concepts
- Agents: LLMs configured with instructions, tools, guardrails, and handoffs
- Handoffs: Delegating to other agents via typed agent graphs with validation
- Tools: Function tools, MCP tool integration, and hosted tool factories
- Guardrails: Configurable safety checks for input and output validation
- Human in the loop: Built-in approval store with MCP approval flow
- Sessions: Conversation history management across agent runs
- Tracing: Built-in tracking of agent runs for debugging and optimization
- Streaming: SSE normalization for real-time token streaming
Explore the examples/ directory to see the SDK in action.
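To make the agent-graph and handoff concepts concrete, here is a minimal, self-contained sketch of handoff validation. The shapes below (`AgentDef`, `validateAgentGraph`) are invented for illustration and are not the SDK's actual types — they only show the idea of validating that every declared handoff target exists in the graph:

```ts
// Hypothetical shapes, loosely mirroring the SDK's concepts; not its real types.
interface AgentDef {
  name: string;
  instructions: string;
  handoffs?: string[]; // keys of agents this agent may delegate to
}

type AgentGraph = Record<string, AgentDef>;

// Validate that every declared handoff target exists in the graph,
// so a typo fails fast at startup instead of surfacing mid-run.
function validateAgentGraph(graph: AgentGraph): string[] {
  const errors: string[] = [];
  for (const [key, agent] of Object.entries(graph)) {
    for (const target of agent.handoffs ?? []) {
      if (!(target in graph)) {
        errors.push(`Agent "${key}" hands off to unknown agent "${target}"`);
      }
    }
  }
  return errors;
}

const graph: AgentGraph = {
  router: { name: 'Router', instructions: 'Route requests.', handoffs: ['billing'] },
  billing: { name: 'Billing', instructions: 'Answer billing questions.' },
};

console.log(validateAgentGraph(graph)); // []
```

Validating the graph up front is what makes handoffs "typed": a bad delegation target is a configuration error, not a runtime surprise.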
## Get started

### Supported environments

- Node.js 18 or later
- Any TypeScript runtime (Deno, Bun)
### Installation

```bash
npm install @augment-adk/augment-adk
```

### Run your first agent

```ts
import { run, LlamaStackModel } from '@augment-adk/augment-adk';

const model = new LlamaStackModel({
  clientConfig: { baseUrl: 'http://localhost:8321' },
});

const result = await run('What is the capital of France?', {
  model,
  agents: {
    assistant: {
      name: 'Assistant',
      instructions: 'You are a helpful assistant.',
    },
  },
  defaultAgent: 'assistant',
  config: {
    model: 'meta-llama/Llama-3.1-8B-Instruct',
    baseUrl: 'http://localhost:8321',
    systemPrompt: 'You are a helpful assistant.',
    enableWebSearch: false,
    enableCodeInterpreter: false,
    vectorStoreIds: [],
    vectorStoreName: '',
    embeddingModel: '',
    embeddingDimension: 384,
    chunkingStrategy: 'auto',
    maxChunkSizeTokens: 800,
    chunkOverlapTokens: 400,
    skipTlsVerify: true,
    zdrMode: false,
    verboseStreamLogging: false,
  },
});

console.log(result.content);
```

### Optional: Chat Completions for local development
For local testing with Ollama, vLLM, or other Chat Completions providers, install the optional adapter:
```bash
npm install @augment-adk/adk-chat-completions
```

```ts
import { run } from '@augment-adk/augment-adk';
import { ChatCompletionsModel } from '@augment-adk/adk-chat-completions';

const model = new ChatCompletionsModel({
  clientConfig: {
    baseUrl: 'http://localhost:11434',
    token: process.env.API_KEY,
  },
});
```

See the chat-completions example for a complete walkthrough.
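The adapter's core job is translating between the two wire formats. Below is a hand-rolled sketch of the idea — the request shapes (`ResponsesRequest`, `toChatCompletions`) are invented for illustration and are not the package's real code, which also handles tools, streaming, and more:

```ts
// Invented, simplified shapes for illustration; not the adapter's real types.
interface ResponsesRequest {
  instructions?: string; // system-level guidance in the Responses API style
  input: string;         // the user's message
}

interface ChatMessage { role: 'system' | 'user'; content: string }

// Map a Responses-style request onto the Chat Completions message array:
// instructions become a system message, input becomes a user message.
function toChatCompletions(req: ResponsesRequest): { messages: ChatMessage[] } {
  const messages: ChatMessage[] = [];
  if (req.instructions) messages.push({ role: 'system', content: req.instructions });
  messages.push({ role: 'user', content: req.input });
  return { messages };
}

console.log(toChatCompletions({ instructions: 'Be brief.', input: 'Hi' }));
```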
## Examples
| Example | Description |
|---------|-------------|
| basic | Single-agent question answering |
| chat-completions | Chat Completions backend via optional @augment-adk/adk-chat-completions |
| multi-agent | Router + specialist agent graph with handoffs |
| mcp-tools | Function tools and hosted MCP tool integration |
| human-in-the-loop | Approval workflows for destructive operations |
| backstage-plugin | Integrating ADK into a Backstage backend plugin |
## Packages
The SDK is organized as a monorepo with focused packages:
| Package | Description |
|---------|-------------|
| @augment-adk/augment-adk | Batteries-included entry point (core + LlamaStack) |
| @augment-adk/adk-core | Provider-agnostic core: agents, runner, tools, guardrails, approval, streaming, tracing |
| @augment-adk/adk-llamastack | LlamaStack Responses API model provider |
| @augment-adk/adk-chat-completions | Chat Completions adapter (optional, separate install) |
Most users should install @augment-adk/augment-adk. Advanced consumers can import individual packages for lighter bundles:
```ts
import { run, Agent } from '@augment-adk/adk-core';
import { LlamaStackModel } from '@augment-adk/adk-llamastack';
```

## Architecture
See ARCHITECTURE.md for design decisions, extension points, and how the run loop works. Start there if you want to add a new model provider, integrate with a different framework, or understand how the codebase is structured.
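As a rough mental model of the provider/run-loop split, here is a toy, self-contained sketch. Every name in it (`ModelProvider`, `runLoop`, the message encoding) is invented for illustration — consult ARCHITECTURE.md for the real interfaces:

```ts
// Toy version of a run loop with a pluggable model provider. Invented names.
interface ToolCall { name: string; args: string }
interface ModelOutput { content: string; toolCalls: ToolCall[] }

interface ModelProvider {
  complete(messages: string[]): Promise<ModelOutput>;
}

type ToolFn = (args: string) => Promise<string>;

// Call the model, execute any requested tools, feed results back,
// and repeat until the model returns a plain answer (or turns run out).
async function runLoop(
  provider: ModelProvider,
  tools: Record<string, ToolFn>,
  input: string,
  maxTurns = 5,
): Promise<string> {
  const messages = [input];
  for (let turn = 0; turn < maxTurns; turn++) {
    const out = await provider.complete(messages);
    if (out.toolCalls.length === 0) return out.content; // done: plain answer
    for (const call of out.toolCalls) {
      const result = await tools[call.name](call.args);
      messages.push(`tool:${call.name}:${result}`);
    }
  }
  throw new Error('max turns exceeded');
}

// Fake provider: requests a tool once, then answers using its result.
const fake: ModelProvider = {
  async complete(messages) {
    return messages.some((m) => m.startsWith('tool:'))
      ? { content: 'It is 18°C.', toolCalls: [] }
      : { content: '', toolCalls: [{ name: 'weather', args: 'Paris' }] };
  },
};

runLoop(fake, { weather: async () => '18°C' }, 'Weather in Paris?').then(console.log);
```

The point of the split is that everything above the `ModelProvider` boundary (tools, loop control, history) is backend-agnostic, which is what makes adding a new provider a contained change.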
## Development
```bash
# Install dependencies
pnpm install

# Build all packages (in dependency order)
pnpm -r build

# Run all tests
pnpm -r test

# Type-check all packages
pnpm -r typecheck

# Lint
pnpm -r lint
```

## Acknowledgements
We'd like to acknowledge the excellent work of the open-source community, especially:
- OpenAI Agents JS SDK (architectural inspiration)
- LlamaStack (Responses API backend)
- zod (optional schema validation)
- vitest and tsup
- pnpm
## License
Apache-2.0 — see LICENSE.
