# one-agent-sdk

v0.1.7

Provider-agnostic SDK for building LLM agents — one API for Claude, Codex, Copilot, and Kimi.
Drop-in replacement for `@anthropic-ai/claude-agent-sdk` — same API, multiple providers.
Getting Started · Features · Providers · API Reference · Examples
```ts
import { query, tool, createSdkMcpServer } from "one-agent-sdk";

// Same API as @anthropic-ai/claude-agent-sdk — swap provider with one option
const conversation = query({
  prompt: "What's the weather?",
  options: { provider: "codex" }, // or "openai", "anthropic", "openrouter", ...
});
```

## The Problem
`@anthropic-ai/claude-agent-sdk` has a great API — but it only works with Claude Code. If you want to use Codex or Kimi, you have to learn a completely different SDK.

## The Solution

One Agent SDK is a drop-in replacement for `@anthropic-ai/claude-agent-sdk` that routes to any backend. Same `query()`, `tool()`, `createSdkMcpServer()` — just pass `options.provider` to switch:
```diff
 const conversation = query({
   prompt: "Analyze this code",
-  options: { systemPrompt: "You are helpful." },
+  options: { systemPrompt: "You are helpful.", provider: "codex" },
 });
```

Everything else stays the same: streaming, tools, message format — all of it.
## Supported Providers

### CLI Agent Providers
These wrap CLI agent SDKs — no API keys needed, agents run as local subprocesses using your existing CLI authentication.
| Provider | Package | Agent Backend |
| :------- | :------ | :------------ |
| `claude-code` | `@anthropic-ai/claude-agent-sdk` | Claude Code |
| `codex` | `@openai/codex-sdk` | ChatGPT Codex |
| `copilot` | `@github/copilot-sdk` | GitHub Copilot |
| `kimi-cli` | `@moonshot-ai/kimi-agent-sdk` | Kimi-CLI |
| `gemini-cli` | `@google/gemini-cli-core` | Gemini CLI (planned — pending stable SDK, see #31) |
### API-Key Providers
These call LLM HTTP APIs directly with API keys — no CLI tooling required.
| Provider | Package | API Backend |
| :------- | :------ | :---------- |
| `openai` | `openai` | OpenAI API (GPT-4o, etc.) |
| `anthropic` | `@anthropic-ai/sdk` | Anthropic API (Claude Sonnet, etc.) |
| `openrouter` | `openai` | OpenRouter (any model) |
All providers are optional peer dependencies — install only what you need. You can also register custom providers.
## Getting Started

### Prerequisites
- Node.js v18+ or Bun
- At least one provider: either a CLI agent SDK installed and authenticated, or an API key
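For the API-key route, the underlying `openai` and `@anthropic-ai/sdk` packages read their keys from the standard environment variables by default. A typical setup might look like this (treat the variable names as the upstream SDKs' conventions, not a guarantee of what this SDK consults — check each provider's docs):

```shell
# Typical API-key setup for the direct-API providers.
# OPENAI_API_KEY and ANTHROPIC_API_KEY are the defaults read by the
# openai and @anthropic-ai/sdk packages respectively.
export OPENAI_API_KEY="sk-..."
export ANTHROPIC_API_KEY="sk-ant-..."
```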
### Install
```shell
npm install one-agent-sdk
```

Then install your provider:
```shell
# CLI agent providers (pick one or more)
npm install @anthropic-ai/claude-agent-sdk
npm install @openai/codex-sdk
npm install @github/copilot-sdk
npm install @moonshot-ai/kimi-agent-sdk

# API-key providers (pick one or more)
npm install openai              # for "openai" or "openrouter" provider
npm install @anthropic-ai/sdk   # for "anthropic" provider
```

### Quick Start
```ts
import { z } from "zod";
import { query, tool, createSdkMcpServer } from "one-agent-sdk";

const weatherTool = tool(
  "get_weather",
  "Get the current weather for a city",
  { city: z.string().describe("City name") },
  async ({ city }) => ({
    content: [{ type: "text" as const, text: JSON.stringify({ city, temperature: 72, condition: "sunny" }) }],
  }),
);

const mcpServer = createSdkMcpServer({
  name: "tools",
  version: "1.0.0",
  tools: [weatherTool],
});

const conversation = query({
  prompt: "What's the weather in San Francisco?",
  options: {
    systemPrompt: "You are a helpful assistant. Use the weather tool when asked about weather.",
    mcpServers: { tools: mcpServer },
    allowedTools: ["mcp__tools__get_weather"],
  },
});

for await (const msg of conversation) {
  if (msg.type === "assistant" && msg.message?.content) {
    for (const block of msg.message.content) {
      if ("text" in block && block.text) process.stdout.write(block.text);
    }
  }
}
```

> [!TIP]
> To switch providers, add `provider: "codex"`, `provider: "openai"`, etc. to `options`. Defaults to `"claude-code"`.
## Features

### Multi-Provider Support

Same code, different backend — just change `options.provider`:
```ts
import { query } from "one-agent-sdk";

// Use Claude (default)
const claude = query({ prompt: "Explain this code" });

// Use Codex
const codex = query({ prompt: "Explain this code", options: { provider: "codex" } });

// Use OpenAI API directly
const openai = query({ prompt: "Explain this code", options: { provider: "openai" } });

// Use Anthropic API directly
const anthropic = query({ prompt: "Explain this code", options: { provider: "anthropic" } });

// Use any model via OpenRouter
const openrouter = query({ prompt: "Explain this code", options: { provider: "openrouter", model: "anthropic/claude-sonnet-4" } });
```

The output stream always emits the same `SDKMessage` format, regardless of provider.
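Because the stream shape is provider-independent, downstream code can be written once. A minimal sketch of such a consumer — the simplified message type here is an assumption mirroring the Quick Start loop, not the SDK's real `SDKMessage` type:

```ts
// Collect assistant text from any provider's stream.
// SDKMessageLike is a simplified stand-in, not the SDK's actual type.
type TextBlock = { type: "text"; text: string };
type SDKMessageLike = { type: string; message?: { content?: TextBlock[] } };

async function collectText(stream: AsyncIterable<SDKMessageLike>): Promise<string> {
  let out = "";
  for await (const msg of stream) {
    if (msg.type === "assistant" && msg.message?.content) {
      for (const block of msg.message.content) {
        if (block.type === "text") out += block.text;
      }
    }
  }
  return out;
}

// Mock stream standing in for query(...) — the helper doesn't care which
// provider produced the messages.
async function* mockStream(): AsyncGenerator<SDKMessageLike> {
  yield { type: "assistant", message: { content: [{ type: "text", text: "Hello " }] } };
  yield { type: "assistant", message: { content: [{ type: "text", text: "world" }] } };
}

collectText(mockStream()).then((text) => console.log(text)); // prints "Hello world"
```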
### Custom Providers

Register your own provider backend and use it with `query()`:
```ts
import { query, registerProvider } from "one-agent-sdk";

registerProvider("my-llm", async (config) => ({
  async *run(prompt) {
    yield { type: "text", text: "Hello from my-llm!" };
    yield { type: "done" };
  },
  async *chat(msg) {
    yield { type: "text", text: msg };
    yield { type: "done" };
  },
  async close() {},
}));

const conversation = query({ prompt: "Hi", options: { provider: "my-llm" } });
```

## How It Works
```mermaid
graph LR
    A["query(prompt, options)"] --> B{options.provider}
    B -->|claude-code| C["@anthropic-ai/claude-agent-sdk"]
    B -->|codex| D["@openai/codex-sdk"]
    B -->|copilot| E["@github/copilot-sdk"]
    B -->|kimi-cli| F["@moonshot-ai/kimi-agent-sdk"]
    B -->|openai| G["OpenAI API"]
    B -->|anthropic| H["Anthropic API"]
    B -->|openrouter| I["OpenRouter API"]
    B -->|custom| J[Registered Provider]
    C --> K[SDKMessage Stream]
    D --> K
    E --> K
    F --> K
    G --> K
    H --> K
    I --> K
    J --> K
```

- `claude-code` (default) — delegates directly to the real Anthropic SDK. Full fidelity, zero overhead.
- CLI providers (`codex`, `copilot`, `kimi-cli`) — wrap CLI agent SDKs and adapt their output to the `SDKMessage` format.
- API providers (`openai`, `anthropic`, `openrouter`) — call LLM APIs directly with API keys and manage multi-turn tool loops internally.
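As an illustration of what "multi-turn tool loop" means here — a simplified sketch, not the SDK's actual implementation; every type and name below is invented for the example:

```ts
// Sketch of a tool loop: call the model, execute any requested tools,
// feed results back into the history, repeat until the model stops asking.
type ToolCall = { name: string; args: unknown };
type ModelTurn = { text?: string; toolCalls?: ToolCall[] };

async function toolLoop(
  callModel: (history: string[]) => Promise<ModelTurn>,
  tools: Record<string, (args: unknown) => Promise<string>>,
): Promise<string> {
  const history: string[] = [];
  for (;;) {
    const turn = await callModel(history);
    if (!turn.toolCalls?.length) return turn.text ?? ""; // no tool requests: done
    for (const call of turn.toolCalls) {
      const result = await tools[call.name](call.args);
      history.push(`tool:${call.name} -> ${result}`); // feed tool output back
    }
  }
}

// Mock model: asks for the weather tool once, then answers from the result.
const mockModel = async (history: string[]): Promise<ModelTurn> =>
  history.length === 0
    ? { toolCalls: [{ name: "get_weather", args: { city: "SF" } }] }
    : { text: `It is ${history[0].includes("sunny") ? "sunny" : "unknown"}.` };

toolLoop(mockModel, { get_weather: async () => "sunny, 72F" }).then(console.log); // "It is sunny."
```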
> [!NOTE]
> Provider SDKs are dynamically imported at runtime — unused providers are never loaded.
## API Reference

```ts
import { query, tool, createSdkMcpServer } from "one-agent-sdk";
```

100% API-compatible with `@anthropic-ai/claude-agent-sdk`. All exports are identical — see the Anthropic Agent SDK docs for the full reference.
Added by One Agent SDK:

| Option | Description |
| :----- | :---------- |
| `options.provider` | Route to a different backend: `"claude-code"` (default), `"codex"`, `"copilot"`, `"kimi-cli"`, `"openai"`, `"anthropic"`, `"openrouter"`, or any registered custom provider |

| Helper | Description |
| :----- | :---------- |
| `registerProvider(name, factory)` | Register a custom provider backend (import from `one-agent-sdk`) |
For full API documentation, see the docs site.
## Examples

The `examples/` directory contains runnable demos:
| Example | Description |
| :------ | :---------- |
| `claude.ts` | Claude with tools via `query()` + `tool()` |
| `codex.ts` | Codex backend |
| `copilot.ts` | GitHub Copilot backend |
| `kimi.ts` | Kimi backend |
| `openai-api.ts` | OpenAI API with tools |
| `anthropic-api.ts` | Anthropic API with tools |
| `openrouter.ts` | OpenRouter (any model) |
| `hello.ts` | Minimal example (legacy API) |
| `multi-agent.ts` | Multi-agent handoffs (legacy API) |
```shell
npx tsx examples/hello.ts
```

## Legacy API (Deprecated)

The following functions are still exported from `one-agent-sdk` but will be removed in v0.2. Migrate to the `query()` API instead.
| Function | Replacement |
| :------- | :---------- |
| `run(prompt, config)` | `query({ prompt, options })` |
| `runToCompletion(prompt, config)` | `query({ prompt, options })` + collect results |
| `defineAgent({...})` | Pass agent config directly via `query()` options |
| `defineTool({...})` | `tool(name, description, schema, handler)` |
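For the `runToCompletion` row, "collect results" just means draining the async iterator. A possible wrapper restoring the old one-shot ergonomics — a sketch, with a stand-in message type rather than the SDK's real one:

```ts
// Migration sketch: runToCompletion returned a finished result; with query()
// you iterate the stream yourself. This wrapper collects all messages.
// Msg is a simplified assumption, not the SDK's actual message type.
type Msg = { type: string };

async function toCompletion<T extends Msg>(stream: AsyncIterable<T>): Promise<T[]> {
  const messages: T[] = [];
  for await (const msg of stream) messages.push(msg);
  return messages;
}

// Usage — the generator below stands in for query({ prompt, options }):
async function* fakeQuery(): AsyncGenerator<Msg> {
  yield { type: "assistant" };
  yield { type: "result" };
}

toCompletion(fakeQuery()).then((msgs) => console.log(msgs.length)); // prints 2
```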
## Contributing
Contributions are welcome! Please see the contributing guide for details.
