@onioneko/kiki-llm-anthropic · v0.1.3

Anthropic Claude LLM provider middleware for @onioneko/kiki-core
# @onioneko/kiki-llm-anthropic

Kiki · Middleware framework for LLM agents
```sh
npm install @onioneko/kiki-llm-anthropic
```

Anthropic Claude LLM provider as middleware. It calls the API and pushes the assistant response onto `ctx.messages`, then calls `next()` to allow response-processing middleware downstream.

## Usage
```ts
import { compose } from "@onioneko/kiki-core";
import { createAnthropicMiddleware } from "@onioneko/kiki-llm-anthropic";

const chain = compose([
  createAnthropicMiddleware({ model: "claude-sonnet-4-20250514" }),
]);

const ctx = {
  messages: [{ role: "user", content: "What is 2 + 2?" }],
  system: "Be concise.",
};

await chain(ctx);
// ctx.messages[1] is the assistant response
```

## Configuration
`model` is required. Other fields are optional with sensible defaults.
```ts
createAnthropicMiddleware({
  model: "claude-sonnet-4-20250514", // required
  apiKey: "...",                     // or ANTHROPIC_API_KEY env var
  maxTokens: 16384,                  // default: 16384
  client: anthropic,                 // optional: injected Anthropic client (for testing)
});
```

## Behavior
- **Non-terminal** — calls `next()` after pushing the response, allowing response-processing middleware to be composed after the LLM.
- **Prompt caching** — applies `cache_control` to the system prompt and the penultimate user message.
- **Multi-modal** — handles both `string` and `ContentBlock[]` message content.
- **Config precedence** — explicit config > defaults.
## Exports

- `createAnthropicMiddleware(config)` — factory function returning a `Middleware`
- `AnthropicConfig` — configuration type
- Re-exports from `@onioneko/kiki-types`: `Message`, `ConversationContext`, `ContentBlock`, `TextBlock`, `ImageBlock`
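To illustrate the string vs. `ContentBlock[]` content forms, here is a hedged sketch of what the message shapes might look like. The block definitions below are assumptions modeled on Anthropic's content-block format; the authoritative `TextBlock`/`ImageBlock` definitions live in `@onioneko/kiki-types`:

```typescript
// Assumed shapes for illustration only; see @onioneko/kiki-types for the real ones.
type TextBlock = { type: "text"; text: string };
type ImageBlock = {
  type: "image";
  source: { type: "base64"; media_type: string; data: string };
};
type ContentBlock = TextBlock | ImageBlock;
type Message = { role: "user" | "assistant"; content: string | ContentBlock[] };

// A plain-string message and a block-array (multi-modal) message are both valid.
const textOnly: Message = { role: "user", content: "Describe this image." };
const multiModal: Message = {
  role: "user",
  content: [
    { type: "text", text: "Describe this image." },
    {
      type: "image",
      source: {
        type: "base64",
        media_type: "image/png",
        data: "PLACEHOLDER_BASE64", // real base64 image data goes here
      },
    },
  ],
};
```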
