@auix/prism
v0.0.5
LLM tracing SDK for auix — collect LLM traces with zero friction.
Supports Vercel AI SDK, OpenAI, Anthropic, and manual tracing.
Install
npm install @auix/prism

Peer dependencies (install only what you use):

- ai (>=6.0.0) + @ai-sdk/provider (>=2.0.0) — for AI SDK integration
- openai (>=4.0.0) — for OpenAI integration
- @anthropic-ai/sdk (>=0.30.0) — for Anthropic integration
Quick Start
AI SDK (recommended)
prismAISDK wraps any AI SDK model and automatically captures traces, token usage, latency, and tool calls. It creates a root trace internally and returns { model, end }:
import { AuixPrism, prismAISDK } from "@auix/prism";
import { openai } from "@ai-sdk/openai";
import { generateText } from "ai";
const tracer = new AuixPrism({ apiKey: "your-api-key" });
const traced = prismAISDK(tracer, openai("gpt-4o"), {
name: "my-chat",
tags: ["production"],
endUserId: "[email protected]",
metadata: { threadId: "t_123" },
});
const { text, usage } = await generateText({
model: traced.model,
prompt: "Hello!",
});
// Token usage is captured automatically from spans — just call end():
await traced.end();
// Or pass explicit values to override:
await traced.end({
output: text,
totalTokens: usage.totalTokens,
inputTokens: usage.inputTokens,
outputTokens: usage.outputTokens,
});

Multi-step tool use is handled correctly — each LLM call becomes a child span under the root trace, and token usage is automatically aggregated:
import { streamText, stepCountIs } from "ai";
const traced = prismAISDK(tracer, openai("gpt-4o"), {
name: "agent",
tags: ["chat"],
});
const result = streamText({
model: traced.model,
prompt: "What's the weather?",
tools: { /* ... */ },
stopWhen: stepCountIs(6),
});
for await (const chunk of result.textStream) {
process.stdout.write(chunk);
}
await traced.end();

OpenAI
prismOpenAI returns a proxied client that traces chat.completions.create calls:
import { AuixPrism, prismOpenAI } from "@auix/prism";
import OpenAI from "openai";
const tracer = new AuixPrism({ apiKey: "your-api-key" });
const client = prismOpenAI(tracer, new OpenAI(), {
name: "my-chat",
tags: ["production"],
});
const response = await client.chat.completions.create({
model: "gpt-4o",
messages: [{ role: "user", content: "Hello!" }],
});

Anthropic
prismAnthropic returns a proxied client that traces messages.create calls:
import { AuixPrism, prismAnthropic } from "@auix/prism";
import Anthropic from "@anthropic-ai/sdk";
const tracer = new AuixPrism({ apiKey: "your-api-key" });
const client = prismAnthropic(tracer, new Anthropic(), {
name: "my-chat",
tags: ["production"],
});
const response = await client.messages.create({
model: "claude-sonnet-4-20250514",
max_tokens: 1024,
messages: [{ role: "user", content: "Hello!" }],
});

Manual Tracing
For full control over trace structure:
import { AuixPrism } from "@auix/prism";
const tracer = new AuixPrism({ apiKey: "your-api-key" });
const trace = tracer.startTrace({
name: "rag-pipeline",
metadata: { userId: "u_123" },
});
const span = trace.startSpan({
name: "vector-search",
type: "retrieval",
input: { query: "How do I reset my password?" },
});
span.end({ output: { results: 5 }, status: "completed" });
trace.end({ output: "Here's how to reset...", status: "completed" });
await tracer.destroy();

API
new AuixPrism(config)
| Option | Type | Default | Description |
|--------|------|---------|-------------|
| apiKey | string | required | Your auix API key |
| baseUrl | string | "https://api.auix.dev" | API endpoint |
| sessionId | string | — | Group traces under a session |
| transport | (events) => Promise<void> | — | Custom transport (bypasses HTTP) |
| onFlushError | (error) => void | — | Called when flush fails after retry |
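As an illustration, a custom transport can redirect flushed events anywhere — for example, buffering them in memory during tests instead of sending HTTP requests. This is a minimal sketch: the event payload is treated as opaque here, and `memoryTransport` is a hypothetical name, not part of the SDK:

```typescript
// Minimal sketch of a custom transport: buffer flushed events in memory
// instead of POSTing them to the API. Event objects are treated as opaque.
type PrismEvent = Record<string, unknown>;

const buffered: PrismEvent[] = [];

const memoryTransport = async (events: PrismEvent[]): Promise<void> => {
  buffered.push(...events);
};

// Would be wired up as:
//   new AuixPrism({
//     apiKey: "...",
//     transport: memoryTransport,
//     onFlushError: (err) => console.error("flush failed", err),
//   });

// Simulate one flush cycle:
memoryTransport([{ type: "trace" }, { type: "span" }]).then(() => {
  console.log(buffered.length); // 2
});
```

Because the transport bypasses HTTP entirely, `onFlushError` fires only if your transport function rejects.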
prismAISDK(tracer, model, options?)
Returns { model, end }. The model is a wrapped AI SDK model. Call end(opts?) after the generation completes to finalize the root trace.
| Option | Type | Description |
|--------|------|-------------|
| name | string | Trace name (defaults to model ID) |
| tags | string[] | Tags for filtering |
| metadata | Record<string, unknown> | Arbitrary metadata |
| endUserId | string | End user identifier |
prismOpenAI(tracer, client, options?)
Returns a proxied OpenAI client. Same options as prismAISDK plus parentTraceId for nesting under an existing trace.
prismAnthropic(tracer, client, options?)
Returns a proxied Anthropic client. Same options as prismOpenAI.
tracer.startTrace(options)
Returns a TraceHandle for manual tracing.
trace.startSpan(options)
Creates a child span. type can be "llm", "tool", "retrieval", or "custom".
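The accepted `type` values can be written as a union. The `isSpanType` guard below is a hypothetical helper (not part of @auix/prism), shown only to make the accepted values concrete:

```typescript
// The span `type` values accepted by trace.startSpan, per the list above.
type SpanType = "llm" | "tool" | "retrieval" | "custom";

const SPAN_TYPES: readonly SpanType[] = ["llm", "tool", "retrieval", "custom"];

// Hypothetical runtime guard, useful when the span type comes from
// untyped input such as config or network data:
function isSpanType(value: string): value is SpanType {
  return (SPAN_TYPES as readonly string[]).includes(value);
}

console.assert(isSpanType("retrieval")); // true
console.assert(!isSpanType("http")); // "http" is not a valid span type
```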
tracer.destroy()
Flushes pending events and stops the background timer. Call this when done tracing.
License
Proprietary — see LICENSE.
