
@auix/prism

v0.0.5

LLM tracing SDK for auix — collect LLM traces with zero friction.

Supports Vercel AI SDK, OpenAI, Anthropic, and manual tracing.

Install

npm install @auix/prism

Peer dependencies (install only what you use):

  • ai (>=6.0.0) + @ai-sdk/provider (>=2.0.0) — for AI SDK integration
  • openai (>=4.0.0) — for OpenAI integration
  • @anthropic-ai/sdk (>=0.30.0) — for Anthropic integration
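For the AI SDK path, the package and its peers can be installed in one command (the `@ai-sdk/openai` provider package is shown as one example; substitute whichever provider you use):

```shell
# Install the SDK plus the peers for the AI SDK integration.
npm install @auix/prism ai @ai-sdk/provider @ai-sdk/openai
```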

Quick Start

AI SDK (recommended)

prismAISDK wraps any AI SDK model and automatically captures traces, token usage, latency, and tool calls. It creates a root trace internally and returns { model, end }:

import { AuixPrism, prismAISDK } from "@auix/prism";
import { openai } from "@ai-sdk/openai";
import { generateText } from "ai";

const tracer = new AuixPrism({ apiKey: "your-api-key" });

const traced = prismAISDK(tracer, openai("gpt-4o"), {
  name: "my-chat",
  tags: ["production"],
  endUserId: "[email protected]",
  metadata: { threadId: "t_123" },
});

const { text, usage } = await generateText({
  model: traced.model,
  prompt: "Hello!",
});

// Token usage is captured automatically from spans — just call end():
await traced.end();

// Or pass explicit values to override:
await traced.end({
  output: text,
  totalTokens: usage.totalTokens,
  inputTokens: usage.inputTokens,
  outputTokens: usage.outputTokens,
});

Multi-step tool use is handled correctly — each LLM call becomes a child span under the root trace, and token usage is automatically aggregated:

import { streamText, stepCountIs } from "ai";

const traced = prismAISDK(tracer, openai("gpt-4o"), {
  name: "agent",
  tags: ["chat"],
});

const result = streamText({
  model: traced.model,
  prompt: "What's the weather?",
  tools: { /* ... */ },
  stopWhen: stepCountIs(6),
});

for await (const chunk of result.textStream) {
  process.stdout.write(chunk);
}

await traced.end();
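The aggregation described above can be sketched as a pure function. This is illustrative only; `Usage` and `aggregateUsage` are hypothetical names, not SDK exports — the SDK performs this internally:

```typescript
// Hypothetical shape for one LLM step's token usage (not an SDK type).
interface Usage {
  inputTokens: number;
  outputTokens: number;
}

// Sum each child span's usage into root-trace totals, mirroring how
// prismAISDK aggregates token usage across multi-step tool-use runs.
function aggregateUsage(spans: Usage[]): Usage & { totalTokens: number } {
  const inputTokens = spans.reduce((sum, s) => sum + s.inputTokens, 0);
  const outputTokens = spans.reduce((sum, s) => sum + s.outputTokens, 0);
  return { inputTokens, outputTokens, totalTokens: inputTokens + outputTokens };
}
```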

OpenAI

prismOpenAI returns a proxied client that traces chat.completions.create calls:

import { AuixPrism, prismOpenAI } from "@auix/prism";
import OpenAI from "openai";

const tracer = new AuixPrism({ apiKey: "your-api-key" });
const client = prismOpenAI(tracer, new OpenAI(), {
  name: "my-chat",
  tags: ["production"],
});

const response = await client.chat.completions.create({
  model: "gpt-4o",
  messages: [{ role: "user", content: "Hello!" }],
});

Anthropic

prismAnthropic returns a proxied client that traces messages.create calls:

import { AuixPrism, prismAnthropic } from "@auix/prism";
import Anthropic from "@anthropic-ai/sdk";

const tracer = new AuixPrism({ apiKey: "your-api-key" });
const client = prismAnthropic(tracer, new Anthropic(), {
  name: "my-chat",
  tags: ["production"],
});

const response = await client.messages.create({
  model: "claude-sonnet-4-20250514",
  max_tokens: 1024,
  messages: [{ role: "user", content: "Hello!" }],
});

Manual Tracing

For full control over trace structure:

import { AuixPrism } from "@auix/prism";

const tracer = new AuixPrism({ apiKey: "your-api-key" });

const trace = tracer.startTrace({
  name: "rag-pipeline",
  metadata: { userId: "u_123" },
});

const span = trace.startSpan({
  name: "vector-search",
  type: "retrieval",
  input: { query: "How do I reset my password?" },
});

span.end({ output: { results: 5 }, status: "completed" });

trace.end({ output: "Here's how to reset...", status: "completed" });

await tracer.destroy();

API

new AuixPrism(config)

| Option | Type | Default | Description |
|--------|------|---------|-------------|
| apiKey | string | required | Your auix API key |
| baseUrl | string | "https://api.auix.dev" | API endpoint |
| sessionId | string | — | Group traces under a session |
| transport | (events) => Promise&lt;void&gt; | — | Custom transport (bypasses HTTP) |
| onFlushError | (error) => void | — | Called when flush fails after retry |
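As a configuration sketch, the `transport` and `onFlushError` options can be combined to route events to your own collector instead of the default HTTP endpoint. The collector URL below is a made-up placeholder:

```typescript
import { AuixPrism } from "@auix/prism";

// Route trace events to a self-hosted collector instead of api.auix.dev.
// The transport receives a batch of events and must return a Promise.
const tracer = new AuixPrism({
  apiKey: process.env.AUIX_API_KEY!,
  transport: async (events) => {
    await fetch("https://collector.example.com/traces", { // hypothetical endpoint
      method: "POST",
      headers: { "content-type": "application/json" },
      body: JSON.stringify(events),
    });
  },
  onFlushError: (error) => {
    console.error("auix flush failed after retry:", error);
  },
});
```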

prismAISDK(tracer, model, options?)

Returns { model, end }. The model is a wrapped AI SDK model. Call end(opts?) after the generation completes to finalize the root trace.

| Option | Type | Description |
|--------|------|-------------|
| name | string | Trace name (defaults to model ID) |
| tags | string[] | Tags for filtering |
| metadata | Record&lt;string, unknown&gt; | Arbitrary metadata |
| endUserId | string | End user identifier |

prismOpenAI(tracer, client, options?)

Returns a proxied OpenAI client. Same options as prismAISDK plus parentTraceId for nesting under an existing trace.
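`parentTraceId` is the only option without an example above. A minimal sketch, assuming you already hold the parent trace's id (the id value here is a placeholder; how you obtain and persist it is up to you):

```typescript
import { AuixPrism, prismOpenAI } from "@auix/prism";
import OpenAI from "openai";

const tracer = new AuixPrism({ apiKey: "your-api-key" });

// Nest this client's completions under an existing trace.
const parentTraceId = "trace_abc123"; // e.g. persisted from an earlier request

const client = prismOpenAI(tracer, new OpenAI(), {
  name: "summarize-step",
  parentTraceId,
});
```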

prismAnthropic(tracer, client, options?)

Returns a proxied Anthropic client. Same options as prismOpenAI.

tracer.startTrace(options)

Returns a TraceHandle for manual tracing.

trace.startSpan(options)

Creates a child span. type can be "llm", "tool", "retrieval", or "custom".

tracer.destroy()

Flushes pending events and stops the background timer. Call this when done tracing.
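The buffer-and-timer pattern behind destroy() can be sketched in isolation. This is illustrative only; `EventBuffer` is not an SDK class, just a demonstration of the documented contract (final flush, then stop the timer):

```typescript
// Minimal sketch of buffered delivery with a periodic flush timer.
class EventBuffer<T> {
  private buffer: T[] = [];
  private timer: ReturnType<typeof setInterval>;

  constructor(private send: (batch: T[]) => void, intervalMs = 1000) {
    this.timer = setInterval(() => this.flush(), intervalMs);
  }

  push(event: T): void {
    this.buffer.push(event);
  }

  flush(): void {
    if (this.buffer.length === 0) return;
    this.send(this.buffer.splice(0)); // drain and deliver as one batch
  }

  destroy(): void {
    clearInterval(this.timer); // stop the background timer
    this.flush();              // final flush so no events are lost
  }
}
```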

License

Proprietary — see LICENSE.