# @tozil/sdk

Know what every AI user costs you. Track AI costs per user, per model, per endpoint — with 2 lines of code.
## Install

```bash
npm install @tozil/sdk
```

## Quick Start

```ts
import tozil from "@tozil/sdk";

tozil.init(); // reads TOZIL_API_KEY from env
```

That's it. Tozil automatically patches the Anthropic and OpenAI SDKs to track every API call — tokens, latency, model, and cost.
## Setup

- Sign up at tozil.dev and grab your API key
- Set your environment variable:

  ```bash
  TOZIL_API_KEY=tz_your_key_here
  ```

- Call `tozil.init()` before creating any Anthropic/OpenAI clients:

```ts
import tozil from "@tozil/sdk";
import Anthropic from "@anthropic-ai/sdk";

tozil.init();

const client = new Anthropic();

const response = await client.messages.create({
  model: "claude-sonnet-4-20250514",
  max_tokens: 1024,
  messages: [{ role: "user", content: "Hello" }],
});
// ^ This call is automatically tracked
```

## Track Users & Endpoints
Use the Express middleware to automatically tag requests with user and endpoint info:

```ts
import tozil from "@tozil/sdk";
import { tozilMiddleware } from "@tozil/sdk/express";
import express from "express";

tozil.init();

const app = express();
app.use(tozilMiddleware({
  getUserId: (req) => req.headers["x-user-id"] as string,
}));
```

Or set context manually anywhere in your code:

```ts
import { withContext } from "@tozil/sdk";

await withContext({ userId: "user_123", endpoint: "/chat" }, async () => {
  // All AI calls inside here are tagged with this user & endpoint
  await client.messages.create({ ... });
});
```

## Options
```ts
tozil.init({
  apiKey: "tz_...",       // default: process.env.TOZIL_API_KEY
  baseUrl: "https://...", // default: https://app.tozil.dev/api/v1
  flushInterval: 5000,    // ms between flushes (default: 5000)
  maxBatchSize: 100,      // events per batch (default: 100)
  debug: false,           // log tracking events to console
});
```

## How It Works
- Monkey-patches Anthropic and OpenAI SDK prototypes at init time
- Uses Node.js AsyncLocalStorage for zero-config context propagation
- Buffers events in memory, flushes every 5s (or at batch size)
- Silent failure — never throws, never blocks your critical path
- Zero runtime dependencies
## Requirements

- Node.js >= 18
- `@anthropic-ai/sdk` and/or `openai` as peer dependencies (install whichever you use)
## License

MIT
