# @tozil/core
Core engine for Tozil — know what every AI user costs you.
Handles event buffering, context propagation, and the plugin system. Provider-specific instrumentation is in separate packages.
## Install

```shell
npm install @tozil/core
```

## Usage
```ts
import { init } from "@tozil/core";
import { anthropic } from "@tozil/anthropic";
import { openai } from "@tozil/openai";

init({
  apiKey: "tz_...", // or set TOZIL_API_KEY env var
  instrumentations: [anthropic(), openai()],
});
```

Every AI call in your app is now tracked. No code changes are needed beyond this.
## What it does

- Buffers usage events in memory, flushes every 5 seconds
- Propagates user/endpoint context via `AsyncLocalStorage`
- Provides a plugin interface (`Instrumentation`) for provider packages
- Never throws — if tracking fails, your app keeps running
- Sends only metadata (model, tokens, latency) — no prompts or completions
## API

### init(options)
| Option | Type | Default | Description |
|--------|------|---------|-------------|
| `apiKey` | `string` | `process.env.TOZIL_API_KEY` | Your Tozil API key |
| `baseUrl` | `string` | `https://app.tozil.dev/api/v1` | API endpoint |
| `instrumentations` | `Instrumentation[]` | `[]` | Provider plugins to activate |
| `flushInterval` | `number` | `5000` | Flush interval in ms |
| `maxBatchSize` | `number` | `100` | Max events per flush |
| `debug` | `boolean` | `false` | Log tracking events to the console |
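For example, tightening the flush cadence and turning on debug logging (a sketch using the options above; the specific values are arbitrary):

```ts
import { init } from "@tozil/core";

init({
  apiKey: process.env.TOZIL_API_KEY,
  flushInterval: 2000, // flush every 2s instead of the 5s default
  maxBatchSize: 50,    // send at most 50 events per flush
  debug: true,         // log each tracked event to the console
});
```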
### withContext(ctx, fn)
Run a function with user/endpoint context:
```ts
import { withContext } from "@tozil/core";

withContext({ userId: "user_123", endpoint: "/api/chat" }, async () => {
  // AI calls in here are tagged with this user and endpoint
  await anthropic.messages.create({ ... });
});
```

### shutdown()
Flush remaining events and clean up. Call on process exit.
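For example, in a Node process (a sketch; it assumes `shutdown()` returns a promise that resolves once the final flush completes, which the docs above do not state explicitly):

```ts
import { shutdown } from "@tozil/core";

process.on("SIGTERM", async () => {
  await shutdown(); // flush any buffered events before the process exits
  process.exit(0);
});
```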
## Provider packages

| Package | Provider |
|---------|----------|
| `@tozil/anthropic` | Anthropic (Claude) |
| `@tozil/openai` | OpenAI (GPT, o1) |
| `@tozil/google-ai` | Google Generative AI (Gemini) |
| `@tozil/vercel-ai` | Vercel AI SDK |
| `@tozil/langchain` | LangChain / LangGraph |
## License
MIT
