
@haro/aitracer

TypeScript SDK for AITracer - AI/LLM monitoring and observability platform.

Installation

npm install @haro/aitracer
# or
yarn add @haro/aitracer
# or
pnpm add @haro/aitracer

Quick Start

import { AITracer, wrapOpenAI } from "@haro/aitracer";
import OpenAI from "openai";

// Initialize AITracer
const tracer = new AITracer({
  apiKey: process.env.AITRACER_API_KEY,
  projectId: "your-project-id", // optional
});

// Wrap your OpenAI client
const openai = wrapOpenAI(new OpenAI(), tracer);

// All API calls are now automatically logged
const response = await openai.chat.completions.create({
  model: "gpt-4o",
  messages: [{ role: "user", content: "Hello!" }],
});
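
The wrapped client returns the provider's usual response objects, so the rest of your code does not need to change. For example, reading the reply from the call above (standard OpenAI response shape):

console.log(response.choices[0]?.message?.content);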

Supported Providers

OpenAI

import { AITracer, wrapOpenAI } from "@haro/aitracer";
import OpenAI from "openai";

const tracer = new AITracer({ apiKey: "your-aitracer-key" });
const openai = wrapOpenAI(new OpenAI(), tracer);

// Streaming is also supported
const stream = await openai.chat.completions.create({
  model: "gpt-4o",
  messages: [{ role: "user", content: "Hello!" }],
  stream: true,
});

for await (const chunk of stream) {
  process.stdout.write(chunk.choices[0]?.delta?.content || "");
}

Anthropic

import { AITracer, wrapAnthropic } from "@haro/aitracer";
import Anthropic from "@anthropic-ai/sdk";

const tracer = new AITracer({ apiKey: "your-aitracer-key" });
const anthropic = wrapAnthropic(new Anthropic(), tracer);

const response = await anthropic.messages.create({
  model: "claude-3-5-sonnet-20241022",
  max_tokens: 1024,
  messages: [{ role: "user", content: "Hello!" }],
});

Google Gemini

import { AITracer, wrapGemini } from "@haro/aitracer";
import { GoogleGenerativeAI } from "@google/generative-ai";

const tracer = new AITracer({ apiKey: "your-aitracer-key" });
const genAI = new GoogleGenerativeAI(process.env.GEMINI_API_KEY);
const model = wrapGemini(
  genAI.getGenerativeModel({ model: "gemini-1.5-flash" }),
  tracer
);

const result = await model.generateContent("Hello!");
console.log(result.response.text());

Manual Logging

You can also log requests manually without using wrappers:

import { AITracer } from "@haro/aitracer";

const tracer = new AITracer({ apiKey: "your-aitracer-key" });

// Log a single request
await tracer.log({
  model: "gpt-4o",
  provider: "openai",
  inputData: { messages: [{ role: "user", content: "Hello!" }] },
  outputData: { content: "Hi there!" },
  inputTokens: 10,
  outputTokens: 5,
  latencyMs: 500,
  status: "success",
});

// Log multiple requests in a batch
await tracer.logBatch([
  {
    model: "gpt-4o",
    provider: "openai",
    inputData: { messages: [{ role: "user", content: "Hello!" }] },
    outputData: { content: "Hi!" },
    inputTokens: 10,
    outputTokens: 3,
    latencyMs: 400,
  },
  {
    model: "claude-3-5-sonnet",
    provider: "anthropic",
    inputData: { messages: [{ role: "user", content: "Hi!" }] },
    outputData: { content: "Hello!" },
    inputTokens: 8,
    outputTokens: 4,
    latencyMs: 350,
  },
]);
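
Manual logging is also handy for providers that have no built-in wrapper: time the call yourself and pass the measurements to tracer.log. A minimal sketch, in which callCustomModel stands in for your own client, and the free-form provider string and prompt-shaped inputData are assumptions about what the API accepts:

import { AITracer } from "@haro/aitracer";

const tracer = new AITracer({ apiKey: "your-aitracer-key" });

// Placeholder for a client this SDK has no wrapper for
async function callCustomModel(prompt: string): Promise<string> {
  return `echo: ${prompt}`;
}

async function tracedCompletion(prompt: string) {
  const start = Date.now();
  const reply = await callCustomModel(prompt);

  // Record the call with the latency we measured ourselves
  await tracer.log({
    model: "my-custom-model",
    provider: "custom", // assumed: free-form provider names are accepted
    inputData: { prompt },
    outputData: { content: reply },
    latencyMs: Date.now() - start,
    status: "success",
  });

  return reply;
}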

Configuration

const tracer = new AITracer({
  // Required
  apiKey: "your-aitracer-key",

  // Optional
  projectId: "your-project-id", // Default project for all logs
  baseUrl: "https://api.aitracer.io", // Custom API URL
  debug: false, // Enable debug logging
  timeout: 30000, // Request timeout in ms
  asyncMode: true, // Use async queue (non-blocking)
  flushInterval: 5000, // Queue flush interval in ms
  maxQueueSize: 100, // Max queue size before auto-flush
});

Async Mode

By default, logs are queued and sent asynchronously to avoid blocking your application:

const tracer = new AITracer({
  apiKey: "your-aitracer-key",
  asyncMode: true, // default
  flushInterval: 5000, // flush every 5 seconds
  maxQueueSize: 100, // or when queue reaches 100 logs
});

// Logs are queued (non-blocking)
tracer.log({ ... });

// Manually flush the queue
await tracer.flush();

// Shutdown and flush remaining logs
await tracer.shutdown();

For synchronous logging, set asyncMode: false:

const tracer = new AITracer({
  apiKey: "your-aitracer-key",
  asyncMode: false,
});

// Logs are sent immediately (blocking)
await tracer.log({ ... });
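
In a long-running service you will usually want to flush the queue before the process exits. A minimal sketch using Node's standard signal handlers (the signal wiring is your application's responsibility, not something the SDK installs for you):

// Flush remaining queued logs on shutdown signals
for (const signal of ["SIGINT", "SIGTERM"] as const) {
  process.on(signal, async () => {
    await tracer.shutdown();
    process.exit(0);
  });
}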

Tracing

Add trace context to correlate related requests:

const traceId = crypto.randomUUID();

await tracer.log({
  model: "gpt-4o",
  provider: "openai",
  inputData: { messages: [{ role: "user", content: "Hello!" }] },
  outputData: { content: "Hi!" },
  traceId,
  spanId: crypto.randomUUID(),
  sessionId: "user-session-123",
  userId: "user-456",
});
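
For a multi-step workflow, reuse a single traceId across the calls and give each step its own spanId so the steps appear as one correlated trace. A short sketch using the same fields:

const traceId = crypto.randomUUID();

// Step 1: initial completion
await tracer.log({
  model: "gpt-4o",
  provider: "openai",
  inputData: { messages: [{ role: "user", content: "Summarize this document." }] },
  outputData: { content: "Summary..." },
  traceId,
  spanId: crypto.randomUUID(),
});

// Step 2: follow-up call correlated with the same trace
await tracer.log({
  model: "gpt-4o",
  provider: "openai",
  inputData: { messages: [{ role: "user", content: "Translate the summary." }] },
  outputData: { content: "Resumen..." },
  traceId,
  spanId: crypto.randomUUID(),
});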

Custom Tags and Metadata

Add custom tags and metadata for filtering and analysis:

await tracer.log({
  model: "gpt-4o",
  provider: "openai",
  inputData: { messages: [{ role: "user", content: "Hello!" }] },
  outputData: { content: "Hi!" },
  tags: {
    environment: "production",
    feature: "chatbot",
  },
  metadata: {
    customerId: "cust-123",
    requestSource: "mobile-app",
  },
});
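
If you want the same tags attached to every log, one application-side pattern is a thin wrapper around tracer.log that merges in a set of defaults. This is not an SDK feature, and it assumes the exported LogData type shown under TypeScript Support below includes the tags field:

import type { LogData } from "@haro/aitracer";

const defaultTags = { environment: "production", service: "chat-api" };

// Merge default tags into every log entry (per-call tags win on conflicts)
function logWithDefaultTags(data: LogData) {
  return tracer.log({ ...data, tags: { ...defaultTags, ...data.tags } });
}

await logWithDefaultTags({
  model: "gpt-4o",
  provider: "openai",
  inputData: { messages: [{ role: "user", content: "Hello!" }] },
  outputData: { content: "Hi!" },
  tags: { feature: "chatbot" },
});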

Error Handling

import { AITracer, AITracerError } from "@haro/aitracer";

try {
  await tracer.log({ ... });
} catch (error) {
  if (error instanceof AITracerError) {
    console.error(`AITracer error: ${error.message}`);
    console.error(`Status code: ${error.statusCode}`);
  }
}
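
In request-handling code you generally want a failed log call to degrade gracefully rather than break the request. A small helper that swallows tracer errors (an application-side pattern, assuming the LogData type shown below):

import { AITracerError, type LogData } from "@haro/aitracer";

// Log without letting tracing failures propagate to callers
async function safeLog(data: LogData) {
  try {
    await tracer.log(data);
  } catch (error) {
    if (error instanceof AITracerError) {
      console.warn(`AITracer logging failed (${error.statusCode}): ${error.message}`);
    } else {
      console.warn("AITracer logging failed:", error);
    }
  }
}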

TypeScript Support

This SDK is written in TypeScript and includes full type definitions:

import type { LogData, LogResponse, AITracerConfig } from "@haro/aitracer";

const config: AITracerConfig = {
  apiKey: "your-key",
  debug: true,
};

const logData: LogData = {
  model: "gpt-4o",
  provider: "openai",
  inputData: { messages: [] },
};
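
The same types let you assemble a batch with compile-time checking before sending it (tracer created as in Quick Start):

const batch: LogData[] = [
  {
    model: "gpt-4o",
    provider: "openai",
    inputData: { messages: [{ role: "user", content: "Hello!" }] },
    outputData: { content: "Hi!" },
  },
];

await tracer.logBatch(batch);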

Requirements

  • Node.js 18.0.0 or later
  • TypeScript 5.0 or later (for TypeScript users)

License

MIT