
observ-sdk

v0.1.6

Published

Infrastructure for cheaper, faster and more reliable LLM calls.

Readme

Observ SDK

AI tracing and semantic caching SDK for Observ.

Installation

npm install observ-sdk

Install provider-specific SDKs as needed:

# For Anthropic
npm install @anthropic-ai/sdk

# For OpenAI
npm install openai

# For Mistral
npm install @mistralai/mistralai

# For Vercel AI SDK (recommended for multi-provider support)
npm install ai

Quick Start

Anthropic

import Anthropic from "@anthropic-ai/sdk";
import Observ from "observ-sdk";

const ob = new Observ({
  apiKey: "your-observ-api-key",
  recall: true, // Enable semantic caching
});

const client = new Anthropic({ apiKey: "your-anthropic-key" });
const wrappedClient = ob.anthropic(client);

// Use normally - all calls are automatically traced
const response = await wrappedClient.messages.create({
  model: "claude-sonnet-4-20250514",
  max_tokens: 1024,
  messages: [{ role: "user", content: "Hello!" }],
});

OpenAI

import OpenAI from "openai";
import Observ from "observ-sdk";

const ob = new Observ({
  apiKey: "your-observ-api-key",
  recall: true,
});

const client = new OpenAI({ apiKey: "your-openai-key" });
const wrappedClient = ob.openai(client);

const response = await wrappedClient.chat.completions.create({
  model: "gpt-4",
  messages: [{ role: "user", content: "Hello!" }],
});

Mistral

import { Mistral } from "@mistralai/mistralai";
import Observ from "observ-sdk";

const ob = new Observ({
  apiKey: "your-observ-api-key",
  recall: true,
});

const client = new Mistral({ apiKey: "your-mistral-key" });
const wrappedClient = ob.mistral(client);

const response = await wrappedClient.chat.completions.create({
  model: "mistral-large-latest",
  messages: [{ role: "user", content: "Hello!" }],
});

Vercel AI SDK (Recommended)

The Vercel AI SDK integration provides the most flexible way to use Observ with 25+ AI providers through a unified API.

npm install ai @ai-sdk/openai @ai-sdk/anthropic

import { Observ } from "observ-sdk";
import { openai } from "@ai-sdk/openai";
import { anthropic } from "@ai-sdk/anthropic";
import { generateText, streamText } from "ai";

const observ = new Observ({
  apiKey: "your-observ-api-key",
  recall: true, // Enable semantic caching
});

// Wrap any Vercel AI SDK model
const model = observ.wrap(openai("gpt-4"));

// Use with generateText
const result = await generateText({
  model,
  prompt: "What is TypeScript?",
});

// Streaming works automatically
const stream = await streamText({
  model,
  prompt: "Write a haiku about coding",
});

for await (const chunk of stream.textStream) {
  process.stdout.write(chunk);
}

// Add metadata for better observability
const result2 = await generateText({
  model,
  prompt: "Explain async/await",
  providerOptions: {
    observ: {
      metadata: { user_id: "123", topic: "javascript" },
      sessionId: "session-abc",
    },
  },
});

Benefits of Vercel AI SDK integration:

  • ✅ Works with 25+ providers (OpenAI, Anthropic, Google, Mistral, Cohere, etc.)
  • ✅ Supports streaming, structured outputs, and tool calling
  • ✅ Semantic caching works across all providers
  • ✅ Unified API - switch providers without code changes
  • ✅ Built-in type safety

Configuration

const ob = new Observ({
  apiKey: "your-observ-api-key", // Required
  recall: true, // Enable semantic caching (default: false)
  environment: "production", // Environment tag (default: "production")
  endpoint: "https://api.example.com", // Custom endpoint (optional)
  debug: false, // Enable debug logging (default: false)
});
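Hard-coding the API key works for demos, but in practice you may prefer to read configuration from the environment. A minimal sketch, where the variable names (`OBSERV_API_KEY`, `OBSERV_RECALL`, `OBSERV_ENV`, `OBSERV_DEBUG`) are our own convention, not something the SDK defines:

```typescript
// Hypothetical helper: assemble Observ constructor options from environment
// variables so secrets stay out of source control. Variable names are our
// own convention, not part of the SDK.
interface ObservOptions {
  apiKey: string;
  recall: boolean;
  environment: string;
  debug: boolean;
}

function observOptionsFromEnv(
  env: Record<string, string | undefined>
): ObservOptions {
  const apiKey = env.OBSERV_API_KEY;
  if (!apiKey) {
    throw new Error("OBSERV_API_KEY is not set");
  }
  return {
    apiKey,
    recall: env.OBSERV_RECALL === "true",        // default: false
    environment: env.OBSERV_ENV ?? "production", // default: "production"
    debug: env.OBSERV_DEBUG === "true",          // default: false
  };
}

// Usage: const ob = new Observ(observOptionsFromEnv(process.env));
```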

Features

  • Automatic Tracing: All LLM calls are automatically traced
  • Semantic Caching: Cache similar prompts to reduce costs and latency
  • Multi-Provider: Support for Anthropic, OpenAI, Mistral, xAI, OpenRouter, and 25+ providers via Vercel AI SDK
  • Vercel AI SDK Integration: Unified API for all major LLM providers with full streaming and tool calling support
  • Session Tracking: Group related calls with session IDs
  • Metadata: Attach custom metadata to traces
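To make the semantic-caching bullet concrete, here is a toy illustration of the idea (not the SDK's actual implementation): embed each prompt as a vector and serve a cached response when a new prompt's embedding is close enough to a previous one. The embedding function and similarity threshold are stand-ins.

```typescript
// Toy illustration of semantic caching, NOT the SDK's implementation:
// each cache entry keeps an embedding of its prompt; a new prompt reuses
// a cached response when cosine similarity exceeds a threshold.
type Embed = (text: string) => number[];

interface CacheEntry {
  embedding: number[];
  response: string;
}

function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

class SemanticCache {
  private entries: CacheEntry[] = [];
  constructor(private embed: Embed, private threshold = 0.9) {}

  // Return a cached response for a sufficiently similar prompt, if any.
  get(prompt: string): string | undefined {
    const v = this.embed(prompt);
    for (const e of this.entries) {
      if (cosine(v, e.embedding) >= this.threshold) return e.response;
    }
    return undefined;
  }

  put(prompt: string, response: string): void {
    this.entries.push({ embedding: this.embed(prompt), response });
  }
}
```

With `recall: true`, the SDK applies this idea transparently: a cache hit skips the provider call entirely, which is where the cost and latency savings come from.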

Metadata & Sessions

// Add metadata to a request
const response = await wrappedClient.messages
  .withMetadata({ user_id: "123", feature: "chat" })
  .create({
    model: "claude-sonnet-4-20250514",
    max_tokens: 1024,
    messages: [{ role: "user", content: "Hello!" }],
  });

// Track conversation sessions
const sessionResponse = await wrappedClient.messages
  .withSessionId("conversation-abc")
  .create({
    model: "claude-sonnet-4-20250514",
    max_tokens: 1024,
    messages: [{ role: "user", content: "Hello!" }],
  });
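One way to produce stable session IDs is to derive them from user and conversation identifiers, so every call in a thread lands in the same trace group. A hedged sketch; the hashing scheme and `session-` prefix are our own convention, not an SDK requirement:

```typescript
import { createHash } from "node:crypto";

// Hypothetical convention for stable session IDs: hash the user and
// conversation identifiers so the same thread always maps to the same ID,
// and raw identifiers never appear in trace metadata.
function sessionIdFor(userId: string, conversationId: string): string {
  const digest = createHash("sha256")
    .update(`${userId}:${conversationId}`)
    .digest("hex");
  return `session-${digest.slice(0, 16)}`;
}

// Usage with a wrapped client:
// await wrappedClient.messages
//   .withSessionId(sessionIdFor("user-123", "thread-42"))
//   .create({ ... });
```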

License

MIT