
brokle-anthropic v0.3.0

Anthropic SDK wrapper with automatic Brokle tracing
Brokle Anthropic Wrapper

Automatic tracing for Anthropic (Claude) SDK with zero code changes. Drop-in replacement that adds comprehensive observability to your Anthropic API calls.

Features

  • Zero Code Changes: Wrap your Anthropic client once, trace all calls automatically
  • Full GenAI Compliance: OTEL GenAI 1.28+ semantic conventions
  • Complete Coverage: Messages API (Claude)
  • Automatic Attribute Extraction: Model, tokens, latency, messages, stop reason
  • Proxy Pattern: Non-invasive, preserves all Anthropic functionality
  • TypeScript Native: Full type safety maintained

Installation

npm install brokle brokle-anthropic @anthropic-ai/sdk @opentelemetry/api

Quick Start

import Anthropic from '@anthropic-ai/sdk';
import { wrapAnthropic } from 'brokle-anthropic';
import { getClient } from 'brokle';

// 1. Initialize Brokle
const brokleClient = getClient({
  apiKey: process.env.BROKLE_API_KEY,
});

// 2. Create Anthropic client
const anthropic = new Anthropic({
  apiKey: process.env.ANTHROPIC_API_KEY,
});

// 3. Wrap it with Brokle
const tracedAnthropic = wrapAnthropic(anthropic);

// 4. Use normally - automatic tracing!
const response = await tracedAnthropic.messages.create({
  model: 'claude-3-opus-20240229',
  max_tokens: 1024,
  messages: [
    { role: 'user', content: 'What is the capital of France?' }
  ],
});

console.log(response.content[0].text);
// Trace automatically sent to Brokle backend with all metadata

What Gets Traced

Automatic Attribute Extraction

Every Anthropic API call is automatically traced with:

Request Attributes

  • gen_ai.provider.name = "anthropic"
  • gen_ai.operation.name = "chat"
  • gen_ai.request.model = Model name (e.g., "claude-3-opus-20240229")
  • gen_ai.request.max_tokens = Max tokens
  • gen_ai.request.temperature = Temperature parameter
  • gen_ai.request.top_p = Top P
  • anthropic.request.top_k = Top K (Anthropic-specific)
  • gen_ai.input.messages = JSON array of messages
  • gen_ai.system_instructions = System prompt (if provided)
  • brokle.streaming = Whether streaming is enabled

Response Attributes

  • gen_ai.response.id = Response ID
  • gen_ai.response.model = Actual model used
  • gen_ai.response.finish_reasons = Stop reason array
  • gen_ai.output.messages = JSON array of response content
  • gen_ai.usage.input_tokens = Input tokens
  • gen_ai.usage.output_tokens = Output tokens
  • brokle.usage.total_tokens = Total tokens
  • brokle.usage.latency_ms = Request latency

Supported APIs

Messages API (Claude)

const response = await tracedAnthropic.messages.create({
  model: 'claude-3-opus-20240229',
  max_tokens: 1024,
  messages: [
    { role: 'user', content: 'Hello, Claude!' }
  ],
});

Span Name: chat {model}
Span Type: generation

With System Prompt

const response = await tracedAnthropic.messages.create({
  model: 'claude-3-sonnet-20240229',
  max_tokens: 1024,
  system: 'You are a helpful AI assistant specialized in mathematics.',
  messages: [
    { role: 'user', content: 'What is 2+2?' }
  ],
});

System prompts are captured in the gen_ai.system_instructions attribute.

With Tool Use

const response = await tracedAnthropic.messages.create({
  model: 'claude-3-opus-20240229',
  max_tokens: 1024,
  tools: [
    {
      name: 'get_weather',
      description: 'Get current weather',
      input_schema: {
        type: 'object',
        properties: {
          location: { type: 'string' },
        },
      },
    },
  ],
  messages: [
    { role: 'user', content: 'What is the weather in Paris?' }
  ],
});

Tool use is automatically captured in the output messages.
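If you need to inspect tool calls yourself, the response's content array can be filtered by block type. This is a sketch using hand-written object shapes rather than the SDK's own types, and the sample response below is a stand-in, not real API output:

```typescript
// Sketch: pull tool_use blocks out of a Messages API-style response.
type ContentBlock =
  | { type: 'text'; text: string }
  | { type: 'tool_use'; id: string; name: string; input: Record<string, unknown> };

// Hand-written stand-in for a response object.
const response = {
  content: [
    { type: 'text', text: 'Let me check the weather.' },
    { type: 'tool_use', id: 'toolu_01', name: 'get_weather', input: { location: 'Paris' } },
  ] as ContentBlock[],
};

// Type guard narrows each block to the tool_use variant.
const toolCalls = response.content.filter(
  (block): block is Extract<ContentBlock, { type: 'tool_use' }> => block.type === 'tool_use'
);

console.log(toolCalls.map((t) => t.name)); // [ 'get_weather' ]
```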

Advanced Usage

With User/Session Context

import { Attrs } from 'brokle';

const brokle = getClient();
await brokle.traced('user-query', async (span) => {
  span.setAttribute(Attrs.USER_ID, 'user-123');
  span.setAttribute(Attrs.SESSION_ID, 'session-456');

  const response = await tracedAnthropic.messages.create({
    model: 'claude-3-opus-20240229',
    max_tokens: 1024,
    messages: [{ role: 'user', content: 'Hello' }],
  });

  return response;
});

Streaming

const stream = await tracedAnthropic.messages.create({
  model: 'claude-3-opus-20240229',
  max_tokens: 1024,
  messages: [{ role: 'user', content: 'Tell me a story' }],
  stream: true,
});

for await (const event of stream) {
  if (event.type === 'content_block_delta') {
    process.stdout.write(event.delta.text);
  }
}

Note: Streaming traces are created when the request starts. Final attributes (tokens, stop reason) may not be available for streaming requests.

How It Works

Proxy Pattern

The wrapper uses a JavaScript Proxy to intercept method calls:

wrapAnthropic(client) → Proxy
  ↓
  Intercepts: messages.create
  ↓
  Creates OTEL span → Calls original method → Extracts attributes → Closes span
  ↓
  Returns original response unchanged
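The interception flow above can be sketched with a plain Proxy. This is a simplified illustration, not the package's actual implementation: it skips OTEL spans, attribute extraction, and streaming, and the `wrapWithTiming` / `record` names are invented for the example:

```typescript
// Minimal sketch of the proxy pattern: intercept messages.create, time the
// call, and pass the duration to a stand-in recording function.
type MessagesLike = { create: (...args: unknown[]) => Promise<unknown> };
type ClientLike = { messages: MessagesLike };

function wrapWithTiming<T extends ClientLike>(
  client: T,
  record: (name: string, ms: number) => void // stand-in for span creation
): T {
  const messagesProxy = new Proxy(client.messages, {
    get(target, prop, receiver) {
      const value = Reflect.get(target, prop, receiver);
      if (prop === 'create' && typeof value === 'function') {
        return async (...args: unknown[]) => {
          const start = Date.now();
          try {
            // Call the original method; the response passes through unchanged.
            return await (value as MessagesLike['create']).apply(target, args);
          } finally {
            record('messages.create', Date.now() - start);
          }
        };
      }
      return value;
    },
  });
  return new Proxy(client, {
    get(target, prop, receiver) {
      return prop === 'messages' ? messagesProxy : Reflect.get(target, prop, receiver);
    },
  }) as T;
}

// Exercise it with a fake client so the mechanics are visible.
const recorded: string[] = [];
const fake: ClientLike = {
  messages: { create: async () => ({ id: 'msg_test' }) },
};
const traced = wrapWithTiming(fake, (name) => recorded.push(name));
const result = await traced.messages.create({});
console.log(result, recorded); // { id: 'msg_test' } [ 'messages.create' ]
```

Because only the property access is intercepted, the wrapped client keeps the original's full surface: anything other than `messages.create` falls through to `Reflect.get` untouched.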

Architecture

Your App → wrapAnthropic(client) → Proxy → OTEL Span → Anthropic API
                                    ↓
                           Brokle Client → OTLP → Backend

Supported Models

  • Claude 3 Opus
  • Claude 3 Sonnet
  • Claude 3 Haiku
  • All future Claude models (wrapper is model-agnostic)

Limitations

  • Streaming: Final usage metrics not available for streamed responses
  • Legacy Completions API: Not supported (use Messages API instead)

Requirements

  • Node.js >= 18.0.0
  • Anthropic SDK >= 0.17.0
  • Brokle SDK >= 0.1.0

License

MIT
