
brokle v0.3.0

Brokle SDK for TypeScript/JavaScript

OpenTelemetry-native observability SDK for AI applications. Track, monitor, and optimize your LLM applications with industry-standard OTLP traces.

Features

  • OTEL-Native: Built on OpenTelemetry SDK (industry standard)
  • GenAI 1.28+ Compliant: Full support for OTEL GenAI semantic conventions
  • Trace-Level Sampling: Deterministic sampling (no partial traces)
  • Type-Safe: Complete TypeScript type definitions
  • Zero Config: Works out of the box with environment variables
  • Gzip Compression: Automatic bandwidth optimization
  • Dual Build: ESM + CJS support

Installation

npm install brokle @opentelemetry/api
# or
pnpm add brokle @opentelemetry/api
# or
yarn add brokle @opentelemetry/api

Quick Start

1. Initialize the SDK

import { getClient } from 'brokle';

// Option 1: From environment variables
// Set BROKLE_API_KEY=bk_... in your environment
const client = getClient();

// Option 2: Direct configuration (alternative to Option 1; both
// declare `client`, so use one or the other in a given scope)
const client = getClient({
  apiKey: 'bk_...',
  baseUrl: 'https://api.brokle.ai',
  environment: 'production',
});

2. Trace Your Code

Using Decorators (Recommended)

import { observe } from 'brokle';

class AIService {
  @observe({ name: 'chat-completion', asType: 'generation' })
  async chat(prompt: string): Promise<string> {
    const response = await openai.chat.completions.create({
      model: 'gpt-4',
      messages: [{ role: 'user', content: prompt }],
    });
    return response.choices[0].message.content;
  }
}

Using Traced Function

import { getClient, Attrs } from 'brokle';

const client = getClient();

await client.traced('my-operation', async (span) => {
  span.setAttribute(Attrs.USER_ID, 'user-123');
  span.setAttribute('custom-attr', 'value');

  const result = await doWork();

  return result;
});

Using Generation Helper

import { getClient, Attrs } from 'brokle';

const client = getClient();

const response = await client.generation('chat', 'gpt-4', 'openai', async (span) => {
  const completion = await openai.chat.completions.create({
    model: 'gpt-4',
    messages: [{ role: 'user', content: 'Hello' }],
  });

  // Capture GenAI attributes
  span.setAttribute(Attrs.GEN_AI_INPUT_MESSAGES, JSON.stringify([...]));
  span.setAttribute(Attrs.GEN_AI_OUTPUT_MESSAGES, JSON.stringify([...]));
  span.setAttribute(Attrs.GEN_AI_USAGE_INPUT_TOKENS, completion.usage.prompt_tokens);
  span.setAttribute(Attrs.GEN_AI_USAGE_OUTPUT_TOKENS, completion.usage.completion_tokens);

  return completion;
});

Configuration

Environment Variables

# Required
BROKLE_API_KEY=bk_your_api_key_here

# Optional
BROKLE_BASE_URL=https://api.brokle.ai  # Default: http://localhost:8080
BROKLE_ENVIRONMENT=production           # Default: default
BROKLE_DEBUG=true                       # Default: false
BROKLE_SAMPLE_RATE=0.5                  # Default: 1.0 (0.0-1.0)
BROKLE_FLUSH_SYNC=true                  # Default: false (use for serverless)
BROKLE_FLUSH_AT=100                     # Default: 100 (batch size)
BROKLE_FLUSH_INTERVAL=10                # Default: 10 seconds

Programmatic Configuration

import { getClient } from 'brokle';

const client = getClient({
  apiKey: 'bk_...',
  baseUrl: 'https://api.brokle.ai',
  environment: 'production',
  debug: false,
  sampleRate: 1.0,           // Trace-level sampling
  flushSync: false,          // Set true for serverless
  flushAt: 100,             // Batch size
  flushInterval: 10,        // Flush interval (seconds)
  maxQueueSize: 10000,      // Max queue size
  timeout: 30000,           // Request timeout (ms)
});

Prompt Management

Brokle provides centralized prompt storage with versioning, labels, and caching.

import { getClient } from 'brokle';

const client = getClient();

// Fetch a prompt
const prompt = await client.prompts.get("greeting", {
  label: "production"
});

// Compile with variables
const compiled = prompt.compile({ name: "Alice" });

// Convert to OpenAI format
const messages = prompt.toOpenAIMessages({ name: "Alice" });

// Convert to Anthropic format
const anthropicRequest = prompt.toAnthropicRequest({ name: "Alice" });

// List all prompts
const { data, pagination } = await client.prompts.list({
  type: "chat",
  limit: 10
});

data.forEach(p => {
  console.log(`${p.name} v${p.version} (${p.type})`);
});

// Cache management
client.prompts.invalidate("greeting");  // Invalidate specific prompt
client.prompts.clearCache();            // Clear all cached prompts
const stats = client.prompts.getCacheStats();  // Get cache stats

Advanced Usage

Manual Span Control

import { getClient } from 'brokle';
import { SpanStatusCode } from '@opentelemetry/api';

const client = getClient();
const tracer = client.getTracer();

// Start a manual span
const span = tracer.startSpan('my-span', {
  attributes: {
    'custom.attribute': 'value',
  },
});

try {
  // Do work
  await doWork();

  // Set success status
  span.setStatus({ code: SpanStatusCode.OK });
} catch (error) {
  // Record exception
  span.recordException(error);
  span.setStatus({
    code: SpanStatusCode.ERROR,
    message: error.message,
  });
  throw error;
} finally {
  // Always end the span
  span.end();
}

Trace Function Wrapper

import { traceFunction } from 'brokle';

const tracedProcess = traceFunction(
  'process-data',
  async (data: any) => {
    return processData(data);
  },
  {
    captureInput: true,
    captureOutput: true,
    tags: ['critical', 'production'],
    userId: 'user-123',
  }
);

// Use it
const result = await tracedProcess(myData);

Serverless/Lambda Setup

import { getClient } from 'brokle';

// Initialize once (outside handler)
const client = getClient({
  flushSync: true,  // Use SimpleSpanProcessor for immediate export
});

export const handler = async (event: any) => {
  await client.traced('lambda-handler', async (span) => {
    span.setAttribute('event.type', event.type);

    const result = await processEvent(event);

    return result;
  });

  // Force flush before exit
  await client.flush();

  return { statusCode: 200 };
};

Long-Running Application Shutdown

import { getClient } from 'brokle';

const client = getClient();

// Graceful shutdown
process.on('SIGTERM', async () => {
  console.log('Shutting down...');
  await client.shutdown();
  process.exit(0);
});

Type-Safe Attributes

Use the Attrs constant for type-safe attribute keys:

import { Attrs, LLMProvider } from 'brokle';

span.setAttribute(Attrs.GEN_AI_PROVIDER_NAME, LLMProvider.OPENAI);
span.setAttribute(Attrs.GEN_AI_REQUEST_MODEL, 'gpt-4');
span.setAttribute(Attrs.GEN_AI_OPERATION_NAME, 'chat');
span.setAttribute(Attrs.USER_ID, 'user-123');
span.setAttribute(Attrs.SESSION_ID, 'session-456');
span.setAttribute(Attrs.TAGS, JSON.stringify(['production', 'critical']));

Integration Packages

For automatic tracing of popular SDKs:

  • brokle-openai - OpenAI SDK wrapper (Proxy-based, zero code changes)
  • brokle-anthropic - Anthropic SDK wrapper
  • brokle-langchain - LangChain.js callbacks
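The "Proxy-based, zero code changes" approach can be illustrated in isolation. Below is a generic sketch of the pattern, not brokle-openai's actual implementation: a Proxy intercepts method lookups on the wrapped client and times each call, so callers keep using the SDK exactly as before.

```typescript
// Generic sketch of Proxy-based instrumentation (illustrative only;
// the real wrappers target the OpenAI/Anthropic SDK surfaces).
type CallRecord = { method: string; durationMs: number };

function instrument<T extends object>(
  target: T,
  onCall: (record: CallRecord) => void,
): T {
  return new Proxy(target, {
    get(obj, prop, receiver) {
      const value = Reflect.get(obj, prop, receiver);
      if (typeof value !== 'function') return value;
      // Wrap each method so callers need no code changes.
      return (...args: unknown[]) => {
        const start = Date.now();
        const result = value.apply(obj, args);
        onCall({ method: String(prop), durationMs: Date.now() - start });
        return result;
      };
    },
  });
}

// Usage: wrap a toy client; calls look identical to the unwrapped one.
const calls: CallRecord[] = [];
const client = instrument(
  { greet: (name: string) => `hi ${name}` },
  (r) => calls.push(r),
);
console.log(client.greet('world')); // "hi world"
console.log(calls[0].method);       // "greet"
```

A real wrapper would start an OTEL span instead of pushing to an array, but the zero-code-change property comes entirely from the Proxy.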

Examples

See the examples directory for complete working examples:

  • basic-usage.ts - Simple tracing patterns
  • wrapper-usage.ts - SDK wrappers (OpenAI, Anthropic)
  • langchain-integration.ts - LangChain.js integration
  • nextjs-app/ - Next.js application example

Architecture

OTEL-Native Design

TypeScript SDK → OpenTelemetry SDK → OTLP/HTTP → Brokle Backend
                 (Industry Standard)  (Protobuf+Gzip)

Key Components

  • BrokleClient: Main SDK class with TracerProvider
  • BrokleSpanProcessor: Wrapper around BatchSpanProcessor/SimpleSpanProcessor
  • BrokleExporter: OTLP exporter with Gzip compression and API key auth
  • Symbol Singleton: Global state management
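What the exporter does before each request can be sketched in a few lines. This is an illustration only, not Brokle's code: the real exporter serializes to protobuf, while JSON is used here to keep the example self-contained, and the exact auth header name is an assumption.

```typescript
import { gzipSync, gunzipSync } from 'node:zlib';

// Illustrative sketch of an OTLP/HTTP export request: serialize the span
// batch, gzip it, and attach content and auth headers.
function buildExportRequest(spans: object[], apiKey: string) {
  const body = gzipSync(Buffer.from(JSON.stringify({ resourceSpans: spans })));
  return {
    headers: {
      'Content-Type': 'application/json',
      'Content-Encoding': 'gzip',        // tells the backend to gunzip the body
      Authorization: `Bearer ${apiKey}`, // auth header shape assumed here
    },
    body,
  };
}

const req = buildExportRequest([{ name: 'demo-span' }], 'bk_test');
// The compressed body round-trips to the original payload.
const decoded = JSON.parse(gunzipSync(req.body).toString());
console.log(decoded.resourceSpans[0].name); // "demo-span"
```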

Trace-Level Sampling

Uses TraceIdRatioBasedSampler for deterministic sampling:

  • Entire traces sampled together (no partial traces)
  • Sampling decision based on trace ID
  • Consistent across distributed systems
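The deterministic property can be seen in a standalone sketch of ratio-based sampling. This is the idea behind OTEL's TraceIdRatioBasedSampler, not Brokle's actual code: the decision is a pure function of the trace ID, so every span in a trace, on every host, gets the same answer.

```typescript
// Standalone sketch of trace-ID ratio sampling (illustrative; OpenTelemetry's
// TraceIdRatioBasedSampler applies a similar fixed threshold to the trace ID).
function shouldSample(traceId: string, ratio: number): boolean {
  // Interpret the last 8 hex chars of the trace ID as a number in [0, 2^32).
  const slice = parseInt(traceId.slice(-8), 16);
  // Sample iff that number falls below ratio * 2^32: same ID, same decision.
  return slice < ratio * 0x100000000;
}

const traceId = '4bf92f3577b34da6a3ce929d0e0e4736';
// Deterministic: repeated calls agree, so a trace is never partially sampled.
console.log(shouldSample(traceId, 1.0)); // true (ratio 1.0 samples everything)
console.log(shouldSample(traceId, 0.0)); // false
console.log(shouldSample(traceId, 0.5) === shouldSample(traceId, 0.5)); // true
```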

Requirements

  • Node.js >= 18.0.0
  • TypeScript >= 5.0 (for decorators)

License

MIT

Support

  • Documentation: https://docs.brokle.ai
  • GitHub: https://github.com/brokle-ai/brokle-js
  • Issues: https://github.com/brokle-ai/brokle-js/issues