
@sideseat/sdk v1.0.8 · SideSeat SDK for AI observability
SideSeat TypeScript SDK

AI Development Workbench — Debug, trace, and understand your AI agents.



What is SideSeat?

AI agents are hard to debug. Requests fly by, context builds up, and when something fails you're left guessing.

SideSeat captures every LLM call, tool call, and agent decision, then displays them in a web UI as they happen. Run it locally during development, or deploy to your private cloud for team visibility.

Built on OpenTelemetry — the open standard for observability.

Features:

  • Real-time tracing — Watch LLM requests and tool calls as they happen
  • Message threading — See full conversations, tool calls, and images
  • Cost tracking — Automatic token counting and cost calculation

Supported frameworks: Vercel AI SDK, Strands (TypeScript), and any framework emitting OpenTelemetry traces

Quick Start

Requirements: Node.js 18+

1. Start the server

npx sideseat

2. Install and initialize

npm install ai @ai-sdk/amazon-bedrock @sideseat/sdk

import { init, Frameworks } from '@sideseat/sdk';
import { generateText } from 'ai';
import { bedrock } from '@ai-sdk/amazon-bedrock';

init({ framework: Frameworks.VercelAI });

const { text } = await generateText({
  model: bedrock('us.anthropic.claude-sonnet-4-5-20250929-v1:0'),
  prompt: 'What is 2+2?',
  experimental_telemetry: { isEnabled: true },
});

console.log(text);

3. View traces

Open http://localhost:5388 in your browser and run your agent. Traces appear in real time.

Installation

npm install @sideseat/sdk

Framework Examples

Strands (TypeScript)

npm install @strands-agents/sdk @sideseat/sdk

import { init, Frameworks } from '@sideseat/sdk';
import { Agent } from '@strands-agents/sdk';

init({ framework: Frameworks.Strands });

const agent = new Agent({ model: 'global.anthropic.claude-haiku-4-5-20251001-v1:0' });
const result = await agent.invoke('What is 2+2?');
console.log(result.toString());

Vercel AI SDK

Vercel AI SDK has built-in OpenTelemetry support via experimental_telemetry. Enable it on each call:

import { init, shutdown, Frameworks } from '@sideseat/sdk';
import { generateText, generateObject, tool } from 'ai';
import { bedrock } from '@ai-sdk/amazon-bedrock';
import { z } from 'zod';

init({ framework: Frameworks.VercelAI });

// Text generation
const { text } = await generateText({
  model: bedrock('us.anthropic.claude-sonnet-4-5-20250929-v1:0'),
  prompt: 'What is the capital of France?',
  experimental_telemetry: { isEnabled: true },
});

// Structured output
const { object } = await generateObject({
  model: bedrock('us.anthropic.claude-sonnet-4-5-20250929-v1:0'),
  schema: z.object({ name: z.string(), age: z.number() }),
  prompt: 'Generate a person',
  experimental_telemetry: { isEnabled: true },
});

// Tool use
const weatherTool = tool({
  description: 'Get weather for a city',
  parameters: z.object({ city: z.string() }),
  execute: async ({ city }) => ({ temp: 72, condition: 'sunny' }),
});

const { text: weatherText } = await generateText({
  model: bedrock('us.anthropic.claude-sonnet-4-5-20250929-v1:0'),
  tools: { weather: weatherTool },
  prompt: 'What is the weather in Paris?',
  experimental_telemetry: { isEnabled: true },
});

// Flush traces before exit
await shutdown();

Important: Always include experimental_telemetry: { isEnabled: true } on each generateText, generateObject, or streamText call.

Without SideSeat SDK

Manual OpenTelemetry setup for full control:

import { NodeSDK } from '@opentelemetry/sdk-node';
import { OTLPTraceExporter } from '@opentelemetry/exporter-trace-otlp-http';
import { generateText } from 'ai';
import { bedrock } from '@ai-sdk/amazon-bedrock';

// Register a global tracer provider before making any instrumented calls
const sdk = new NodeSDK({ traceExporter: new OTLPTraceExporter() });
sdk.start();

const { text } = await generateText({
  model: bedrock('us.anthropic.claude-sonnet-4-5-20250929-v1:0'),
  prompt: 'What is 2+2?',
  experimental_telemetry: { isEnabled: true },
});

console.log(text);

Set the endpoint:

export OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:5388/otel/default

Configuration

Environment Variables

| Variable | Default | Description |
| --- | --- | --- |
| SIDESEAT_ENDPOINT | http://127.0.0.1:5388 | Server URL |
| SIDESEAT_PROJECT_ID | default | Project identifier |
| SIDESEAT_API_KEY | — | Authentication key |
| SIDESEAT_DISABLED | false | Disable all telemetry |
| SIDESEAT_DEBUG | false | Enable verbose logging |
| SIDESEAT_LOG_LEVEL | none | Log level (none/error/warn/info/debug/verbose) |
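For example, pointing the SDK at a non-default server and project entirely through the environment (the hostname, project id, and script name below are placeholders):

```shell
# Example values only; substitute your own server and project
export SIDESEAT_ENDPOINT=http://sideseat.internal:5388
export SIDESEAT_PROJECT_ID=my-project
export SIDESEAT_DEBUG=true

node agent.js   # your instrumented app (hypothetical filename)
```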

Constructor Options

init({
  endpoint: 'http://localhost:5388',
  projectId: 'my-project',
  apiKey: 'pk-...',
  framework: Frameworks.VercelAI,
  serviceName: 'my-app',
  serviceVersion: '1.0.0',
  enableTraces: true,
  logLevel: 'debug',
  disabled: false,
  debug: false,
});

| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| endpoint | string | http://127.0.0.1:5388 | Server URL |
| projectId | string | default | Project identifier |
| apiKey | string | undefined | Authentication key |
| framework | string | — | Framework identifier (required) |
| serviceName | string | npm_package_name | Application name in traces |
| serviceVersion | string | npm_package_version | Application version |
| enableTraces | boolean | true | Export trace spans |
| logLevel | LogLevel | none | OpenTelemetry log level |
| disabled | boolean | false | Disable all telemetry |
| debug | boolean | false | Enable verbose logging |

Resolution order: Constructor → SIDESEAT_* env → OTEL_* env → defaults
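As an illustration of that precedence, here is a minimal stdlib-only sketch of how one setting could be resolved. This is not the SDK's actual implementation, just the lookup order the line above describes:

```typescript
// Resolve the endpoint by precedence: constructor option, then SIDESEAT_* env,
// then OTEL_* env, then the built-in default.
function resolveEndpoint(
  explicit: string | undefined,
  env: Record<string, string | undefined>,
): string {
  return (
    explicit ??
    env['SIDESEAT_ENDPOINT'] ??
    env['OTEL_EXPORTER_OTLP_ENDPOINT'] ??
    'http://127.0.0.1:5388'
  );
}
```

A constructor value always wins; with no option and no environment variables set, the default applies.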

Advanced Usage

Async Initialization

Use createClient() for async initialization with connection validation:

import { createClient } from '@sideseat/sdk';

const client = await createClient({ projectId: 'my-project' });
// Connection validated before returning

Global Instance

import { init, getClient, shutdown, isInitialized } from '@sideseat/sdk';

init({ framework: Frameworks.VercelAI, projectId: 'my-project' }); // Initialize once
const client = getClient(); // Access anywhere
await shutdown(); // Clean up

Custom Spans

const client = init({ framework: Frameworks.VercelAI });

// Async spans
const result = await client.span('process-request', async (span) => {
  span.setAttribute('user_id', '12345');
  return await doWork();
});

// Sync spans
const value = client.spanSync('compute', (span) => {
  span.setAttribute('input', 42);
  return calculate();
});
// Exceptions recorded automatically with stack traces

Debug Exporters

const client = init({ framework: Frameworks.VercelAI });
client.setupConsoleExporter(); // Print to stdout
client.setupFileExporter('traces.jsonl'); // Write to file

Disabled Mode

init({ framework: Frameworks.VercelAI, disabled: true }); // Or set SIDESEAT_DISABLED=true

Existing OpenTelemetry Setup

If a TracerProvider already exists, SideSeat adds its exporter to the existing provider.

Direct Class Usage

For multiple independent instances:

import { SideSeat } from '@sideseat/sdk';

const client1 = new SideSeat({ projectId: 'project-a' });
const client2 = new SideSeat({ projectId: 'project-b' });

Data and Privacy

What is collected:

  • Trace spans with timing and hierarchy
  • LLM prompts and responses
  • Token counts and model names
  • Errors and stack traces

Where it goes:

All data is sent to your self-hosted server. Nothing leaves your infrastructure.

Resilience:

  • Up to 2,048 spans buffered in memory
  • Batched exports every 5 seconds
  • 30-second timeout per export
  • Server downtime does not affect your application
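Those limits match the shape of OpenTelemetry's standard batch span processing. A simplified stdlib-only sketch of the buffering idea (not the SDK's actual code) shows why server downtime is harmless: spans are queued in memory, excess spans are dropped rather than blocking the app, and exports happen in batches:

```typescript
type Span = { name: string };

// Simplified span buffer: cap the in-memory queue, export in batches on flush.
class SpanBuffer {
  private queue: Span[] = [];
  constructor(
    private maxQueueSize = 2048,
    private exporter: (batch: Span[]) => void = () => {},
  ) {}

  add(span: Span): void {
    if (this.queue.length >= this.maxQueueSize) return; // drop when full
    this.queue.push(span);
  }

  flush(): void {
    if (this.queue.length === 0) return;
    this.exporter(this.queue.splice(0, this.queue.length));
  }
}
```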

Troubleshooting

| Problem | Solution |
| --- | --- |
| Connection refused | Server not running. Run npx sideseat |
| No traces appear | Check that experimental_telemetry: { isEnabled: true } is set |
| Duplicate traces | Call init() once per process |
| Import errors | Ensure Node.js 18+ and ESM/CJS compatibility |

API Reference

Module Functions

| Function | Returns | Description |
| --- | --- | --- |
| init(options?) | SideSeat | Create global instance (sync) |
| createClient(options?) | Promise<SideSeat> | Create global instance (async) |
| getClient() | SideSeat | Get global instance |
| shutdown() | Promise<void> | Shut down global instance |
| isInitialized() | boolean | Check if initialized |

SideSeat Class

const client = new SideSeat(options);

Properties:

| Name | Type | Description |
| --- | --- | --- |
| config | Config | Immutable configuration |
| tracerProvider | NodeTracerProvider | OpenTelemetry tracer provider |
| isDisabled | boolean | Whether telemetry is disabled |
| isReady | boolean | Whether client is ready |

Methods:

| Name | Returns | Description |
| --- | --- | --- |
| span(name, fn) | Promise<T> | Create an async span |
| spanSync(name, fn) | T | Create a sync span |
| getTracer(name?, version?) | Tracer | Get an OpenTelemetry tracer |
| forceFlush(timeoutMs?) | Promise<boolean> | Export pending spans immediately |
| validateConnection(timeoutMs?) | Promise<boolean> | Test server connectivity |
| shutdown(timeoutMs?) | Promise<void> | Flush pending spans and shut down |
| setupConsoleExporter() | this | Add console exporter |
| setupFileExporter(path?) | this | Add JSONL file exporter |
| addSpanProcessor(processor) | this | Add custom span processor |

Frameworks

Frameworks.Strands       // "strands"
Frameworks.VercelAI      // "vercel-ai"
Frameworks.LangChain     // "langchain"
Frameworks.CrewAI        // "crewai"
Frameworks.AutoGen       // "autogen"
Frameworks.OpenAIAgents  // "openai-agents"
Frameworks.GoogleADK     // "google-adk"
Frameworks.PydanticAI    // "pydantic-ai"

Utilities

| Export | Description |
| --- | --- |
| encodeValue(value) | JSON-encode a value; base64 for binary |
| spanToDict(span) | Convert span to dictionary |
| JsonFileSpanExporter | JSONL file exporter class |
| SideSeatError | SDK error class |
| VERSION | SDK version string |
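To make the encodeValue row concrete, here is a rough stdlib-only sketch of the behavior the table describes (JSON for plain values, base64 for binary), using Node's Buffer. The function name is ours; the SDK's real encodeValue may differ in details:

```typescript
// Sketch of the described behavior: JSON-encode plain values,
// base64-encode binary buffers.
function encodeValueSketch(value: unknown): string {
  if (Buffer.isBuffer(value)) {
    return value.toString('base64');
  }
  return JSON.stringify(value);
}
```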


License

MIT