@spy-llm/sdk
v0.3.9
Framework-agnostic agent observability & LLM tracing. Supports OpenAI, CrewAI, AutoGen, LangGraph, and OpenTelemetry.
SpyLLM JavaScript/TypeScript SDK
Framework-agnostic agent observability and automatic LLM tracing. Works with OpenAI and any OpenTelemetry-instrumented agent framework.
See it in action — view a live trace on the dashboard
Prerequisites
You need a free SpyLLM account and an API key to use this SDK.
- Sign up at spyllm.dev/sign-up
- Go to Settings → API Keys and click Create API Key
- Copy the key — it is only shown once
Install
npm install @spy-llm/sdk
Quick Start
import { init } from "@spy-llm/sdk";
init({ apiKey: "sk-..." });
// That's it. Every OpenAI call is now automatically traced.
import OpenAI from "openai";
const client = new OpenAI();
const response = await client.chat.completions.create({
model: "gpt-4o",
messages: [{ role: "user", content: "Hello!" }],
});
// Prompt, response, tokens, cost, and latency are captured automatically.
Open the dashboard to see traces as they arrive.
Agent Observability
Wrap multi-agent workflows with agentSpan() to automatically link every nested LLM call into a trace DAG. Nested spans inherit traceId and set parentSpanId automatically — no manual ID threading needed.
import { init, agentSpan } from "@spy-llm/sdk";
import OpenAI from "openai";
init({ apiKey: "sk-..." });
const client = new OpenAI();
await agentSpan("orchestrator", { role: "orchestrator" }, async () => {
const plan = await client.chat.completions.create({
model: "gpt-4o",
messages: [{ role: "user", content: "Plan the research task" }],
});
await agentSpan("researcher", { role: "worker" }, async () => {
const research = await client.chat.completions.create({
model: "gpt-4o",
messages: [{ role: "user", content: "Research quantum computing" }],
});
});
await agentSpan("writer", { role: "worker" }, async () => {
const report = await client.chat.completions.create({
model: "gpt-4o",
messages: [{ role: "user", content: "Write the report" }],
});
});
});
// All spans share the same traceId.
// Open the dashboard to see the full agent topology as an interactive DAG.
Reading Span Context
Access the current span anywhere in your async code:
import { getCurrentSpan } from "@spy-llm/sdk";
const ctx = getCurrentSpan();
if (ctx) {
console.log(`Currently inside: ${ctx.agentName} (trace=${ctx.traceId})`);
}
Span Options
| Parameter | Type | Default | Description |
|-----------|------|---------|-------------|
| name | string | required | Human-readable agent name |
| role | string | "worker" | Agent role for topology grouping |
| operation | string | "invoke_agent" | One of: invoke_agent, create_agent, execute_tool, chat |
| traceId | string | auto-inherited | Override trace ID |
| framework | string | undefined | Framework identifier: crewai, autogen, langgraph, custom |
| inputSource | string | undefined | What triggered this span: user, agent:planner, tool:search |
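As a quick illustration of how these defaults compose, a plain options merge behaves like the table above (a sketch for illustration only — `SpanOptions` and `withDefaults` are hypothetical names, not SDK exports):

```typescript
// Hypothetical shape mirroring the options table (not an SDK export).
interface SpanOptions {
  role?: string;
  operation?: "invoke_agent" | "create_agent" | "execute_tool" | "chat";
  traceId?: string;
  framework?: "crewai" | "autogen" | "langgraph" | "custom";
  inputSource?: string;
}

// Applies the table's defaults: role "worker", operation "invoke_agent".
// Caller-supplied fields override the defaults.
function withDefaults(opts: SpanOptions = {}): SpanOptions {
  return { role: "worker", operation: "invoke_agent", ...opts };
}

const resolved = withDefaults({ framework: "langgraph", role: "planner" });
// resolved: { role: "planner", operation: "invoke_agent", framework: "langgraph" }
```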
Any OTel-Instrumented Framework (Zero SDK Code)
Point any framework's OpenTelemetry exporter at SpyLLM:
export OTEL_EXPORTER_OTLP_ENDPOINT=https://api.spyllm.dev
export OTEL_EXPORTER_OTLP_HEADERS="X-API-Key=sk-your-key"
This works with any framework that emits gen_ai.* semantic convention spans. SpyLLM automatically maps OTel GenAI attributes to its schema.
What Gets Captured
Every LLM call automatically records:
- Prompt — full message history sent to the model
- Response — the model's output
- Token count — input + output tokens
- Cost — estimated USD cost based on model pricing
- Latency — wall-clock time for the API call
- Tool calls — if the model invoked tools/functions
- Errors — failed calls with the exception message
- Trace ID / Span ID — every call gets topology IDs, even standalone ones
- Agent Topology — interactive DAG visualization in the dashboard
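For example, estimating cost from token counts works roughly like this (an illustrative sketch — the prices and the `estimateCostUsd` helper are hypothetical, not SpyLLM's actual pricing table):

```typescript
// Hypothetical per-1M-token prices in USD (example values only).
const PRICING: Record<string, { input: number; output: number }> = {
  "gpt-4o": { input: 2.5, output: 10 },
};

// Estimates USD cost from token counts; returns 0 for unknown models.
function estimateCostUsd(
  model: string,
  inputTokens: number,
  outputTokens: number
): number {
  const price = PRICING[model];
  if (!price) return 0;
  return (inputTokens * price.input + outputTokens * price.output) / 1_000_000;
}
```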
With agentSpan() you additionally get:
- Parent Span ID — builds the parent-child DAG across agents
- Agent Role — orchestrator, worker, planner, etc.
- Operation Name — invoke_agent, execute_tool, chat, create_agent
- Framework — crewai, autogen, langgraph, custom
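The parent-child linking described above can be illustrated with Node's AsyncLocalStorage (a simplified sketch of the mechanism, not the SDK's actual implementation — `agentSpanSketch` and `SpanCtx` are hypothetical names):

```typescript
import { AsyncLocalStorage } from "async_hooks";
import { randomUUID } from "crypto";

interface SpanCtx {
  traceId: string;
  spanId: string;
  parentSpanId?: string;
  agentName: string;
}

const spanStorage = new AsyncLocalStorage<SpanCtx>();

// Runs fn inside a new span context that inherits traceId from the
// enclosing span (or starts a fresh trace) and records its parent span.
async function agentSpanSketch<T>(
  name: string,
  fn: () => Promise<T>
): Promise<T> {
  const parent = spanStorage.getStore();
  const ctx: SpanCtx = {
    traceId: parent?.traceId ?? randomUUID(), // inherit, or start a new trace
    spanId: randomUUID(),
    parentSpanId: parent?.spanId,             // the DAG edge to the parent
    agentName: name,
  };
  return spanStorage.run(ctx, fn);
}
```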
Advanced Usage
Manual Tracing
import SpyLLM from "@spy-llm/sdk";
const client = new SpyLLM("sk-...", "https://api.spyllm.dev");
await client.trace({
agent_name: "my-agent",
prompt: "What is 2+2?",
response: "4",
token_count: 15,
cost_usd: 0.001,
});
Disable Auto-instrumentation
import { init } from "@spy-llm/sdk";
init({ apiKey: "sk-...", instrument: false });
Documentation
Changelog
See GitHub Releases for a full changelog.
License
Proprietary — Copyright SpyLLM. All rights reserved.
