# @uselemma/tracing
OpenTelemetry-based tracing for AI agents. Capture inputs, outputs, timing, token usage, and errors — then view everything in Lemma.
## Installation

```bash
npm install @uselemma/tracing
```

## Quick Start
### 1. Register the tracer provider

Call `registerOTel` once when your application starts. It reads `LEMMA_API_KEY` and `LEMMA_PROJECT_ID` from environment variables by default.

```ts
import { registerOTel } from "@uselemma/tracing";

registerOTel();
```

You can also enable experiment mode globally for the process:
```ts
import { enableExperimentMode } from "@uselemma/tracing";

enableExperimentMode();
```

### 2. Wrap your agent
`wrapAgent` creates a root OpenTelemetry span named `ai.agent.run` and records:

- `ai.agent.name`
- `lemma.run_id`
- `ai.agent.input`
- `lemma.is_experiment`
```ts
import { wrapAgent } from "@uselemma/tracing";

const wrappedFn = wrapAgent(
  "my-agent",
  async ({ onComplete }) => {
    const result = await doWork(userMessage);
    onComplete(result);
    return result;
  },
  { autoEndRoot: true },
);

const { result, runId } = await wrappedFn();
```

## Export Behavior
- Spans are exported in run-specific batches keyed by `lemma.run_id`.
- A run batch is exported when its top-level `ai.agent.run` span ends.
- `forceFlush()` exports remaining runs in separate batches per run (see the sketch after this list).
- Spans with `instrumentationScope.name === "next.js"` are excluded from export.
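In short-lived processes (one-off scripts, serverless handlers) a run's root span may still be open when the process exits, so its batch would never be exported. A minimal sketch of flushing on shutdown, assuming `registerOTel()` returns the underlying tracer provider with OpenTelemetry's standard `forceFlush()`; that return value is not confirmed by this README, so check the package's types first:

```ts
import { registerOTel } from "@uselemma/tracing";

// ASSUMPTION: registerOTel() returns the OpenTelemetry tracer provider.
// If it returns void in your version, obtain the provider another way.
const provider = registerOTel();

process.on("beforeExit", async () => {
  // Exports runs whose ai.agent.run span never ended, in separate
  // batches per lemma.run_id (see "Export Behavior" above).
  await provider?.forceFlush();
});
```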
## Environment Variables

| Variable           | Description           |
| ------------------ | --------------------- |
| `LEMMA_API_KEY`    | Your Lemma API key    |
| `LEMMA_PROJECT_ID` | Your Lemma project ID |

Both are required unless passed explicitly to `registerOTel()`.
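A minimal sketch of passing the credentials explicitly instead; the option names `apiKey` and `projectId` are assumptions for illustration, not confirmed by this README:

```ts
import { registerOTel } from "@uselemma/tracing";

// Option names below are hypothetical placeholders; check the
// Tracing Overview docs for registerOTel's actual signature.
registerOTel({
  apiKey: process.env.SECRETS_LEMMA_API_KEY,
  projectId: process.env.SECRETS_LEMMA_PROJECT_ID,
});
```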
## Documentation
- Tracing Overview — concepts, API reference, and usage patterns
- Vercel AI SDK Integration — framework setup, streaming, and examples
## License
MIT
