`opik-vercel` v1.11.11

Opik TypeScript and JavaScript SDK integration with Vercel AI SDK
# Opik Vercel AI SDK Integration
Seamlessly integrate Opik observability with your Vercel AI SDK applications to trace, monitor, and debug your AI workflows.
## Features
- 🔍 Comprehensive Tracing: Automatically trace AI SDK calls and completions
- 📊 Hierarchical Visualization: View your AI execution as a structured trace with parent-child relationships
- 📝 Detailed Metadata Capture: Record model names, prompts, completions, token usage, and custom metadata
- 🚨 Error Handling: Capture and visualize errors in your AI API interactions
- 🏷️ Custom Tagging: Add custom tags to organize and filter your traces
- 🔄 Streaming Support: Full support for streamed completions and chat responses
## Installation

### Node.js
```shell
npm install opik-vercel ai @ai-sdk/openai @opentelemetry/sdk-node @opentelemetry/auto-instrumentations-node
```

### Next.js
```shell
npm install opik-vercel @vercel/otel @opentelemetry/api-logs @opentelemetry/instrumentation @opentelemetry/sdk-logs
```

## Requirements
- Node.js ≥ 18
- Vercel AI SDK (`ai` ≥ 3.0.0)
- Opik SDK (automatically installed as a peer dependency)
- OpenTelemetry packages (see installation commands above)
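Putting the requirements above together, a Node.js project's `package.json` might declare dependencies roughly as follows. This is an illustrative sketch, not prescriptive: only the `node` engine bound, the `ai` ≥ 3.0.0 constraint, and the `opik-vercel` version come from this README; the other version specifiers are placeholders.

```json
{
  "engines": { "node": ">=18" },
  "dependencies": {
    "ai": "^3.0.0",
    "opik-vercel": "^1.11.11",
    "@ai-sdk/openai": "latest",
    "@opentelemetry/sdk-node": "latest",
    "@opentelemetry/auto-instrumentations-node": "latest"
  }
}
```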
## Configuration
Set your environment variables:
```shell
OPIK_API_KEY="<your-api-key>"
OPIK_URL_OVERRIDE="https://www.comet.com/opik/api" # Cloud version
OPIK_PROJECT_NAME="<custom-project-name>"
OPIK_WORKSPACE="<your-workspace>"
OPENAI_API_KEY="<your-openai-api-key>" # If using OpenAI models
```

## Usage
### Node.js
```typescript
import { openai } from "@ai-sdk/openai";
import { generateText } from "ai";
import { NodeSDK } from "@opentelemetry/sdk-node";
import { getNodeAutoInstrumentations } from "@opentelemetry/auto-instrumentations-node";
import { OpikExporter } from "opik-vercel";

const sdk = new NodeSDK({
  traceExporter: new OpikExporter(),
  instrumentations: [getNodeAutoInstrumentations()],
});

sdk.start();

async function main() {
  const result = await generateText({
    model: openai("gpt-5-nano"),
    maxTokens: 50,
    prompt: "What is love?",
    experimental_telemetry: OpikExporter.getSettings({
      name: "opik-nodejs-example",
    }),
  });

  console.log(result.text);
  await sdk.shutdown(); // Flushes the trace to Opik
}

main().catch(console.error);
```

### Next.js
For Next.js applications, use the framework's built-in OpenTelemetry support:
```typescript
// instrumentation.ts
import { registerOTel } from "@vercel/otel";
import { OpikExporter } from "opik-vercel";

export function register() {
  registerOTel({
    serviceName: "opik-vercel-ai-nextjs-example",
    traceExporter: new OpikExporter(),
  });
}
```

Then use the AI SDK with telemetry enabled:
```typescript
import { openai } from "@ai-sdk/openai";
import { generateText } from "ai";

const result = await generateText({
  model: openai("gpt-5-nano"),
  prompt: "What is love?",
  experimental_telemetry: { isEnabled: true },
});
```

## Advanced Configuration
### Custom Tags and Metadata

You can add custom tags and metadata to all traces generated by the `OpikExporter`:
```typescript
const exporter = new OpikExporter({
  // Optional: add custom tags to all traces
  tags: ["production", "gpt-5-nano"],

  // Optional: add custom metadata to all traces
  metadata: {
    environment: "production",
    version: "1.0.0",
    team: "ai-team",
  },

  // Optional: associate traces with a conversation thread
  threadId: "conversation-123",
});
```

Tags are useful for filtering and grouping traces, while metadata adds additional context for debugging and analysis. The `threadId` parameter is useful for tracking multi-turn conversations or grouping related AI interactions.
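To make the thread-grouping idea concrete, here is a plain-object sketch of how several turns of one conversation can share a `threadId`. The `threadTelemetry` helper is hypothetical (not part of `opik-vercel`); it only builds the `{ isEnabled, metadata: { threadId } }` telemetry shape used elsewhere in this README and makes no SDK calls.

```typescript
// Hypothetical helper: builds the experimental_telemetry object that
// groups calls under one conversation thread. Mirrors the
// { isEnabled, metadata: { threadId } } shape shown in this README.
function threadTelemetry(threadId: string, extra: Record<string, unknown> = {}) {
  return {
    isEnabled: true,
    metadata: { threadId, ...extra },
  };
}

// Two turns of the same conversation share one threadId, so their
// traces can be grouped together in Opik.
const turn1 = threadTelemetry("conversation-123", { turn: 1 });
const turn2 = threadTelemetry("conversation-123", { turn: 2 });

console.log(turn1.metadata.threadId === turn2.metadata.threadId); // true
```

Each resulting object would be passed as `experimental_telemetry` to a `generateText` call, as in the examples below.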
### Telemetry Settings

Use `OpikExporter.getSettings()` to configure telemetry for individual AI SDK calls:
```typescript
const result = await generateText({
  model: openai("gpt-5-nano"),
  prompt: "Tell a joke",
  experimental_telemetry: OpikExporter.getSettings({
    name: "custom-trace-name",
    // Optional: set threadId per request (overrides exporter-level threadId)
    metadata: {
      threadId: "conversation-456",
    },
  }),
});
```

Or use the basic telemetry settings:
```typescript
const result = await generateText({
  model: openai("gpt-5-nano"),
  prompt: "Tell a joke",
  experimental_telemetry: { isEnabled: true },
});
```

## Viewing Traces
To view your traces:
1. Sign in to your Comet account
2. Navigate to the Opik section
3. Select your project to view all traces
4. Click on a specific trace to see the detailed execution flow
## Debugging
To enable more verbose logging for troubleshooting:
```shell
OPIK_LOG_LEVEL=DEBUG
```

## Learn More
- Opik Vercel AI SDK Integration Guide
- Opik Documentation
- Vercel AI SDK Documentation
- Opik TypeScript SDK
## License
Apache 2.0
