opik-openai
v1.9.40
Opik TypeScript and JavaScript SDK integration with OpenAI
Opik OpenAI Integration
Seamlessly integrate Opik observability with your OpenAI applications to trace, monitor, and debug your LLM API calls.
Features
- 🔍 Comprehensive Tracing: Automatically trace OpenAI API calls and completions
- 📊 Hierarchical Visualization: View your OpenAI execution as a structured trace with parent-child relationships
- 📝 Detailed Metadata Capture: Record model names, prompts, completions, token usage, and custom metadata
- 🚨 Error Handling: Capture and visualize errors in your OpenAI API interactions
- 🏷️ Custom Tagging: Add custom tags to organize and filter your traces
- 🔄 Streaming Support: Full support for streamed completions and chat responses
Installation
# npm
npm install opik-openai
# yarn
yarn add opik-openai
# pnpm
pnpm add opik-openai
Requirements
- Node.js ≥ 18
- OpenAI SDK (openai ≥ 4.0.0)
- Opik SDK (automatically installed as a dependency)
Usage
import OpenAI from "openai";
import { trackOpenAI } from "opik-openai";
// Initialize the OpenAI client
const openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,
});

// Wrap the client with Opik tracking
const trackedOpenAI = trackOpenAI(openai, {
  // Optional configuration
  traceMetadata: {
    tags: ["production", "my-app"],
  },
});

// Use the tracked client just like the original
async function main() {
  const completion = await trackedOpenAI.chat.completions.create({
    model: "gpt-4",
    messages: [{ role: "user", content: "Hello world" }],
  });

  console.log(completion.choices[0].message);

  // Flush traces at the end of your application
  await trackedOpenAI.flush();
}

main().catch(console.error);
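Streaming responses are traced as well (see the Streaming Support feature above). The sketch below assumes the tracked client exposes the OpenAI SDK's standard streaming interface unchanged; the model and prompt are only illustrative.

// Stream a chat completion through the tracked client
async function streamExample() {
  const stream = await trackedOpenAI.chat.completions.create({
    model: "gpt-4",
    messages: [{ role: "user", content: "Tell me a short story" }],
    stream: true,
  });

  // Consume the stream as usual
  for await (const chunk of stream) {
    process.stdout.write(chunk.choices[0]?.delta?.content ?? "");
  }

  // Flush traces before the process exits
  await trackedOpenAI.flush();
}

Errors raised during an API call are captured on the trace (see the Error Handling feature above). A minimal pattern, assuming you still want pending traces flushed when a call fails:

try {
  await trackedOpenAI.chat.completions.create({
    model: "gpt-4",
    messages: [{ role: "user", content: "Hello world" }],
  });
} catch (error) {
  // Per the Error Handling feature, the failed call is recorded on the trace;
  // handle or rethrow as needed
  console.error(error);
} finally {
  await trackedOpenAI.flush();
}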
Viewing Traces
To view your traces:
- Sign in to your Comet account
- Navigate to the Opik section
- Select your project to view all traces
- Click on a specific trace to see the detailed execution flow
Learn More
License
Apache 2.0
