@mlflow/openai
v0.2.0
OpenAI integration package for MLflow Tracing
MLflow Typescript SDK - OpenAI
Seamlessly integrate MLflow Tracing with OpenAI to automatically trace your OpenAI API calls.
| Package          | Description                                  |
| ---------------- | -------------------------------------------- |
| `@mlflow/openai` | Auto-instrumentation integration for OpenAI. |
Installation
```bash
npm install @mlflow/openai
```

The package declares the `@mlflow/core` and `openai` packages as peer dependencies. Depending on your package manager, you may need to install these two packages separately.
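If your package manager does not install peer dependencies automatically, you can add them explicitly alongside the integration package (the package names come from the peer-dependency note above):

```shell
# Install the integration together with its peer dependencies
npm install @mlflow/openai @mlflow/core openai
```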
Quickstart
Start an MLflow Tracking Server. If you have a local Python environment, you can run the following commands:

```bash
pip install mlflow
mlflow server --backend-store-uri sqlite:///mlruns.db --port 5000
```

If you don't have a local Python environment, MLflow also supports Docker deployment and managed services. See the Self-Hosting Guide to get started.
Instantiate the MLflow SDK in your application:

```typescript
import * as mlflow from '@mlflow/core';

mlflow.init({
  trackingUri: 'http://localhost:5000',
  experimentId: '<experiment-id>',
});
```

Create a trace:
```typescript
import { OpenAI } from 'openai';
import { tracedOpenAI } from '@mlflow/openai';

// Wrap the OpenAI client with the tracedOpenAI function
const client = tracedOpenAI(new OpenAI());

// Invoke the client as usual
const response = await client.chat.completions.create({
  model: 'o4-mini',
  messages: [
    { role: 'system', content: 'You are a helpful weather assistant.' },
    { role: 'user', content: "What's the weather like in Seattle?" },
  ],
});
```

View traces in the MLflow UI:
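To build intuition for what a wrapper like `tracedOpenAI` does, here is a minimal, hypothetical sketch of proxy-based auto-instrumentation: every method call on the wrapped client is timed and recorded, while the client's API surface stays unchanged. This is not the actual `@mlflow/openai` implementation, and the `fakeClient` below is a toy stand-in, not the real OpenAI SDK.

```typescript
// Hypothetical sketch of auto-instrumentation via a Proxy.
// NOT the actual @mlflow/openai implementation.

type SpanLog = { name: string; durationMs: number };
const spans: SpanLog[] = [];

// Recursively proxy an object so every function call is timed and logged
// under a dotted path such as "chat.completions.create".
function traced<T extends object>(target: T, path = ''): T {
  return new Proxy(target, {
    get(obj, prop, receiver) {
      const value = Reflect.get(obj, prop, receiver);
      const name = path ? `${path}.${String(prop)}` : String(prop);
      if (typeof value === 'function') {
        return async (...args: unknown[]) => {
          const start = Date.now();
          try {
            // Forward the call unchanged to the underlying client
            return await (value as Function).apply(obj, args);
          } finally {
            // Record a span regardless of success or failure
            spans.push({ name, durationMs: Date.now() - start });
          }
        };
      }
      if (value !== null && typeof value === 'object') {
        return traced(value as object, name);
      }
      return value;
    },
  });
}

// Toy stand-in for an OpenAI-like client (no network calls).
const fakeClient = {
  chat: {
    completions: {
      async create(req: { model: string }) {
        return { model: req.model, choices: [{ message: { content: 'Sunny in Seattle.' } }] };
      },
    },
  },
};

const client = traced(fakeClient);
const res = await client.chat.completions.create({ model: 'o4-mini' });
console.log(res.choices[0].message.content); // response is unchanged by the wrapper
console.log(spans[0].name); // the call was recorded as "chat.completions.create"
```

The key design point is that the wrapper is transparent: callers keep the exact OpenAI client API, and instrumentation happens as a side effect of each call.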

Documentation 📘
Official documentation for the MLflow Typescript SDK can be found here.
License
This project is licensed under the Apache License 2.0.
