langfuse-xsai
v0.0.1
Langfuse integration for xsai.
WIP: This project was generated by AI after analyzing Langfuse, OpenAI, and xsai. Self-testing confirms that trace data is captured, but stability is not guaranteed and bugs may exist. Currently, only the methods exported by xsai are traced; support for tracing agents built on top of xsai is still under discussion and development.
Installation
npm install xsai langfuse-xsai @opentelemetry/sdk-node @langfuse/otel
Configuration
Configure your environment variables in .env:
LANGFUSE_SECRET_KEY="sk-lf-..."
LANGFUSE_PUBLIC_KEY="pk-lf-..."
LANGFUSE_BASE_URL="https://cloud.langfuse.com" # 🇪🇺 EU region
# LANGFUSE_BASE_URL="https://us.cloud.langfuse.com" # 🇺🇸 US region
# LANGFUSE_BASE_URL="http://localhost:3000" # Self-hosted
SDK Initialization
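Note that the values above are read from the process environment; Node.js does not load .env automatically. One way to load it (a sketch assuming Node.js 20.6 or newer, with index.js as a placeholder entry point) is the built-in --env-file flag:

```shell
# Load variables from .env before starting the app (Node.js 20.6+).
# "index.js" is a placeholder for your actual entry point.
node --env-file=.env index.js
```

On older Node.js versions, a loader such as the dotenv package can be imported at the top of the entry file instead.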
Initialize the OpenTelemetry SDK with Langfuse:
import { NodeSDK } from "@opentelemetry/sdk-node";
import { LangfuseSpanProcessor } from "@langfuse/otel";
const sdk = new NodeSDK({
spanProcessors: [new LangfuseSpanProcessor()],
});
sdk.start();
Usage
Wrap your xsai instance or functions with observeXsai:
import { observeXsai } from "langfuse-xsai";
import * as xsai from "xsai";
// Wrap the entire SDK
const observedXsai = observeXsai(xsai, {
generationName: "Traced generation",
generationMetadata: { someMetadataKey: "someValue" },
sessionId: "session-id",
userId: "user-id",
tags: ["tag1", "tag2"],
});
// Use as normal
const { text } = await observedXsai.generateText({
apiKey: process.env.OPENAI_API_KEY!,
baseURL: "https://open.bigmodel.cn/api/paas/v4/",
messages: [{ role: "user", content: "Tell me a story about a dog." }],
model: "gpt-4o",
});
Custom Trace Properties
You can customize the trace properties by passing an options object to observeXsai:
const observedXsai = observeXsai(xsai, {
generationName: "Traced generation", // Custom name for the generation
generationMetadata: { someMetadataKey: "someValue" }, // Custom metadata
sessionId: "session-id", // Session ID
userId: "user-id", // User ID
tags: ["tag1", "tag2"], // Custom tags
});
Note for CLI / Short-lived processes
If you are running this in a CLI or a short-lived process, you must manually flush the traces before the process exits:
import { NodeSDK } from "@opentelemetry/sdk-node";
import { LangfuseSpanProcessor } from "@langfuse/otel";
import { observeXsai } from "langfuse-xsai";
import * as xsai from "xsai";
const processor = new LangfuseSpanProcessor();
const sdk = new NodeSDK({
spanProcessors: [processor],
});
sdk.start();
const observedXsai = observeXsai(xsai);
await observedXsai.generateText({
// ... options
});
// Flush traces before exit
await processor.forceFlush();
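If the traced call can throw, the flush is easy to skip on the error path; a try/finally wrapper makes sure it runs on every exit. A minimal sketch, assuming the same setup as above (sdk.shutdown() is the standard NodeSDK method for a clean shutdown; everything else comes from the snippets in this README):

```typescript
import { NodeSDK } from "@opentelemetry/sdk-node";
import { LangfuseSpanProcessor } from "@langfuse/otel";
import { observeXsai } from "langfuse-xsai";
import * as xsai from "xsai";

const processor = new LangfuseSpanProcessor();
const sdk = new NodeSDK({ spanProcessors: [processor] });
sdk.start();

const observedXsai = observeXsai(xsai);

try {
  await observedXsai.generateText({
    // ... options
  });
} finally {
  // Runs whether generateText succeeded or threw, so no spans are lost.
  await processor.forceFlush();
  // Optionally also stop the OpenTelemetry SDK cleanly.
  await sdk.shutdown();
}
```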