
langfuse-xsai

v0.0.1

Langfuse integration for xsai SDK

Downloads

94

Readme

langfuse-xsai

Langfuse integration for xsai.

WIP: This project was generated by AI after analyzing Langfuse, OpenAI, and XSAI. Self-testing confirms that trace data is reported, but stability is not guaranteed and bugs may exist. Currently only the methods provided by XSAI itself are traced; support for tracing agents built on top of XSAI is still under discussion and development.

Installation

npm install xsai langfuse-xsai @opentelemetry/sdk-node @langfuse/otel

Configuration

Configure your environment variables in .env:

LANGFUSE_SECRET_KEY="sk-lf-..."
LANGFUSE_PUBLIC_KEY="pk-lf-..."
LANGFUSE_BASE_URL="https://cloud.langfuse.com" # 🇪🇺 EU region
# LANGFUSE_BASE_URL="https://us.cloud.langfuse.com" # 🇺🇸 US region
# LANGFUSE_BASE_URL="http://localhost:3000" # Self-hosted
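
The Langfuse span processor reads these credentials from the environment. If you keep them in a .env file rather than exporting them in your shell, make sure the file is loaded before the SDK is initialized. A minimal sketch using the dotenv package (an assumption here; it is not part of this project's install command):

// Load .env before anything that reads process.env
// (assumes `dotenv` is installed: npm install dotenv)
import "dotenv/config";

// LANGFUSE_SECRET_KEY, LANGFUSE_PUBLIC_KEY and LANGFUSE_BASE_URL
// are now available to the LangfuseSpanProcessor.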

SDK Initialization

Initialize the OpenTelemetry SDK with Langfuse:

import { NodeSDK } from "@opentelemetry/sdk-node";
import { LangfuseSpanProcessor } from "@langfuse/otel";

const sdk = new NodeSDK({
  spanProcessors: [new LangfuseSpanProcessor()],
});

sdk.start();
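
The SDK must be started before any calls you want traced. A common pattern (an assumption about project layout, not something this package requires) is to keep the snippet above in its own module and import it first in your entry point:

// index.ts (hypothetical entry point)
import "./instrumentation"; // runs sdk.start() as a side effect

import { observeXsai } from "langfuse-xsai";
import * as xsai from "xsai";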

Usage

Wrap your xsai instance or functions with observeXsai:

import { observeXsai } from "langfuse-xsai";
import * as xsai from "xsai";

// Wrap the entire SDK
const observedXsai = observeXsai(xsai, {
  generationName: "Traced generation",
  generationMetadata: { someMetadataKey: "someValue" },
  sessionId: "session-id",
  userId: "user-id",
  tags: ["tag1", "tag2"],
});

// Use as normal
const { text } = await observedXsai.generateText({
  apiKey: process.env.OPENAI_API_KEY!,
  baseURL: "https://open.bigmodel.cn/api/paas/v4/",
  messages: [{ role: "system", content: "Tell me a story about a dog." }],
  model: "gpt-4o",
});

Custom Trace Properties

You can customize the trace properties by passing an options object to observeXsai:

const observedXsai = observeXsai(xsai, {
  generationName: "Traced generation", // Custom name for the generation
  generationMetadata: { someMetadataKey: "someValue" }, // Custom metadata
  sessionId: "session-id", // Session ID
  userId: "user-id", // User ID
  tags: ["tag1", "tag2"], // Custom tags
});

Note for CLI / Short-lived processes

If you are running this in a CLI or a short-lived process, you must manually flush the traces before the process exits:

import { NodeSDK } from "@opentelemetry/sdk-node";
import { LangfuseSpanProcessor } from "@langfuse/otel";
import { observeXsai } from "langfuse-xsai";
import * as xsai from "xsai";

const processor = new LangfuseSpanProcessor();
const sdk = new NodeSDK({
  spanProcessors: [processor],
});
sdk.start();

const observedXsai = observeXsai(xsai);

await observedXsai.generateText({
  // ... options
});

// Flush traces before exit
await processor.forceFlush();
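
Alternatively, shutting down the OpenTelemetry SDK flushes and closes its registered span processors, including the Langfuse one. This is standard NodeSDK behaviour rather than anything specific to langfuse-xsai:

// Shutting down the SDK flushes pending spans via its span processors
await sdk.shutdown();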
