
v0.2.3
@sweetoburrito/backstage-plugin-ai-assistant-backend-module-callback-provider-langfuse

A callback provider module that integrates Langfuse observability into the Backstage AI Assistant backend, enabling tracing, monitoring, and analytics of LLM interactions.

This README explains how the provider works, when to use it, configuration options, and how to wire it into your Backstage backend.

Features

  • Automatically trace all LLM calls, agent executions, and tool invocations through Langfuse.
  • Track token usage, costs, and performance metrics for each conversation.
  • Debug AI interactions with detailed execution traces in the Langfuse dashboard.
  • Monitor user behavior and conversation patterns across your organization.
  • Integrate with both Langfuse Cloud and self-hosted Langfuse instances.
  • Use OpenTelemetry for comprehensive tracing alongside LangChain callbacks.

When to use

Use this module if you want to:

  • Monitor and debug your AI Assistant interactions in production
  • Track LLM costs and token usage across models and users
  • Analyze conversation patterns and user behavior
  • Maintain audit logs of AI interactions for compliance
  • Optimize prompts and model performance based on real usage data
  • Get visibility into agent decision-making and tool usage

Configuration

Add the provider configuration to your Backstage app-config.yaml or app-config.local.yaml under aiAssistant.callbacks.langfuse.

Minimum configuration keys (example):

aiAssistant:
  callbacks:
    langfuse:
      secretKey: ${LANGFUSE_SECRET_KEY}
      publicKey: ${LANGFUSE_PUBLIC_KEY}
      baseUrl: https://cloud.langfuse.com

Field descriptions

  • secretKey - Your Langfuse secret API key (starts with sk-lf-). Marked as secret in configuration.
  • publicKey - Your Langfuse public API key (starts with pk-lf-). Marked as secret in configuration.
  • baseUrl - The Langfuse instance URL (e.g., https://cloud.langfuse.com, https://us.cloud.langfuse.com, or your self-hosted URL).

The exact keys required depend on your Langfuse configuration. Check the provider's config.d.ts in the package for the canonical types used by the module.
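To make the key formats above concrete, here is a small validation sketch you could run before starting the backend. It only encodes the conventions described in this section (the `sk-lf-`/`pk-lf-` prefixes and a well-formed base URL); the function and type names are hypothetical and not part of the module's API.

```typescript
// Illustrative only: mirrors the config keys documented above.
interface LangfuseConfig {
  secretKey: string;
  publicKey: string;
  baseUrl: string;
}

// Returns a list of human-readable problems; empty means the config looks sane.
function validateLangfuseConfig(config: LangfuseConfig): string[] {
  const errors: string[] = [];
  if (!config.secretKey.startsWith('sk-lf-')) {
    errors.push('secretKey should start with "sk-lf-"');
  }
  if (!config.publicKey.startsWith('pk-lf-')) {
    errors.push('publicKey should start with "pk-lf-"');
  }
  try {
    // URL constructor throws on malformed URLs (e.g. missing protocol).
    new URL(config.baseUrl);
  } catch {
    errors.push('baseUrl must be a valid URL');
  }
  return errors;
}
```

A check like this can catch a swapped public/secret key pair early, before the provider silently fails to send traces.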

Install

Install the module into your Backstage backend workspace:

yarn workspace backend add @sweetoburrito/backstage-plugin-ai-assistant-backend-module-callback-provider-langfuse

Wire the provider into your backend

Add the provider module import to your backend entrypoint (usually packages/backend/src/index.ts):

// packages/backend/src/index.ts

// other backend modules...
backend.add(import('@sweetoburrito/backstage-plugin-ai-assistant-backend'));

// Add the Langfuse callback provider
backend.add(
  import(
    '@sweetoburrito/backstage-plugin-ai-assistant-backend-module-callback-provider-langfuse'
  ),
);

Restart your backend after adding the provider so it registers with the AI Assistant plugin.

What gets tracked

The Langfuse callback provider automatically captures:

  • Conversations: Full conversation history with user and assistant messages
  • Model calls: All LLM invocations including prompts, completions, and parameters
  • Tool usage: When agents use tools like search, catalog lookups, or custom tools
  • Token metrics: Input/output tokens for cost and usage tracking
  • Performance: Latency and execution time for each operation
  • Errors: Failed requests with error messages and stack traces
  • Metadata: User entity references, model IDs, session IDs, and custom tags
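As a rough mental model, the captured fields listed above could be pictured as a record like the following. These type and field names are hypothetical, chosen for illustration; they are not the provider's actual schema (consult the Langfuse dashboard or API for the real trace format).

```typescript
// Hypothetical shape for one captured trace (illustrative only).
interface TrackedTrace {
  sessionId: string;     // conversation/session identifier
  userEntityRef: string; // e.g. 'user:default/jdoe'
  modelId: string;       // which LLM handled the call
  tags: string[];        // e.g. ['backstage-ai-assistant', 'chat']
  inputTokens: number;   // prompt tokens
  outputTokens: number;  // completion tokens
  latencyMs: number;     // end-to-end execution time
  error?: string;        // present when a request failed
}

// Token totals are the basis for the cost tracking mentioned above.
function totalTokens(trace: TrackedTrace): number {
  return trace.inputTokens + trace.outputTokens;
}
```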

Verification

Once configured and running:

  1. Use the AI Assistant in Backstage to ask a question
  2. Log in to your Langfuse dashboard
  3. Navigate to the Traces section
  4. You should see traces appearing with tags like backstage-ai-assistant and chat

Each trace includes:

  • User queries and AI responses
  • Model calls with prompts and completions
  • Token usage and costs
  • Tool invocations
  • Performance metrics
  • User and session information
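If you export traces programmatically and want to confirm which ones came from this provider, you can filter on the tags mentioned in the verification steps. This is a sketch under the assumption that each exported trace exposes a plain `tags` array; the function name is hypothetical.

```typescript
// Keep only traces tagged by this provider (field names assumed, not guaranteed).
function assistantTraces<T extends { tags: string[] }>(traces: T[]): T[] {
  return traces.filter(
    t => t.tags.includes('backstage-ai-assistant') && t.tags.includes('chat'),
  );
}
```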

Additional Resources