
LaunchDarkly AI SDK for Server-Side JavaScript


⛔️⛔️⛔️⛔️

[!CAUTION] This library is an alpha version and should not be considered ready for production use while this message is visible.

☝️☝️☝️☝️☝️☝️

LaunchDarkly overview

LaunchDarkly is a feature management platform that serves over 100 billion feature flags daily to help teams build better software, faster. Get started using LaunchDarkly today!


Quick Setup

This assumes that you have already installed the LaunchDarkly Node.js (server-side) SDK, or a compatible edge SDK.

  1. Install this package with npm or yarn:
npm install @launchdarkly/server-sdk-ai --save
# or
yarn add @launchdarkly/server-sdk-ai
  2. Create an AI SDK instance:
import { initAi } from '@launchdarkly/server-sdk-ai';

// The ldClient instance should be created based on the instructions in the relevant SDK.
const aiClient = initAi(ldClient);
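
Putting both steps together, a minimal end-to-end setup might look like the sketch below. It assumes the @launchdarkly/node-server-sdk package and an SDK key in a LAUNCHDARKLY_SDK_KEY environment variable; adapt the client creation to whichever server-side or edge SDK you use.

import { init } from '@launchdarkly/node-server-sdk';
import { initAi } from '@launchdarkly/server-sdk-ai';

// Create the underlying LaunchDarkly client (the SDK key location is an assumption here).
const ldClient = init(process.env.LAUNCHDARKLY_SDK_KEY);

// Wait for the client to load flag data before evaluating AI configurations.
await ldClient.waitForInitialization({ timeout: 10 });

const aiClient = initAi(ldClient);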

Setting Default AI Configurations

When retrieving AI configurations, you need to provide default values that will be used if the configuration is not available from LaunchDarkly:

Fully Configured Default

const defaultConfig = {
  enabled: true,
  model: { 
    name: 'gpt-4',
    parameters: { temperature: 0.7, maxTokens: 1000 }
  },
  messages: [
    { role: 'system', content: 'You are a helpful assistant.' }
  ]
};

Disabled Default

const defaultConfig = {
  enabled: false
};

Retrieving AI Configurations

The config method retrieves AI configurations from LaunchDarkly, with support for dynamic variables and fallback values.
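
Both config and createChat take an evaluation context. The example below uses a minimal context sketch; the kind, key, and attribute values shown are placeholders for illustration:

const context = {
  kind: 'user',
  key: 'user-key-123abc', // stable, unique identifier for this user
  name: 'Sandy',          // optional attributes can be used in targeting rules
};

With a context and the defaults defined above in place, retrieve the configuration: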

const aiConfig = await aiClient.config(
  aiConfigKey,
  context,
  defaultConfig,
  { myVariable: 'My User Defined Variable' } // Variables for template interpolation
);

// Ensure configuration is enabled
if (aiConfig.enabled) {
  const { messages, model, tracker } = aiConfig;
  // Use with your AI provider
}
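
For example, the retrieved messages and model settings can be passed directly to a provider client. The sketch below assumes the official OpenAI Node SDK and OpenAI-compatible message roles; the recommended approach using LaunchDarkly AI Provider packages is shown under Advanced Usage with Providers below.

import OpenAI from 'openai';

const openai = new OpenAI(); // assumes OPENAI_API_KEY is set in the environment

if (aiConfig.enabled) {
  const completion = await openai.chat.completions.create({
    model: aiConfig.model?.name ?? 'gpt-4',
    messages: aiConfig.messages ?? [],
    temperature: aiConfig.model?.parameters?.temperature,
    max_tokens: aiConfig.model?.parameters?.maxTokens,
  });
  console.log(completion.choices[0].message.content);
}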

TrackedChat for Conversational AI

TrackedChat provides a high-level interface for conversational AI with automatic conversation management and metrics tracking:

  • Automatically configures models based on AI configuration
  • Maintains conversation history across multiple interactions
  • Automatically tracks token usage, latency, and success rates
  • Works with any supported AI provider (see AI Providers for available packages)

Using TrackedChat

// Use the same defaultConfig from the retrieval section above
const chat = await aiClient.createChat(
  'customer-support-chat',
  context,
  defaultConfig,
  { customerName: 'John' }
);

if (chat) {
  // Simple conversation flow - metrics are automatically tracked by invoke()
  const response1 = await chat.invoke('I need help with my order');
  console.log(response1.message.content);
  
  const response2 = await chat.invoke("What's the status?");
  console.log(response2.message.content);
  
  // Access conversation history
  const messages = chat.getMessages();
  console.log(`Conversation has ${messages.length} messages`);
}

Advanced Usage with Providers

For more control, you can use the configuration directly with AI providers. We recommend using LaunchDarkly AI Provider packages when available:

Using AI Provider Packages

import { LangChainProvider } from '@launchdarkly/server-sdk-ai-langchain';

const aiConfig = await aiClient.config(aiConfigKey, context, defaultConfig);

// Create LangChain model from configuration
const llm = await LangChainProvider.createLangChainModel(aiConfig);

// Use with tracking
const response = await aiConfig.tracker.trackMetricsOf(
  LangChainProvider.getAIMetricsFromResponse,
  () => llm.invoke(messages)
);

console.log('AI Response:', response.content);

Using Custom Providers

import { LDAIMetrics } from '@launchdarkly/server-sdk-ai';

const aiConfig = await aiClient.config(aiConfigKey, context, defaultConfig);

// Define custom metrics mapping for your provider
const mapCustomProviderMetrics = (response: any): LDAIMetrics => ({
  success: true,
  usage: {
    total: response.usage?.total_tokens || 0,
    input: response.usage?.prompt_tokens || 0,
    output: response.usage?.completion_tokens || 0,
  }
});

// Use with custom provider and tracking
const result = await aiConfig.tracker.trackMetricsOf(
  mapCustomProviderMetrics,
  () => customProvider.generate({
    messages: aiConfig.messages || [],
    model: aiConfig.model?.name || 'custom-model',
    temperature: aiConfig.model?.parameters?.temperature ?? 0.5,
  })
);

console.log('AI Response:', result.content);

Contributing

We encourage pull requests and other contributions from the community. Check out our contributing guidelines for instructions on how to contribute to this SDK.

About LaunchDarkly

  • LaunchDarkly is a continuous delivery platform that provides feature flags as a service and allows developers to iterate quickly and safely. We allow you to easily flag your features and manage them from the LaunchDarkly dashboard. With LaunchDarkly, you can:
    • Roll out a new feature to a subset of your users (like a group of users who opt-in to a beta tester group), gathering feedback and bug reports from real-world use cases.
    • Gradually roll out a feature to an increasing percentage of users, and track the effect that the feature has on key metrics (for instance, how likely is a user to complete a purchase if they have feature A versus feature B?).
    • Turn off a feature that you realize is causing performance problems in production, without needing to re-deploy, or even restart the application with a changed configuration file.
    • Grant access to certain features based on user attributes, like payment plan (e.g., users on the ‘gold’ plan get access to more features than users on the ‘silver’ plan).
    • Disable parts of your application to facilitate maintenance, without taking everything offline.
  • LaunchDarkly provides feature flag SDKs for a wide variety of languages and technologies. Check out our documentation for a complete list.
  • Explore LaunchDarkly