JavaScript client for the Langflow API

This package provides an easy way to use the Langflow API to run flows from within server-side JavaScript applications.

Installation

With npm

npm install @datastax/langflow-client

With yarn

yarn add @datastax/langflow-client

With pnpm

pnpm add @datastax/langflow-client

Prerequisites

To use this Langflow client you will need a DataStax account, which gives you access to DataStax Langflow. You will also need a flow before you can call its API. If you don't already have one, you can get started with the Basic Prompting Flow.

You can also use this with an installation of open-source Langflow.

Configuration

DataStax Langflow

You will need a Langflow API key. You provide the API key when you create a new client. You will also need your Langflow ID.

You can generate API keys and find your Langflow ID on the API modal in the Langflow flow editor.

Open-source Langflow

You will need an installation of Langflow. You may also require an API key if you have set up authentication on Langflow.

Usage

You can import the client like so:

import { LangflowClient } from "@datastax/langflow-client";

Initialization

You can then create a new client object with the following options:

  • baseUrl: This is set to https://api.langflow.astra.datastax.com by default. If you are running your own instance of Langflow you will need to provide its URL
  • langflowId: The ID of your organisation, which can be found in the API modal of the flow editor. This is not required for your own instance of Langflow
  • apiKey: A Langflow API key that can be generated within your DataStax account or in the settings of open-source Langflow

// using DataStax Langflow, you do not need to provide the baseUrl
const dsLangflowClient = new LangflowClient({ langflowId, apiKey });

// for open-source langflow, you will need to provide a baseUrl and optionally an apiKey
const baseUrl = "http://localhost:7860";
const apiKey = "sk-...";
const osLangflowClient = new LangflowClient({ baseUrl, apiKey });

Running a flow

Langflow documentation for running a flow.

Once you have a client, you can create a reference to a flow using its flow ID. This can be found in the API modal in Langflow.

const flow = client.flow(flowId);

You can run a flow by calling run with the text input to the flow:

const response = await client.flow(flowId).run(input);

You can add tweaks to a flow like so:

const response = await client
  .flow(flowId)
  .tweak(tweakName, tweakOptions)
  .run(input);

Or you can pass all tweaks as an object:

const response = await client.flow(flowId).run(input, { tweaks });

You can also pass input and output options, as well as a session ID.

import { InputTypes, OutputTypes } from "@datastax/langflow-client/consts";

const response = await client.flow(flowId).run(input, {
  input_type: InputTypes.CHAT,
  output_type: OutputTypes.CHAT,
  session_id,
  tweaks,
});

The available input types are "chat", "text" and "any". The available output types are "chat", "text", "any" and "debug". The default for both is "chat".

Flow responses

Langflow is very flexible in its output, so the FlowResponse object gives you raw access to the sessionId and the outputs.

const response = await client.flow(flowId).run(input);
console.log(response.outputs);

There is one convenience function that returns the text of the first chat output message. If you only have one chat output component in your flow, this is a useful shortcut to get to that response.

const response = await client.flow(flowId).run(input);
console.log(response.chatOutputText());

Streaming

The Langflow API supports streaming responses. Instead of calling run on a Flow object, you can call stream with the same arguments and the response will be a ReadableStream of objects.

const response = await client.flow(flowId).stream(input);

for await (const event of response) {
  console.log(event);
}

There are three different events: add_message, token, and end. The events mean:

  • add_message: a message has been added to the chat; this can be a human input message or a response from the AI
  • token: a token that is emitted as part of a message being generated by the flow
  • end: all tokens have been returned; this event also contains a full FlowResponse

Event objects have the format:

{
  event: "add_message" | "token" | "end",
  data: object
}

The event.data is different per event type. The token event type is the simplest and looks like this:

{
  "event": "token",
  "data": {
    "chunk": "hello ",
    "id": "6686ff20-0c95-40bb-8879-fd90ed3d634e",
    "timestamp": "2025-02-12 22:18:04 UTC"
  }
}
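Putting these events together, here is a minimal sketch that accumulates the token chunks into the full message text. It relies only on the event shapes shown above; how you handle the end event beyond logging is up to you.

const response = await client.flow(flowId).stream(input);

let message = "";
for await (const event of response) {
  if (event.event === "token") {
    // each token event carries a chunk of the generated text
    message += event.data.chunk;
  } else if (event.event === "end") {
    // all tokens have been returned, so the message is complete
    console.log(message);
  }
}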

There's more documentation and examples of a streaming response in the Langflow docs.

File upload

Images

The Langflow v1 file upload API supports uploading image files that can then be used in a flow.

Chat input components support files as input as well as text. You need to upload your file first, using the file upload function, then provide the file path to the flow as a tweak.

import { readFile } from "node:fs/promises";

const buffer = await readFile(path);
const file = new File([buffer], "image.jpg", { type: "image/jpeg" });

const flow = client.flow(flowId);
const uploadedFile = await flow.uploadFile(file);
console.log(uploadedFile);
// => { flowId: "XXX", filePath: "YYY" }

const response = await flow
  .tweak("ChatInput-abcd", { files: uploadedFile.filePath })
  .run("What can you see in this image?");

[!WARNING]
DataStax Langflow doesn't make file upload available; you will receive a 501 Not Implemented error.

File uploads

The Langflow v2 file upload API supports uploading files to a user. These can then be used with the File component.

[!WARNING] The v2 file upload and the File component don't support image uploads. For images you should use the v1 file upload API.

You can upload files like this:

const buffer = await readFile(path);
const file = new File([buffer], "document.pdf", { type: "application/pdf" });

const fileUpload = await client.files.upload(file);
console.log(fileUpload);
// => { path: "abc123/document.pdf", name: "document", ... }

You can then send them to the file component in a flow using a tweak.

const flow = client.flow(flowId);
flow.tweak("File-abc123", {
  path: fileUpload.path,
});

You can also list your uploaded files with the files.list() function:

const files = await client.files.list();

console.log(files);
// [{ path: "...", }, ...]

[!NOTE] TODO: other file methods available through the API: download, edit, delete, and delete all

Logs

Langflow documentation for the logs API.

Fetching the logs

You can fetch the logs for your Langflow instance.

const logs = await client.logs.fetch();

When fetching the logs, you can pass a timestamp and either a number of lines_before or lines_after the timestamp. For example, the following code will get the 10 log lines recorded after a timestamp of one hour ago:

const logs = await client.logs.fetch({
  timestamp: Date.now() - 60 * 60 * 1000,
  lines_after: 10,
});

Streaming the logs

You can also stream the logs by requesting the streaming endpoint.

for await (const log of await client.logs.stream()) {
  console.log(log);
}

Aborting requests

You can use the standard AbortController to cancel requests by passing a signal to the run or uploadFile functions. The functions will reject with a DOMException error with the name AbortError or, if you use AbortSignal.timeout, TimeoutError.

For example, when running the following code, if the entire request takes longer than 500ms, then the promise will reject and the error message will be "The operation was aborted due to timeout".

const signal = AbortSignal.timeout(500);
try {
  const response = await client.flow(flowId).run(input, { signal });
} catch (error) {
  console.error(error.message);
}
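If you want to cancel a request yourself rather than using a timeout, you can pass the signal from a standard AbortController. The following sketch aborts the run after one second; the timer is just for illustration, and you can call abort on any condition of your own.

const controller = new AbortController();

// abort the request after one second
setTimeout(() => controller.abort(), 1000);

try {
  const response = await client.flow(flowId).run(input, {
    signal: controller.signal,
  });
  console.log(response.chatOutputText());
} catch (error) {
  // a manual abort rejects with a DOMException named AbortError
  console.error(error.name);
}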

Contributing

To run and contribute to this library you can clone the repository:

git clone git@github.com:datastax/langflow-client-ts.git
cd langflow-client-ts

Install the dependencies:

npm install

Run the tests:

npm test

Transpile the TypeScript to JavaScript:

npm run build

Transpile the TypeScript to JavaScript in watch mode:

npm run build:watch

Lint the code:

npm run lint

Format the code:

npm run format

Check the formatting of the code:

npm run format:check