FluidTools


AI-powered API multi-tool agent with multi-provider support (OpenAI, Anthropic, Ollama, Gemini, Nebius)

Available on NPM and GitHub

Overview

FluidTools is a powerful NPM package that transforms REST API collections (Postman 2.1 JSON format) into intelligent AI agent tools. Built with TypeScript, it integrates seamlessly into any Node.js/TypeScript server, enabling you to quickly add AI agents that can interact with your APIs using natural language queries.

Key Features

  • 🚀 One-Click Tool Generation: Convert Postman collections to LangChain-compatible tools instantly
  • 🤖 Multi-Provider AI Support: Compatible with OpenAI, Anthropic, Ollama, Gemini, and Nebius
  • 🔧 LangGraph Integration: Robust agent orchestration with state management and memory
  • 📊 Semantic Tool Selection: Optional embedding-based tool filtering for large APIs
  • Human-in-the-Loop Security: Exact tool selection and user approval for sensitive operations
  • 🌍 Multi-Language Support: Babel integration for international chatbot deployment
  • 🌐 Server Agnostic: Integrates with any Express/Fastify/Koa server
  • TypeScript First: Full type safety with Zod schemas


Installation

npm install fluidtools

Quick Start

1. Convert Postman Collection to Tools

npx fluidtools ./api.json ./tools.ts

Or programmatically:

import fs from "node:fs";
import { postmanToLangChainCode } from "fluidtools";

// Read the Postman 2.1 collection and emit LangChain-compatible tool code.
const collection = JSON.parse(fs.readFileSync("./api.json", "utf-8"));
const code = postmanToLangChainCode(collection);
fs.writeFileSync("./tools.ts", code);
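
The exact output depends on your collection, but as a rough, hypothetical sketch (the endpoint, names, and schema below are invented for illustration, not the literal generated code), each request becomes a LangChain tool with a Zod schema along these lines:

import { DynamicStructuredTool } from "@langchain/core/tools";
import { z } from "zod";

// Hypothetical tool derived from a "GET /users/:id" request in the collection.
const getUserTool = new DynamicStructuredTool({
  name: "get_user",
  description: "Fetch a single user by id.",
  schema: z.object({
    id: z.string().describe("Path variable :id"),
  }),
  func: async ({ id }) => {
    const res = await fetch(`https://api.example.com/users/${id}`);
    return JSON.stringify(await res.json());
  },
});

// The Quick Start below imports generateTools from the generated file, so this
// sketch assumes the module exports a factory that returns the tool array.
export const generateTools = () => [getUserTool /* , ... */];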

2. Create AI Agent Server

import express from "express";
import { FluidToolsClient, loadProviderConfigFromEnv } from "fluidtools";
import { generateTools } from "./tools.ts"; // Generated tools

const app = express();
app.use(express.json());

const providerConfig = loadProviderConfigFromEnv();
const fluidClient = new FluidToolsClient(
  providerConfig,
  generateTools,
  "You are a helpful API assistant.",
  10, // max tool calls
  true // debug mode
);

app.get("/", async (req, res) => {
  // req.query.query may be undefined or an array; coerce it to a plain string.
  const query = String(req.query.query ?? "");
  const { authorization } = req.headers;

  // Expect "Authorization: Bearer <token>"; the token is forwarded to the tools.
  const token = authorization?.split(" ")[1];
  const response = await fluidClient.query(query, token);

  res.send({ message: response });
});

app.listen(8000);

3. Query Your AI Agent

curl -G "http://localhost:8000/" \
  --data-urlencode "query=Get user details and list their projects" \
  -H "Authorization: Bearer YOUR_TOKEN"

Architecture

System Architecture Diagram

graph TD
    A[Postman 2.1 JSON] --> B[CLI Tool<br/>fluidtools]
    B --> C[Tool Generation<br/>TypeScript + Zod Schemas]
    C --> D[FluidTools Client]

    D --> E[Optional Embedding Service<br/>Semantic Tool Selection]

    D --> F[System Prompt<br/>Custom Chatbots]
    F --> G[LangGraph Agent<br/>Orchestration & Memory]

    G --> H[Multi-Provider LLM Support]
    H --> I[Multiple Model Support]
    I --> J[Multi-Language Support<br/>Babel Integration]

    J --> K[Server Integration<br/>Express/Fastify/Koa]
    K --> L[API Exposed<br/>REST/WebSocket]

    subgraph "🔧 Tool Conversion Pipeline"
        A
        B
        C
    end

    subgraph "🤖 AI Agent Core"
        D
        F
        G
        H
        I
        J
    end

    subgraph "🌐 Integration Layer"
        K
        L
    end

    subgraph "⚡ Security & Control"
        M[Human-in-Loop<br/>Tool Confirmation]
        N[Exact Tool Selection<br/>Security Controls]
    end

    G --> M
    M --> N

    subgraph "Provider Ecosystem"
        O[OpenAI<br/>GPT-4, GPT-3.5]
        P[Anthropic<br/>Claude 3.5, Opus]
        Q[Ollama<br/>Local Models]
        R[Gemini<br/>2.5 Flash, Pro]
        S[Nebius<br/>Kimi-K2]
    end

    I --> O
    I --> P
    I --> Q
    I --> R
    I --> S

    L --> T[Chatbot UI<br/>Gradio/React/Web]

System Architecture Overview

  1. Postman Collection Processing

    • Parses Postman 2.1 JSON format
    • Extracts requests, parameters, bodies, and schemas
    • Generates TypeScript tools with automatic Zod validation
  2. Tool Generation Engine

    • Converts each API endpoint into a LangChain tool
    • Handles path variables, query parameters, headers
    • Supports all HTTP methods (GET, POST, PUT, DELETE, PATCH)
    • Auto-generates meaningful descriptions
  3. Multi-Provider LLM Integration

    • Unified interface for different AI providers
    • Configurable model selection and API keys
    • Consistent response formatting
  4. LangGraph Orchestration

    • Sequential tool execution with memory
    • State persistence using checkpointer
    • Built-in retry mechanisms and error handling
  5. Optional Embedding Layer

    • Semantic indexing of tool definitions
    • Cosine similarity-based tool selection (see the sketch after this list)
    • Reduces token usage for large toolsets
  6. Server Integration

    • Session-based conversation management
    • Tool call confirmation system
    • Rate limiting and authentication
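
The semantic selection step in item 5 can be pictured as a cosine-similarity ranking over tool-description embeddings. The sketch below is illustrative only and is not FluidTools' internal implementation; embed() stands in for whichever embedding provider is configured:

// Illustrative only: rank tools by cosine similarity between the user query and
// each tool's description embedding, then keep the top K for the agent.
type EmbeddedTool = { name: string; description: string; vector: number[] };

const cosine = (a: number[], b: number[]) => {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
};

async function selectTools(
  query: string,
  tools: EmbeddedTool[],
  embed: (text: string) => Promise<number[]>, // assumed embedding function
  topK = 5
): Promise<string[]> {
  const queryVector = await embed(query);
  return tools
    .map((tool) => ({ tool, score: cosine(queryVector, tool.vector) }))
    .sort((a, b) => b.score - a.score)
    .slice(0, topK)
    .map((entry) => entry.tool.name);
}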

Data Flow


Postman Collection JSON
          │
          ▼
CLI Tool (fluidtools)
          │
          ▼
TypeScript Tool Code
          │
          ▼
Express/Fastify Server
          │
          ▼
FluidTools Client
          │
          ▼
LangGraph Agent
          │
          ▼
LLM Provider + Tools
          │
          ▼
API Calls + Responses
          │
          ▼
User-Friendly Chat Response

Demo 1: Gradio Integration (Public Testing)

Located in ./demo/server/, this demo provides a complete Express server with Gradio UI integration for testing your AI agents:

Features:

  • Web upload interface for Postman collections
  • Real-time chat with your AI agent
  • Provider selection (OpenAI, Anthropic, etc.)
  • Rate limiting for free tier testing
  • Tool confirmation dialogs
  • Session management

Backend Setup:

cd demo/server
npm install
npm start

Backend runs on http://localhost:3000

Frontend (Gradio UI):

cd demo/gradioServer
pip install -r requirements.txt
python app.py

Frontend runs on http://localhost:7860. Open it in your browser for the glassmorphic chat interface with drag-and-drop Postman collection upload and real-time AI chat.

Demo 2: Real-World Integration (Cloud API Example)

Located in ./demo2/backend/, this demo shows a production-ready integration with a cloud provider API:

Features:

  • Pre-generated tools from Ace Cloud API
  • Simplified server setup
  • Custom system prompts
  • Environment variable configuration
  • Tool approval workflows

This demo converts a comprehensive cloud API (instances, volumes, networks, billing, etc.) into AI tools.

Backend Setup:

cd demo2/backend
npm install
npm run dev

Backend runs on http://localhost:8000

Frontend (React App):

cd demo2/frontend
npm install
npm run dev

Frontend runs on http://localhost:5173 and features a modern React chat interface with:

  • 🎤 Voice input/output capabilities (STT/TTS)
  • 📱 Responsive design with markdown rendering
  • ✅ Tool approval dialogs for sensitive operations
  • 🔄 Real-time message streaming
  • 🎨 Beautiful UI with copy/retry functionality
  • 🔧 Advanced chatbot features

The React app connects to the backend API to provide a complete user experience for interacting with your AI agent.

API Reference

FluidToolsClient

Main class for managing AI agents.

new FluidToolsClient(
  providerConfig: ProviderConfig,
  toolsGenerator: Function,
  systemInstructions?: string,
  maxToolCalls?: number,
  debug?: boolean,
  expireAfterSeconds?: number,
  confirmationConfig?: ToolConfirmationConfig,
  toolsConfig?: Record<string, any>,
  embeddingConfig?: EmbeddingConfig
)

Key Methods

  • query(query: string, accessToken?: string): Execute natural language query
  • clearThread(accessToken?: string): Clear conversation memory
  • getPendingConfirmations(accessToken?: string): Check pending tool approvals
  • approveToolCall(toolCallId: string, accessToken?: string): Approve pending tool
  • rejectToolCall(toolCallId: string, accessToken?: string): Reject pending tool
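
Taken together, these methods support an approve-before-execute loop for sensitive calls. The sketch below uses only the methods listed above; the shape of the objects returned by getPendingConfirmations (an array of pending calls with ids) is an assumption to check against the package's types:

// Hedged sketch of the confirmation workflow; the pending item shape is assumed.
const answer = await fluidClient.query("Delete the staging instance", token);

const pending = await fluidClient.getPendingConfirmations(token);
for (const call of pending ?? []) {
  // Show the call to the user, then approve or reject it.
  await fluidClient.approveToolCall(call.id, token);
  // ...or: await fluidClient.rejectToolCall(call.id, token);
}

// Start a fresh conversation for this session when needed.
await fluidClient.clearThread(token);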

Provider Configuration

# Environment variables (.env)
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=sk-ant-...
OLLAMA_BASE_URL=http://localhost:11434

// Or configure programmatically
const config = {
  provider: "openai",
  model: "gpt-4",
  apiKey: process.env.OPENAI_API_KEY,
  temperature: 0.1
};
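
For a local model the same pattern should apply; the field names below mirror the OpenAI example and the OLLAMA_BASE_URL variable above, but they are assumptions to verify against the package's ProviderConfig type:

// Assumed to mirror the OpenAI example; check field names against ProviderConfig.
const ollamaConfig = {
  provider: "ollama",
  model: "llama3.1",
  baseUrl: process.env.OLLAMA_BASE_URL ?? "http://localhost:11434",
  temperature: 0.1
};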

CLI Usage

Generate tools from Postman collection:

fluidtools <input-file> [output-file] [--help]

# Examples
fluidtools api.json tools.ts
fluidtools ./collections/my-api.json

Contributing

  1. Fork the repository
  2. Create feature branch: git checkout -b feature/amazing-feature
  3. Commit changes: git commit -m 'Add amazing feature'
  4. Push to branch: git push origin feature/amazing-feature
  5. Open a Pull Request

License

ISC

Contributors

We'd like to thank all the amazing people who have contributed to FluidTools! 👥

Support


Built with ❤️ for developers who want AI-powered API interactions