
ollama-chain

v1.0.9

ollama-chain

A TypeScript/JavaScript library for building composable, chainable, and transactional chat flows with Ollama models. ollama-chain provides a fluent API to construct, manage, and execute chat conversations, including streaming and transaction support. Built on the Ollama JavaScript library.

Features

  • Fluent, chainable API for building chat prompts
  • System, user, and assistant message management
  • Streaming and non-streaming responses
  • Transaction support (begin, commit, rollback message history)
  • Customizable model, options, and response format
  • Easy integration with Ollama's API
  • Language support for multilingual conversations
  • Short response mode for concise answers
  • Logging capabilities for debugging and monitoring
  • TypeScript support with full type definitions

Installation

npm install ollama-chain

Usage

Basic Streaming Example

TypeScript

import OllamaChain from "ollama-chain";

const main = async () => {
    const ollamachain = OllamaChain();

    const response = await ollamachain()
        .model("gemma3:4b")
        .logger(true)                           // Enable logging
        .setLanguage("eng")                     // Set response language
        .shortResponse()                        // Enable short response mode
        .systemMessage("You are a helpful assistant.")
        .userMessage("What is the capital of France?")
        .stream({ temperature: 0.7, top_p: 0.9 });

    let responseText = "";
    for await (const chunk of response) {
        responseText += chunk.message?.content || "";
        console.log("responseText:", responseText);
    }
    console.log("Response finished.");
};

main();

Non-Streaming (Single Response)

const response = await ollamachain()
    .model("gemma3:4b")
    .systemMessage("You are a helpful assistant.")
    .userMessage("Tell me a joke.")
    .chat({ temperature: 0.7 });

console.log(response.message.content);

Transaction Example

const chain = ollamachain();

chain.trx();                         // begin a transaction
chain.userMessage("First message");
// ... add more messages
chain.rollback();                    // revert to the pre-transaction history

chain.trx();
chain.userMessage("Kept message");
chain.commit();                      // keep the changes and end the transaction

CommonJS

const { OllamaChain } = require('ollama-chain');

const main = async () => {
    const ollamachain = OllamaChain();

    const response = await ollamachain()
        .model("gemma3:4b")
        .systemMessage("You are a helpful assistant.")
        .userMessage("What is the capital of France?")
        .stream({ temperature: 0.7, top_p: 0.9 });

    let responseText = "";
    for await (const chunk of response) {
        responseText += chunk.message?.content || "";
        console.log("responseText:", responseText);
    }
    console.log("Response finished.");
};

main();

Advanced Features

Language Support

ollamachain()
    .setLanguage("ukr")  // Set Ukrainian language
    .userMessage("Tell me about Ukraine");

Short Response Mode

ollamachain()
    .shortResponse()  // Enable concise responses
    .userMessage("What is quantum computing?");

Step-by-Step Reasoning

ollamachain()
    .stepByStep()  // Enable step-by-step problem solving
    .userMessage("How do I solve a quadratic equation?");

Thinking Mode

ollamachain()
    .thinking()  // Instruct the model to show its thought process before answering
    .userMessage("Why is the sky blue?");

Logging

ollamachain()
    .logger(true)  // Enable logging for debugging
    .userMessage("Debug this conversation");

Examples

Check out more examples in the examples directory:

  • chat.ts - Interactive chat examples
  • prompt-builder.ts - Advanced prompt construction
  • stream.ts - Streaming response handling
  • transaction.ts - Message history management
  • set-language.ts - Multilingual conversation examples

Tested with gemma3:4b and compatible with other Ollama models.

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

License

MIT License - see LICENSE file for details.

API Reference

Core Methods

  • .model(modelName: string) — Set the Ollama model to use
  • .chat(options?: object) — Get a single response (non-streaming)
  • .stream(options?: object) — Get a streaming response (async iterable)
  • .execute(query: ChatRequestStream | ChatRequestBase) — Execute a custom query object directly

Message Management

  • .systemMessage(message: string, overload?: boolean) — Add or update the system message. When overload is true, the existing system message is replaced completely; when false (the default), the new text is appended to the existing system message with a newline separator
  • .userMessage(message: string) — Add a user message to the conversation
  • .assistantMessage(message: string) — Add an assistant message to the conversation
  • .getHistory() — Get the current message history array
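The append-versus-replace behavior of .systemMessage() can be illustrated with a small standalone sketch. The MessageStore class below is a hypothetical re-implementation for illustration only, not the library's source:

```typescript
type Message = { role: "system" | "user" | "assistant"; content: string };

// Illustrative sketch of the documented systemMessage semantics:
// append with a newline by default, replace when overload is true.
class MessageStore {
  private history: Message[] = [];

  systemMessage(content: string, overload = false): this {
    const existing = this.history.find((m) => m.role === "system");
    if (!existing) {
      this.history.unshift({ role: "system", content }); // first system message
    } else if (overload) {
      existing.content = content;                        // replace completely
    } else {
      existing.content += "\n" + content;                // append with newline
    }
    return this;
  }

  userMessage(content: string): this {
    this.history.push({ role: "user", content });
    return this;
  }

  getHistory(): Message[] {
    return this.history;
  }
}
```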

Response Formatting

  • .setLanguage(language: string) — Set the response language (e.g., "eng", "ukr")
  • .detailedResponse() — Configure the model to provide detailed, comprehensive responses
  • .shortResponse() — Configure the model to provide concise, brief responses
  • .thinking() — Instructs the model to write its thought process and reasoning before answering the question
  • .stepByStep() — Instructs the model to break down the problem and solve it step by step
  • .format(format?: ResponseFormat) — Set custom response format parameters
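As a rough mental model (an assumption for illustration, not the library's actual implementation), the formatting modifiers can be thought of as accumulating extra system-prompt instructions that get joined into the final prompt:

```typescript
// Hypothetical sketch: each modifier appends one instruction line,
// mirroring the chainable API above. Wording is invented for illustration.
class PromptModifiers {
  private instructions: string[] = [];

  setLanguage(language: string): this {
    this.instructions.push(`Respond in language: ${language}.`);
    return this;
  }

  shortResponse(): this {
    this.instructions.push("Keep the answer short and concise.");
    return this;
  }

  stepByStep(): this {
    this.instructions.push("Break the problem down and solve it step by step.");
    return this;
  }

  toSystemPrompt(): string {
    return this.instructions.join("\n");
  }
}
```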

Transaction Support

  • .trx() — Begin a transaction to track message history changes
  • .commit() — Save changes and end the current transaction
  • .rollback() — Revert message history to the state before the transaction started
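The transaction semantics described above (snapshot the history on trx(), restore it on rollback(), drop the snapshot on commit()) can be sketched standalone. The TxHistory class below is illustrative only and is not taken from the library:

```typescript
type Msg = { role: string; content: string };

// Snapshot-based transaction sketch over a message history array.
class TxHistory {
  private history: Msg[] = [];
  private snapshot: Msg[] | null = null;

  trx(): this {
    // Begin: remember the current history.
    this.snapshot = this.history.map((m) => ({ ...m }));
    return this;
  }

  commit(): this {
    // Keep all changes; the transaction ends.
    this.snapshot = null;
    return this;
  }

  rollback(): this {
    // Restore the pre-transaction history.
    if (this.snapshot) this.history = this.snapshot;
    this.snapshot = null;
    return this;
  }

  userMessage(content: string): this {
    this.history.push({ role: "user", content });
    return this;
  }

  getHistory(): Msg[] {
    return this.history;
  }
}
```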

Configuration & Debugging

  • .logger(isActive: boolean) — Enable or disable query logging for debugging
  • .keepAlive(param: string | number) — Set how long to keep the model loaded. Accepts a number (seconds) or a duration string ("300ms", "1.5h", "2h45m")
  • .toQuery(options?: object) — Get the raw query object for inspection or custom execution
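A duration string such as "2h45m" presumably decomposes into unit segments that sum to a total. The parser below is a hypothetical sketch of that conversion to seconds; the library's own parsing may differ:

```typescript
// Illustrative parser for keepAlive-style durations ("300ms", "1.5h", "2h45m").
// Numbers pass through unchanged (assumed to already be seconds).
function parseDurationSeconds(param: string | number): number {
  if (typeof param === "number") return param;
  const re = /(\d+(?:\.\d+)?)(ms|s|m|h)/g;
  const unitSeconds: Record<string, number> = { ms: 0.001, s: 1, m: 60, h: 3600 };
  let total = 0;
  let match: RegExpExecArray | null;
  while ((match = re.exec(param)) !== null) {
    total += parseFloat(match[1]) * unitSeconds[match[2]];
  }
  return total;
}
```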
