
llmjs2 v1.1.1

Abstract layer for LLM completion supporting multiple providers

llmjs2

A lightweight Node.js LLM library for building simple, personal AI applications

Supported Providers

  • Ollama - Connect to Ollama's cloud API
  • OpenRouter - Access multiple LLM models through OpenRouter

Installation

npm install llmjs2

Usage

llmjs2 supports three calling conventions:

Simple API (Auto-Detection)

import { completion } from 'llmjs2';

// Just provide a prompt - the library handles the rest
const result = await completion('Explain the use of llmjs2');

// Or provide a model and prompt
const result = await completion('ollama/minimax-m2.5:cloud', 'Explain the use of llmjs2');

How it works:

  • Looks for OLLAMA_API_KEY and OPEN_ROUTER_API_KEY environment variables
  • If only one is set, uses that provider
  • If both are set, randomly chooses one
  • Uses OLLAMA_DEFAULT_MODEL or defaults to minimax-m2.5:cloud for Ollama
  • Uses OPEN_ROUTER_DEFAULT_MODEL or defaults to openrouter/free for OpenRouter
  • If a model is provided, uses that model instead of the default
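The selection rules above can be sketched as a small standalone helper. This is a hypothetical illustration of the documented behavior, not the library's internal code; `pickProvider` and its signature are assumptions for the sketch.

```javascript
// Sketch of the documented auto-detection rules (hypothetical helper,
// not part of the llmjs2 API). Takes an env object so it is easy to test.
function pickProvider(env) {
  const hasOllama = Boolean(env.OLLAMA_API_KEY);
  const hasOpenRouter = Boolean(env.OPEN_ROUTER_API_KEY);
  if (!hasOllama && !hasOpenRouter) {
    throw new Error('No API key found: set OLLAMA_API_KEY or OPEN_ROUTER_API_KEY');
  }
  // If both keys are set, one provider is chosen at random
  const provider = hasOllama && hasOpenRouter
    ? (Math.random() < 0.5 ? 'ollama' : 'openrouter')
    : (hasOllama ? 'ollama' : 'openrouter');
  // Fall back to the documented default model for each provider
  const model = provider === 'ollama'
    ? (env.OLLAMA_DEFAULT_MODEL || 'minimax-m2.5:cloud')
    : (env.OPEN_ROUTER_DEFAULT_MODEL || 'openrouter/free');
  return { provider, model };
}
```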

Function-Based API

import { completion } from 'llmjs2';

// Using Ollama
const resultOllama = await completion('ollama/minimax-m2.5:cloud', 'Explain the use of llmjs2', 'your-api-key');

// Using OpenRouter
const resultOR = await completion('openrouter/openrouter/free', 'Explain the use of llmjs2', 'your-api-key');

Object-Based API

import { completion } from 'llmjs2';

// Using Ollama with system message
const resultOllama = await completion({
  model: 'ollama/minimax-m2.5:cloud',
  messages: [
    { role: 'system', content: 'You are a helpful AI assistant.' },
    { role: 'user', content: 'Explain the use of llmjs2.' }
  ],
  apiKey: 'your-api-key' // optional
});

// Using OpenRouter with system message
const resultOR = await completion({
  model: 'openrouter/openrouter/free',
  messages: [
    { role: 'system', content: 'You are a helpful AI assistant.' },
    { role: 'user', content: 'Explain the use of llmjs2.' }
  ],
  apiKey: 'your-api-key' // optional
});

Tools Support

llmjs2 supports function calling (tools) through the object-based API:

import { completion } from 'llmjs2';

const result = await completion({
  model: 'openrouter/openrouter/free',
  messages: [
    { role: 'user', content: 'What is the weather like in Paris?' }
  ],
  tools: [
    {
      type: 'function',
      function: {
        name: 'get_weather',
        description: 'Get the current weather in a given location',
        parameters: {
          type: 'object',
          properties: {
            location: {
              type: 'string',
              description: 'The city and state, e.g. San Francisco, CA'
            },
            unit: {
              type: 'string',
              enum: ['celsius', 'fahrenheit'],
              description: 'The temperature unit to use'
            }
          },
          required: ['location']
        }
      }
    }
  ]
});

// Result when tools are used:
// {
//   content: '',
//   tool_calls: [
//     {
//       id: 'call_123',
//       type: 'function',
//       function: {
//         name: 'get_weather',
//         arguments: '{"location": "Paris, France"}'
//       }
//     }
//   ]
// }
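When the model decides to call a tool, your code is responsible for executing it. A minimal dispatch loop over the result shape shown above might look like this; `handlers` and `runToolCalls` are hypothetical names, and the weather handler is a stub, not a real weather lookup.

```javascript
// Map tool names to local implementations (stubbed here for illustration)
const handlers = {
  get_weather: ({ location, unit = 'celsius' }) =>
    `It is 18 degrees ${unit} in ${location}`,
};

// Execute every tool call in a completion result of the shape shown above.
// Returns [] when the model answered with plain text instead of tool calls.
function runToolCalls(result) {
  if (!result.tool_calls) return [];
  return result.tool_calls.map((call) => {
    // `arguments` arrives as a JSON string, so it must be parsed first
    const args = JSON.parse(call.function.arguments);
    return { id: call.id, output: handlers[call.function.name](args) };
  });
}
```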

API Key Configuration

You can provide API keys in four ways:

1. Simple API (Environment Variables)

export OLLAMA_API_KEY=your-ollama-api-key
export OPEN_ROUTER_API_KEY=your-openrouter-api-key

# Optional: Set default models
export OLLAMA_DEFAULT_MODEL=minimax-m2.5:cloud
export OPEN_ROUTER_DEFAULT_MODEL=openrouter/free

const result = await completion('Your prompt');

2. Direct Parameter (Function API)

const result = await completion('ollama/minimax-m2.5:cloud', 'Your prompt', 'your-api-key');

3. Object Property (Object API)

const result = await completion({
  model: 'ollama/minimax-m2.5:cloud',
  messages: [{ role: 'user', content: 'Your prompt' }],
  apiKey: 'your-api-key'
});

4. Environment Variables (Function/Object API)

export OLLAMA_API_KEY=your-ollama-api-key
export OPEN_ROUTER_API_KEY=your-openrouter-api-key

// Function API
const result = await completion('ollama/minimax-m2.5:cloud', 'Your prompt');

// Object API
const result = await completion({
  model: 'ollama/minimax-m2.5:cloud',
  messages: [{ role: 'user', content: 'Your prompt' }]
});

Model Format

Models must be specified in the format: provider/model_name

The provider is the text before the first /, and the model name is everything after it.

Examples:

  • ollama/minimax-m2.5:cloud
  • ollama/llama2
  • openrouter/openrouter/free
  • openrouter/meta-llama/llama-2-70b-chat
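The split-at-first-slash rule matters because model names can themselves contain slashes, as the OpenRouter examples show. A sketch of the documented parsing rule (the `parseModel` helper is hypothetical, not part of the llmjs2 API):

```javascript
// Provider is the text before the first '/', model name is everything after
function parseModel(model) {
  const i = model.indexOf('/');
  if (i <= 0 || i === model.length - 1) {
    throw new Error(`Invalid model format: "${model}" (expected provider/model_name)`);
  }
  return { provider: model.slice(0, i), name: model.slice(i + 1) };
}
```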

Messages Format (Object API)

The messages parameter is an array of message objects with the following structure:

[
  { role: 'system', content: 'You are a helpful AI assistant.' },
  { role: 'user', content: 'What is the capital of France?' },
  { role: 'assistant', content: 'The capital of France is Paris.' },
  { role: 'user', content: 'What is its population?' }
]

Supported roles:

  • system - System instructions
  • user - User messages
  • assistant - Assistant responses
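These constraints can be checked before making a request. The following validator is a sketch of the rules above, not code from the library; `validateMessages` is a hypothetical helper name.

```javascript
// Roles documented as supported by the object API
const VALID_ROLES = new Set(['system', 'user', 'assistant']);

// Throw early if a messages array does not match the documented shape
function validateMessages(messages) {
  if (!Array.isArray(messages) || messages.length === 0) {
    throw new Error('messages must be a non-empty array');
  }
  for (const m of messages) {
    if (!VALID_ROLES.has(m.role)) {
      throw new Error(`Unsupported role: ${m.role}`);
    }
    if (typeof m.content !== 'string') {
      throw new Error('Each message needs a string content field');
    }
  }
  return messages;
}
```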

Tools Format (Object API)

The tools parameter is an array of tool definitions:

[
  {
    type: 'function',
    function: {
      name: 'function_name',
      description: 'Description of what the function does',
      parameters: {
        type: 'object',
        properties: {
          param1: {
            type: 'string',
            description: 'Description of parameter'
          }
        },
        required: ['param1']
      }
    }
  }
]
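Since every tool definition has the same nesting, a small factory can cut the boilerplate. `defineTool` is a hypothetical convenience helper, not part of the llmjs2 API; it simply builds the structure shown above.

```javascript
// Build a tool definition in the documented shape from its parts
function defineTool(name, description, properties, required = []) {
  return {
    type: 'function',
    function: {
      name,
      description,
      parameters: { type: 'object', properties, required },
    },
  };
}
```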

Error Handling

The library throws descriptive errors for:

  • Missing or invalid parameters
  • Missing API keys
  • API request failures
  • Invalid response formats
  • Request timeouts (60 seconds)
  • Invalid tools format

try {
  const result = await completion('Your prompt');
} catch (error) {
  console.error('Completion failed:', error.message);
}
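Because API request failures and 60-second timeouts surface as thrown errors, transient failures can be handled with a retry wrapper around any `completion` call. This is a generic sketch, not a llmjs2 feature; `completionWithRetry` and its parameters are assumed names.

```javascript
// Retry an async call with exponential backoff. `fn` is any function that
// returns a promise, e.g. () => completion('Your prompt').
async function completionWithRetry(fn, attempts = 3, baseDelayMs = 1000) {
  let lastError;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      // Backoff: baseDelayMs, 2 * baseDelayMs, 4 * baseDelayMs, ...
      await new Promise((resolve) => setTimeout(resolve, 2 ** i * baseDelayMs));
    }
  }
  throw lastError; // all attempts failed
}
```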

Example Programs

Main Example

A runnable example program is included in example.js. To run it:

# Set your API keys
export OLLAMA_API_KEY=your-ollama-api-key
export OPEN_ROUTER_API_KEY=your-openrouter-api-key

# Run the example
node example.js

The example program will:

  • Test simple API (auto-detection)
  • Test simple API with model
  • Test Ollama with function-based API
  • Test Ollama with object-based API
  • Test Ollama with tools
  • Test OpenRouter with function-based API
  • Test OpenRouter with object-based API
  • Test OpenRouter with tools
  • Display results and test summary

API Reference

completion(prompt)

Simple API (Prompt Only)

Parameters:

  • prompt (string): The prompt to send to the LLM

Returns:

  • Promise<string>: The completion result

Behavior:

  • Auto-detects provider based on available API keys
  • Uses OLLAMA_DEFAULT_MODEL or defaults to minimax-m2.5:cloud for Ollama
  • Uses OPEN_ROUTER_DEFAULT_MODEL or defaults to openrouter/free for OpenRouter
  • Randomly chooses provider if both API keys are set

completion(model, prompt)

Simple API (Model and Prompt)

Parameters:

  • model (string): Model identifier in format "provider/model_name"
  • prompt (string): The prompt to send to the LLM

Returns:

  • Promise<string>: The completion result

Behavior:

  • Auto-detects provider based on available API keys
  • Uses the provided model instead of the default
  • Randomly chooses provider if both API keys are set

completion(model, prompt, apiKey)

Function-Based API

Parameters:

  • model (string): Model identifier in format "provider/model_name"
  • prompt (string): The prompt to send to the LLM
  • apiKey (string, optional): API key (falls back to environment variables)

Returns:

  • Promise<string>: The completion result

completion(options)

Object-Based API

Parameters:

  • options (object): Configuration object
    • model (string): Model identifier in format "provider/model_name"
    • messages (array): Array of message objects with role and content
    • apiKey (string, optional): API key (falls back to environment variables)
    • tools (array, optional): Array of tool definitions

Returns:

  • Promise<string|object>: The completion result (string or object with tool calls)

Throws:

  • Error if model format is invalid
  • Error if prompt/messages is missing
  • Error if API key is not provided
  • Error if API request fails
  • Error if request times out (60 seconds)
  • Error if tools format is invalid

License

MIT