@markwylde/ailib

v1.6.1

An AI library for calling OpenRouter in JavaScript

ailib

A lightweight AI client library for Node.js that provides a simple interface for working with AI models through OpenRouter and Ollama.

Features

  • Simple thread-based conversation management
  • Streaming responses with events
  • Tool calling support
  • TypeScript support with proper typing
  • Modern async/await API
  • Support for OpenRouter and Ollama providers
  • Model pricing and cost tracking
  • Configurable model options
  • Support for model reasoning output

Installation

npm install @markwylde/ailib

Usage

import { createThread, OpenRouter, Ollama } from '@markwylde/ailib';
import { z } from 'zod';

// Create a thread
const ai = createThread({
  // Choose your provider
  provider: OpenRouter, // or: Ollama
  model: 'anthropic/claude-3-sonnet',
  messages: [
    { role: 'system', content: 'You are a helpful assistant.' },
  ],
  tools: [{
    name: 'get-weather',
    description: 'Get the weather for a location',
    parameters: z.object({
      location: z.string(),
    }),
    handler: async ({ location }) => {
      return `The weather in ${location} is sunny.`;
    }
  }],
  // For OpenRouter pass your API key. For Ollama you can pass an empty string.
  apiKey: process.env.OPENROUTER_API_KEY || "",
  modelOptions: {
    temperature: 0.7,
    max_tokens: 1000,
    // Add other model options as needed
  },
});

// Add a message to the thread
ai.messages.add({ role: 'user', content: 'What is the weather in Tokyo?' });

// Generate a new message from the AI
const stream = ai.messages.generate();

// Listen for data coming through the stream as an event
stream.on('state', (state) => {
  console.log(state); // 'sent' | 'receiving' | 'completed' | 'failed'
});

stream.on('data', ([chunk, message]) => {
  console.log(chunk); // Stream text chunks as they arrive
  // Access token and cost information
  console.log(`Tokens used: ${message.tokens}`);
  console.log(`Cost: $${message.cost}`);
});

// Listen for reasoning output (if supported by the model)
stream.on('reasoning', ([reasoningChunk, message]) => {
  console.log('Reasoning:', reasoningChunk);
});

// Listen for the stream to end
stream.on('end', () => {
  console.log('Stream completed');
});

// Wait for the stream to complete (Promise interface)
await stream;

// Access all messages in the thread
console.log(ai.messages);

API

createThread(options)

Creates a new conversation thread.

Options

  • provider: The AI provider to use (e.g., OpenRouter)
  • model: The model to use (e.g., 'anthropic/claude-3-sonnet')
  • messages: Initial messages in the thread (optional)
  • tools: Tools available to the AI (optional)
  • apiKey: Your API key
  • modelOptions: Configuration options for the model (optional)
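
Given those options, a minimal sketch of a thread that leaves everything optional at its default would look like this (the model name is only an example):

import { createThread, OpenRouter } from '@markwylde/ailib';

const ai = createThread({
  provider: OpenRouter,
  model: 'anthropic/claude-3-sonnet',
  apiKey: process.env.OPENROUTER_API_KEY || '',
});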

Model Options

The modelOptions object supports a wide range of parameters:

{
  temperature?: number;
  max_tokens?: number;
  seed?: number;
  top_p?: number;
  top_k?: number;
  frequency_penalty?: number;
  presence_penalty?: number;
  repetition_penalty?: number;
  min_p?: number;
  top_a?: number;
  reasoning?: {
    enabled?: boolean;
    include?: boolean;
    include_output?: boolean;
  };
  // Additional options
}
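
For example, turning on reasoning output and tightening the sampling might look like this (a sketch; whether the reasoning flags take effect depends on the provider and model):

const ai = createThread({
  provider: OpenRouter,
  model: 'anthropic/claude-3-sonnet',
  apiKey: process.env.OPENROUTER_API_KEY || '',
  modelOptions: {
    temperature: 0.2,
    max_tokens: 2000,
    reasoning: {
      enabled: true,
      include: true,
    },
  },
});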

Returns

A Thread object whose messages array holds the conversation and exposes the helper methods listed below.

Message Methods

  • ai.messages.add(message): Add a message to the thread
  • ai.messages.remove(message): Remove a message from the thread
  • ai.messages.generate(): Generate a new AI message based on the thread
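
A short sketch of how these fit together (this assumes remove() accepts the same message object that was passed to add()):

// Ask something, generate a reply, then tidy the thread back up
const question = { role: 'user', content: 'Summarise the previous answer in one line.' };

ai.messages.add(question);
await ai.messages.generate();

// Later, drop the question from the thread again
ai.messages.remove(question);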

Stream Events

The stream returned by generate() emits the following events:

  • state: Emitted when the stream state changes ('sent' | 'receiving' | 'completed' | 'failed')
  • data: Emitted when new content is received, provides the text chunk and the full message
  • reasoning: Emitted when reasoning content is received (if supported by the model)
  • end: Emitted when the stream ends
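
The Usage section above demonstrates all four events; one pattern it does not show is reacting to a failure through the state event (a minimal sketch using only the documented state values):

stream.on('state', (state) => {
  if (state === 'failed') {
    console.error('Generation failed; check your API key, model name and network');
  }
});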

Message Properties

Messages returned from generation include these additional properties:

  • tokens: Number of tokens used in the completion
  • cost: Cost of the completion in USD
  • totalTokens: Total tokens used (prompt + completion)
  • totalCost: Total cost of the interaction in USD
  • reasoning: Reasoning output from the model (if available)
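
For example, after a generation completes you can read the totals from the finished message (a sketch; it assumes the generated message is appended to the thread's messages array):

const stream = ai.messages.generate();
await stream;

// Assumption: the newly generated assistant message is the last entry in the thread
const reply = ai.messages[ai.messages.length - 1];
console.log(`Completion tokens: ${reply.tokens} (total: ${reply.totalTokens})`);
console.log(`Completion cost: $${reply.cost} (total: $${reply.totalCost})`);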

Tool Calling

Tools allow the AI model to call functions that you define:

const ai = createThread({
  // ... other options
  tools: [{
    name: 'get-weather',
    description: 'Get the weather for a location',
    parameters: z.object({
      location: z.string(),
    }),
    handler: async ({ location }) => {
      // Call a weather API here
      return `The weather in ${location} is sunny.`;
    }
  }]
});
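
In use, you ask a question that needs the tool and generate as normal; the expectation is that the library runs the handler and feeds its result back to the model before the final answer is produced:

ai.messages.add({ role: 'user', content: 'Do I need an umbrella in Tokyo today?' });

const stream = ai.messages.generate();
stream.on('data', ([chunk]) => process.stdout.write(chunk));
await stream;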

Example

See the examples directory for working examples.

Web Demo

A web-based chat interface is available in the webdemo directory. This React application demonstrates the library's features in a user-friendly interface:

  • Real-time streaming responses
  • Collapsible reasoning tokens display
  • Markdown rendering with syntax highlighting
  • API key management
  • Responsive design
  • Local storage for chat history

To run the web demo:

cd webdemo
npm install
npm run dev

Then open your browser to http://localhost:5173

Requirements

  • Node.js 18 or higher
  • Either an OpenRouter API key, or a running Ollama server (defaults to http://localhost:11434 or set OLLAMA_HOST)
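
For the Ollama route, a sketch of the thread setup might look like this (the model name is only an example of something you may have pulled locally; the empty apiKey follows the note in the Usage section):

import { createThread, Ollama } from '@markwylde/ailib';

// Assumes an Ollama server is reachable at http://localhost:11434 (or wherever OLLAMA_HOST points)
const ai = createThread({
  provider: Ollama,
  model: 'llama3.1', // example only; use any model you have pulled locally
  apiKey: '',
});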

Development

# Install dependencies
npm install

# Build the library
npm run build

# Run the example
npm run example:weather