[!WARNING] Feature complete, but undergoing first-round testing and documentation. Expect API tweaks before v1.1.0. Early adopters welcome — please share feedback and edge cases via issues.

ollama-kit

TypeScript-first library that wraps Ollama's HTTP API into a type-safe, feature-rich toolkit for Node.js and browser applications.

ollama-kit provides everything you need to build AI applications with Ollama:

  • Simple chat API — One-liner methods for text, streaming, and structured outputs
  • Zod-first validation — Full TypeScript type inference with optional JSON Schema fallback
  • Tool/function calling — Define tools once, use in manual or auto-execute modes with guardrails
  • JSON repair — Automatically fix malformed LLM JSON output (trailing commas, code fences, etc.)
  • Type safety — Works seamlessly with TypeScript's strict mode
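
To picture what the JSON repair step does, here is a simplified, self-contained sketch; `repairJson` is a hypothetical stand-in, not the library's actual implementation, and handles only the two cases mentioned above (code fences and trailing commas):

```typescript
// Simplified illustration of JSON repair: strip markdown code fences,
// then remove trailing commas before closing braces/brackets.
function repairJson(raw: string): unknown {
  // Remove ```json ... ``` fences an LLM may wrap around its output
  const unfenced = raw
    .replace(/^\s*```(?:json)?\s*/i, '')
    .replace(/\s*```\s*$/, '');
  // Remove trailing commas like {"a": 1,} or [1, 2,]
  const noTrailingCommas = unfenced.replace(/,\s*([}\]])/g, '$1');
  return JSON.parse(noTrailingCommas);
}
```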

Installation

npm install ollama-kit zod  # zod optional but recommended

Prerequisites: Ollama running locally (download here)

ollama run llama3.2  # or your preferred model

Quick Start

import { createClient, user } from 'ollama-kit';
import { z } from 'zod';

const client = createClient({ baseUrl: 'http://localhost:11434' });

// Simple chat
const { text } = await client.chat.text({
  model: 'llama3.2',
  messages: [user('Explain TypeScript in one sentence')]
});

// Structured output with type safety
const schema = z.object({
  summary: z.string(),
  topics: z.array(z.string())
});

const { data } = await client.chat.json({
  model: 'llama3.2',
  messages: [user('Summarize TypeScript')],
  schema  // Fully typed!
});

See more examples below ⬇️

Project Status

v1.0.0 — All Core Features Complete

  • Phase 1: Foundation (HTTP, streaming, chat)
  • Phase 2: Core Features (schemas, structured outputs, tool calling)
  • Phase 3: Batteries Included (embeddings, caching, templating, memory)
  • Phase 4: Production Ready (proxy helpers, React hooks, tokenizer)

37 features implemented | 320 tests passing | 92 KB bundle (main)

See the Roadmap for detailed progress.

Core Features

💬 Chat API

  • Text, streaming, and structured outputs with full TypeScript inference
  • Tool/function calling with manual and auto-execute modes
  • JSON repair for malformed LLM responses
  • Message helpers for system/user/assistant/tool messages
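
Conceptually, the message helpers are small factories that produce role-tagged message objects. A sketch of what `system`, `user`, and `assistant` return (the library's actual helpers may carry extra fields):

```typescript
// Conceptual sketch of message helpers: each returns a role-tagged message.
type ChatMessage = { role: 'system' | 'user' | 'assistant' | 'tool'; content: string };

const system = (content: string): ChatMessage => ({ role: 'system', content });
const user = (content: string): ChatMessage => ({ role: 'user', content });
const assistant = (content: string): ChatMessage => ({ role: 'assistant', content });

// The helpers compose into the messages array the chat API expects:
const messages = [
  system('You are concise.'),
  user('Explain TypeScript in one sentence')
];
```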

🔧 Advanced Features

  • Embeddings with batch processing and configurable pooling
  • Response caching with adapter interface (in-memory included)
  • Prompt templating with variable interpolation and template packs
  • Conversation memory with flexible retention policies
  • Model management for listing and loading models
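
Variable interpolation in prompt templating can be pictured as simple placeholder substitution. A minimal sketch, assuming `{{ }}` delimiters (which may differ from the library's actual template syntax):

```typescript
// Minimal template interpolation sketch: replace {{key}} with values[key],
// leaving unknown placeholders untouched.
function renderTemplate(template: string, values: Record<string, string>): string {
  return template.replace(/\{\{(\w+)\}\}/g, (match: string, key: string) =>
    key in values ? values[key] : match
  );
}
```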

🌐 Browser & React

  • Proxy handler for Express/Hono/Next.js with CORS, auth, rate limiting
  • React hooks (useChat, useStructuredOutput) via factory pattern
  • Browser client for communicating with proxy servers
  • Token estimation with adapter interface for custom tokenizers
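
Token estimation without a real tokenizer is typically a character-based heuristic. A rough sketch of what an estimator adapter might look like (the library's actual interface and default heuristic may differ):

```typescript
// Rough token estimate: ~4 characters per token is a common heuristic for
// English text; a real adapter (e.g. tiktoken) would count tokens exactly.
interface TokenEstimator {
  estimate(text: string): number;
}

const heuristicEstimator: TokenEstimator = {
  estimate: (text) => Math.ceil(text.length / 4),
};
```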

📦 Package Exports

import { createClient } from 'ollama-kit';                    // Core library
import { createProxyHandler } from 'ollama-kit/node';          // Server proxy
import { BrowserClient, createUseChatHook } from 'ollama-kit/react';  // Browser/React

Examples

Streaming Responses

for await (const event of client.chat.stream({
  model: 'llama3.2',
  messages: [user('Tell me a story...')]
})) {
  if (event.type === 'token') {
    process.stdout.write(event.token);
  }
}
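
If you want the complete text as well as live output, you can accumulate tokens while streaming. A small illustrative helper (not part of the library) that assumes the `{ type: 'token', token }` event shape shown above:

```typescript
// Collects token events from an async iterable into the full response text.
async function collectText(
  events: AsyncIterable<{ type: string; token?: string }>
): Promise<string> {
  let text = '';
  for await (const event of events) {
    if (event.type === 'token' && event.token !== undefined) {
      text += event.token;
    }
  }
  return text;
}
```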

Tool Calling

import { defineTool } from 'ollama-kit';
import { z } from 'zod';

const searchTool = defineTool({
  name: 'search',
  description: 'Search the web',
  paramsSchema: z.object({ query: z.string() }),
  handler: async ({ query }) => searchWeb(query)  // searchWeb: your own implementation
});

// Auto-execute mode - library runs handlers and loops to completion
const result = await client.tools.run({
  model: 'llama3.2',
  messages: [user('Research TypeScript and summarize')],
  tools: [searchTool],
  autoExecute: true
});
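
Conceptually, auto-execute mode takes each tool call the model emits, runs the matching handler, and feeds the result back until the model stops calling tools. A simplified single-step dispatcher, as an illustration only (not the library's actual loop):

```typescript
// Simplified single-step tool dispatch: look up the named tool and run its
// handler with the parsed arguments. The real auto-execute loop repeats this
// until the model returns a final answer instead of another tool call.
type Tool = {
  name: string;
  handler: (args: Record<string, unknown>) => Promise<unknown> | unknown;
};

async function dispatchToolCall(
  tools: Tool[],
  call: { name: string; args: Record<string, unknown> }
): Promise<unknown> {
  const tool = tools.find((t) => t.name === call.name);
  if (!tool) throw new Error(`Unknown tool: ${call.name}`);
  return tool.handler(call.args);
}
```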

Server Proxy (Express)

import express from 'express';
import { createExpressHandler } from 'ollama-kit/node';

const app = express();

app.use('/api/ollama', createExpressHandler({
  target: 'http://localhost:11434',
  allowedOrigins: ['http://localhost:3000'],
  enableCors: true
}));

app.listen(8080);

React Integration

import { createUseChatHook } from 'ollama-kit/react';
import { useState, useEffect, useCallback } from 'react';

// Create hook with React primitives (no React dependency in library)
const useChat = createUseChatHook({ useState, useEffect, useCallback });

function ChatComponent() {
  const { messages, send, isLoading } = useChat({
    apiUrl: 'http://localhost:8080/api/ollama',
    model: 'llama3.2'
  });

  return (
    <div>
      {messages.map(msg => <div key={msg.id}>{msg.content}</div>)}
      <button onClick={() => send('Hello!')} disabled={isLoading}>
        Send
      </button>
    </div>
  );
}

What's Next?

Coming in v1.1+:

  • 📚 Example applications (chat UI, RAG pipeline, agent workflows)
  • 📖 Cookbook with common patterns and recipes
  • 📝 Deep-dive documentation for advanced features
  • 🔌 More framework adapters (Fastify, Koa, etc.)
  • 🎯 Production tokenizers (tiktoken, transformers.js)

We Need Your Feedback!

🎉 ollama-kit v1.0.0 is feature-complete! This release includes all planned core functionality, but we need real-world testing to refine the APIs.

Early adopters: Please try it out and share:

  • 🐛 Bug reports for any issues you encounter
  • 💡 Feature requests for missing capabilities
  • 🔍 Edge cases that aren't handled well
  • 📖 Documentation gaps that confused you

Open an issue or start a discussion. Your feedback will shape v1.1!

Documentation

Requirements

  • Node.js 18+ — ESM support required
  • Ollama — Running locally or remotely (default: http://localhost:11434)
  • Zod 3.0+ (optional) — Required for schema-based validation
  • React 18+ (optional) — Only needed for ollama-kit/react features

Contributing

This project uses a documentation-driven workflow. All work is tracked in the /docs folder:

  • docs/index/MASTER_INDEX.md — Single source of truth for project state
  • docs/features/ — Feature specifications
  • docs/sprints/ — Time-boxed work periods
  • docs/decisions/ — Architecture decisions (ADRs)

License

MIT © Josh Templeton

See .github/copilot-instructions.md for AI agent guidance.
