@sisu-ai/adapter-ollama

Ollama Chat adapter with native tools support.

Setup

  • Install the adapter: npm i @sisu-ai/adapter-ollama
  • Start Ollama locally: ollama serve
  • Pull a tools-capable model: ollama pull llama3.1:latest

Usage

import { ollamaAdapter } from '@sisu-ai/adapter-ollama';

const model = ollamaAdapter({ model: 'llama3.1' });
// or with custom base URL: { baseUrl: 'http://localhost:11435' }
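
Once constructed, the adapter can be called directly. A minimal sketch, assuming the generate signature used in the image examples below:

const res = await model.generate(
  [{ role: 'user', content: 'Say hello in one sentence.' }],
  { toolChoice: 'none' }, // no tools registered for this call
);
console.log(res); // inspect the returned response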

Images (Vision)

  • Accepts multi-part content arrays with type: 'text' | 'image_url' and convenience fields like images/image_url.
  • The adapter maps these to Ollama's expected shape by sending content as a string and images as a string array on the message.
  • If an image value is an http(s) URL, the adapter fetches it and inlines it as base64 automatically. Data URLs are supported; raw base64 strings pass through.

Content parts (adapter maps to images[] under the hood and auto-fetches URLs):

const messages: any[] = [
  { role: 'user', content: [
    { type: 'text', text: 'What is in this image?' },
    { type: 'image_url', image_url: { url: 'https://example.com/pic.jpg' } },
  ] }
];
const res = await model.generate(messages, { toolChoice: 'none' });

Convenience shape:

const messages: any[] = [
  { role: 'user', content: 'Describe the image.', images: ['https://example.com/pic.jpg'] },
];
const res = await model.generate(messages, { toolChoice: 'none' });

Normalizing Ollama API

  • Providers such as OpenAI's vision models accept image_url parts with a url pointing to a remote image; the provider dereferences the URL.
  • Ollama expects each message to include images: string[] of base64-encoded image data; it does not dereference remote URLs.
  • This adapter keeps the authoring experience consistent by accepting OpenAI-style parts and convenience URLs, and performs the URL→base64 conversion for you (see the sketch below).
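
As an illustration of that conversion (a minimal sketch, not the adapter's actual implementation; urlToBase64 is a hypothetical helper and assumes a Node runtime with global fetch):

// Hypothetical helper: fetch an http(s) image and inline it as base64 for Ollama's images[].
async function urlToBase64(url: string): Promise<string> {
  const res = await fetch(url);
  const buf = await res.arrayBuffer();
  return Buffer.from(buf).toString('base64');
}

// OpenAI-style authoring in, Ollama-shaped message out.
const ollamaMessage = {
  role: 'user',
  content: 'What is in this image?',
  images: [await urlToBase64('https://example.com/pic.jpg')],
};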

Accepted image formats

  • Base64 string: images: ["<base64>"] (preferred/default for Ollama)
  • Data URL: images: ["data:image/png;base64,<base64>"] or in parts via { type: 'image_url', image_url: { url: 'data:...' } }
  • Remote URL (convenience): { type: 'image_url', image_url: { url: 'https://...' } } or images: ['https://...'] — adapter fetches and inlines as base64.

Note: URL fetching happens from your runtime. If your environment blocks outbound HTTP, either provide base64 directly or host images where your runtime can reach them.
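
For reference, the three forms above can be mixed in a single message (placeholder values shown; illustrative only):

const messages: any[] = [
  { role: 'user', content: 'Describe these images.', images: [
    '<base64>',                        // raw base64, passed through as-is
    'data:image/png;base64,<base64>',  // data URL
    'https://example.com/pic.jpg',     // remote URL, fetched and inlined as base64
  ] },
];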

Tools

  • Define tools as small, named functions with a zod schema.
  • Register them on your agent and add the tool-calling middleware — the adapter handles the wire format to/from Ollama.
  • Under the hood, the adapter sends your tool schemas to the model, maps model “function calls” back to your handlers, and includes tool results for follow‑up turns.

Quick start with tools

import { Agent, InMemoryKV, NullStream, SimpleTools, createConsoleLogger, type Ctx, type Tool } from '@sisu-ai/core';
import { registerTools } from '@sisu-ai/mw-register-tools';
import { toolCalling } from '@sisu-ai/mw-tool-calling';
import { z } from 'zod';
import { ollamaAdapter } from '@sisu-ai/adapter-ollama';

const sum: Tool<{ a: number; b: number }> = {
  name: 'sum',
  description: 'Add two numbers',
  schema: z.object({ a: z.number(), b: z.number() }),
  handler: async ({ a, b }) => ({ result: a + b }),
};

const model = ollamaAdapter({ model: 'llama3.1' });
const ctx: Ctx = {
  input: 'Use the sum tool to add 3 and 7, then explain.',
  messages: [{ role: 'system', content: 'You are helpful.' }],
  model,
  tools: new SimpleTools(),
  memory: new InMemoryKV(),
  stream: new NullStream(),
  state: {},
  signal: new AbortController().signal,
  log: createConsoleLogger(),
};

const app = new Agent()
  .use(registerTools([sum])) // make tools available
  .use(toolCalling);         // let the model pick tools, run them, and finalize

await app.handler()(ctx);

Notes

  • Tool-choice forcing is model-dependent; the current loop requests tools on the first turn and a plain completion on the second.
  • Streaming can be added via Ollama's streaming API if desired.
  • Env: OLLAMA_BASE_URL or BASE_URL can override the base URL (or pass baseUrl in code). Examples may also support a --base-url CLI flag that overrides the env vars; see the sketch below.
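
A minimal sketch of resolving the base URL in code, assuming Ollama's default local port of 11434:

import { ollamaAdapter } from '@sisu-ai/adapter-ollama';

// Resolve the base URL from env vars, falling back to the local Ollama default.
const model = ollamaAdapter({
  model: 'llama3.1',
  baseUrl: process.env.OLLAMA_BASE_URL ?? process.env.BASE_URL ?? 'http://localhost:11434',
});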

Community & Support

Explore what you can do through the examples and documentation at https://github.com/finger-gun/sisu. Example projects live under examples/ in the repo.