
harper-fabric-onnx · v0.3.0 · 504 downloads

harper-fabric-onnx

ONNX Runtime embedding wrapper for Harper. Runs inference in a dedicated child process — one model instance, one thread pool, no global state races across Harper workers.

Same public API as harper-fabric-embeddings, so harper-kb can swap backends by changing one import.
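As an illustration of that swap, a minimal selector sketch (the env-var name and helper function are hypothetical; only the two package names come from this README):

```javascript
// Hypothetical helper: pick the backend specifier once, keep call sites unchanged.
// Both packages expose the same init/embed/embedBatch/dimensions/dispose surface.
function embeddingBackend(env = process.env) {
  return env.EMBED_BACKEND === "onnx"
    ? "harper-fabric-onnx"
    : "harper-fabric-embeddings";
}

// const { init, embed } = await import(embeddingBackend());
```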

Install

npm install harper-fabric-onnx

Requires Node.js 22+.

Usage

import {
  downloadModel,
  init,
  embed,
  embedBatch,
  dimensions,
  dispose,
} from "harper-fabric-onnx";

// Download model files (one-time)
await downloadModel(".models");

// Initialize — spawns child process, loads ONNX model
await init({ modelsDir: ".models" });

// Single embedding (768-dim, L2-normalized)
const vec = await embed("Hello world");

// Query-optimized embedding (uses "search_query:" prefix)
const queryVec = await embed("What is Harper?", "query");

// Batch embedding
const vecs = await embedBatch(["First text", "Second text"]);

// Get model dimensions
dimensions(); // 768

// Cleanup
await dispose();
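The vectors returned above are L2-normalized. A quick illustrative sketch of what that property buys (helper names are mine, not part of the package): unit-length vectors make cosine similarity a plain dot product.

```javascript
// Illustrative helpers, not part of harper-fabric-onnx.
// L2 (Euclidean) norm of a vector.
function l2norm(v) {
  return Math.sqrt(v.reduce((sum, x) => sum + x * x, 0));
}

// Dot product; for L2-normalized vectors this *is* the cosine similarity.
function dot(a, b) {
  return a.reduce((sum, x, i) => sum + x * b[i], 0);
}
```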

How it works

ONNX Runtime has a global singleton and process-wide thread pool, so it can't safely run per-worker in Harper's multi-worker architecture. This package runs ONNX in a dedicated child process and routes all worker calls to it via a Unix domain socket.

  • init() — spawns a child process (or connects to an existing one), loads the ONNX model and tokenizer
  • embed() / embedBatch() — sends text over the socket, child tokenizes + runs inference, returns L2-normalized vectors
  • One child process is shared across all Harper worker threads
  • Stale process detection via PID files — auto-recovers from crashes
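The stale-PID recovery in the last bullet can be sketched like this (file path and helper names are assumptions, not the package's actual internals; signal 0 is the standard POSIX liveness probe):

```javascript
import fs from "node:fs";

// Sketch of stale-process detection. Assumed mechanism: a PID file records
// the child's PID; sending signal 0 probes liveness without delivering a signal.
function isProcessAlive(pid) {
  try {
    process.kill(pid, 0); // throws if the process doesn't exist
    return true;
  } catch (err) {
    // EPERM: the process exists but we can't signal it. ESRCH: it's gone.
    return err.code === "EPERM";
  }
}

// Returns the recorded PID, or null when there is no (readable) PID file.
function readPidFile(path) {
  try {
    return Number.parseInt(fs.readFileSync(path, "utf8"), 10);
  } catch {
    return null;
  }
}

// A stale PID file (process gone) means init() is free to spawn a fresh child.
```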

Supported models

| Model                        | Repo                             | Dimensions |
| ---------------------------- | -------------------------------- | ---------- |
| nomic-embed-text (default)   | nomic-ai/nomic-embed-text-v1.5   | 768        |
| nomic-embed-text-v2-moe      | nomic-ai/nomic-embed-text-v2-moe | 768        |

API

downloadModel(dir: string, modelName?: string): Promise<string>

Downloads model and tokenizer files from HuggingFace. Returns the model directory path.

init(options: InitOptions): Promise<void>

Spawns the child process and loads the model. Options:

  • modelsDir — directory containing model subdirectories (e.g., .models)
  • modelPath — direct path to a specific model directory
  • modelName — model name from the registry (default: nomic-embed-text)

embed(text: string, type?: 'document' | 'query'): Promise<number[]>

Returns an L2-normalized embedding vector. Default type is 'document'.
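The 'query' type corresponds to the "search_query:" prefix mentioned in the Usage section. A sketch of that convention (the "search_document:" prefix for the default type is my assumption from the nomic-embed model family; this README only confirms "search_query:"):

```javascript
// Illustrative sketch of nomic-style task prefixes; not the package's actual code.
// "search_query: " is confirmed above; "search_document: " is an assumption.
function withTaskPrefix(text, type = "document") {
  return (type === "query" ? "search_query: " : "search_document: ") + text;
}
```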

embedBatch(texts: string[], type?: 'document' | 'query'): Promise<number[][]>

Returns an array of L2-normalized embedding vectors.

dimensions(): number

Returns the dimensionality of the loaded model (e.g., 768).

dispose(): Promise<void>

Shuts down the child process and cleans up socket/PID files.

Harper component usage

// resources.js
const { Resource } = globalThis;

export class Embed extends Resource {
  static loadAsInstance = false;

  async post(_query, data) {
    const { init, embed, embedBatch, dimensions } =
      await import("harper-fabric-onnx");
    // Safe on every request: init() connects to an existing child if one is running.
    await init({ modelsDir: process.env.ONNX_MODELS_DIR });

    // Batch request: { texts: [...] }
    if (data.texts) {
      const vecs = await embedBatch(data.texts);
      return { dimensions: dimensions(), vectors: vecs };
    }

    // Single request: { text: "...", type?: "document" | "query" }
    const vec = await embed(data.text, data.type);
    return { dimensions: dimensions(), vector: vec };
  }
}

License

MIT