
@kreuzberg/liter-llm-native

v1.1.1

Published

Native NAPI-RS bindings for liter-llm — high-performance LLM client powered by Rust

Downloads

731

Readme

TypeScript (Node.js)

Universal LLM API client for TypeScript and Node.js. Access 142+ LLM providers through a single interface with native NAPI-RS bindings, async/await support, streaming, tool calling, and full TypeScript type definitions.

Installation

Package Installation

Install via one of the supported package managers:

npm:

npm install @kreuzberg/liter-llm

pnpm:

pnpm add @kreuzberg/liter-llm

yarn:

yarn add @kreuzberg/liter-llm

System Requirements

  • Node.js 22+ required (NAPI-RS native bindings)
  • API keys via environment variables (e.g. OPENAI_API_KEY, ANTHROPIC_API_KEY)
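
For example, provider credentials can be exported before starting your app (the variable names shown are the OpenAI and Anthropic ones listed above):

```shell
# Export provider API keys so the client can read them from the environment.
export OPENAI_API_KEY="sk-..."
export ANTHROPIC_API_KEY="sk-ant-..."
```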

Platform Support

Pre-built binaries available for:

  • macOS (arm64, x64)
  • Linux (x64)
  • Windows (x64)

Quick Start

Basic Chat

Send a message to any provider using the provider/model prefix:

import { LlmClient } from "@kreuzberg/liter-llm";

const client = new LlmClient({ apiKey: process.env.OPENAI_API_KEY! });
const response = await client.chat({
  model: "openai/gpt-4o",
  messages: [{ role: "user", content: "Hello!" }],
});
console.log(response.choices[0].message.content);

Common Use Cases

Streaming Responses

Stream tokens in real time:

import { LlmClient } from "@kreuzberg/liter-llm";

const client = new LlmClient({ apiKey: process.env.OPENAI_API_KEY! });
const stream = await client.chatStream({
  model: "openai/gpt-4o",
  messages: [{ role: "user", content: "Tell me a story" }],
});

for await (const chunk of stream) {
  process.stdout.write(chunk.choices[0]?.delta?.content ?? "");
}
console.log();

Tool Calling

Define and invoke tools:

import { LlmClient } from "@kreuzberg/liter-llm";

const client = new LlmClient({ apiKey: process.env.OPENAI_API_KEY! });

const tools = [
  {
    type: "function" as const,
    function: {
      name: "get_weather",
      description: "Get the current weather for a location",
      parameters: {
        type: "object",
        properties: {
          location: { type: "string", description: "City name" },
        },
        required: ["location"],
      },
    },
  },
];

const response = await client.chat({
  model: "openai/gpt-4o",
  messages: [{ role: "user", content: "What is the weather in Berlin?" }],
  tools,
});

for (const call of response.choices[0]?.message?.toolCalls ?? []) {
  console.log(`Tool: ${call.function.name}, Args: ${call.function.arguments}`);
}
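
A minimal sketch of the follow-up step: execute the tool locally, then wrap the result as a message to send back on a second `chat` call. The `ToolCall` shape and the `role: "tool"` / `toolCallId` field names mirror the example above but are assumptions here, not confirmed API:

```typescript
// Hypothetical shapes mirroring the example above; field names are assumptions.
interface ToolCall {
  id: string;
  function: { name: string; arguments: string };
}

// Wrap a locally computed tool result as a message for the follow-up chat call.
function toolResultMessage(call: ToolCall, result: unknown) {
  return { role: "tool" as const, toolCallId: call.id, content: JSON.stringify(result) };
}

const call: ToolCall = {
  id: "call_1",
  function: { name: "get_weather", arguments: '{"location":"Berlin"}' },
};
const { location } = JSON.parse(call.function.arguments);
const msg = toolResultMessage(call, { location, tempC: 21 });
// Append msg to the original messages array and call client.chat() again for the final answer.
```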

NAPI-RS Implementation Details

Native Performance

This binding uses NAPI-RS to provide native Node.js bindings with:

  • Zero-copy data transfer between JavaScript and Rust layers
  • Async by default — all LLM calls return Promises backed by Tokio
  • Binary-compatible pre-built native modules across platforms
  • TypeScript definitions generated automatically from Rust types

Threading Model

  • LLM calls are non-blocking — Tokio async runtime handles concurrency
  • Streaming responses use Node.js async iterators backed by Tokio streams
  • CPU-bound work runs in spawn_blocking to avoid blocking the event loop
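
Because calls never block the event loop, independent requests can be fanned out concurrently. A sketch of the pattern, where the `ask` callback stands in for a real `client.chat()` call:

```typescript
// Fan one prompt out to several models concurrently; resolves when all replies arrive.
async function fanOut(
  models: string[],
  ask: (model: string) => Promise<string>,
): Promise<string[]> {
  return Promise.all(models.map((model) => ask(model)));
}

// In practice:
// fanOut(models, (m) =>
//   client.chat({ model: m, messages }).then((r) => r.choices[0].message.content));
```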

Memory Management

  • API keys are wrapped in secrecy::SecretString and never logged
  • Streaming buffers are released as soon as each chunk is consumed
  • Provider registry is compiled into the binary — no runtime disk access

Features

Supported Providers (142+)

Route to any provider using the provider/model prefix convention:

| Provider | Example Model |
|----------|---------------|
| OpenAI | openai/gpt-4o, openai/gpt-4o-mini |
| Anthropic | anthropic/claude-3-5-sonnet-20241022 |
| Groq | groq/llama-3.1-70b-versatile |
| Mistral | mistral/mistral-large-latest |
| Cohere | cohere/command-r-plus |
| Together AI | together/meta-llama/Meta-Llama-3.1-70B-Instruct-Turbo |
| Fireworks | fireworks/accounts/fireworks/models/llama-v3p1-70b-instruct |
| Google Vertex | vertexai/gemini-1.5-pro |
| Amazon Bedrock | bedrock/anthropic.claude-3-5-sonnet-20241022-v2:0 |

Key Capabilities

  • Provider Routing -- Single client for 142+ LLM providers via provider/model prefix

  • Unified API -- Consistent chat, chat_stream, embeddings, list_models interface

  • Streaming -- Real-time token streaming via chat_stream

  • Tool Calling -- Function calling and tool use across all supporting providers

  • Type Safe -- Schema-driven types compiled from JSON schemas

  • Secure -- API keys never logged or serialized, managed via environment variables

  • Observability -- Built-in OpenTelemetry with GenAI semantic conventions

  • Error Handling -- Structured errors with provider context and retry hints

Performance

Built on a compiled Rust core for speed and safety:

  • Provider resolution at client construction -- zero per-request overhead
  • Configurable timeouts and connection pooling
  • Zero-copy streaming with SSE and AWS EventStream support
  • API keys wrapped in secure memory, zeroed on drop

Provider Routing

Route to 142+ providers using the provider/model prefix convention:

openai/gpt-4o
anthropic/claude-3-5-sonnet-20241022
groq/llama-3.1-70b-versatile
mistral/mistral-large-latest

See the provider registry for the full list.
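
The convention splits on the first slash only, so model IDs that themselves contain slashes (such as the Together AI example above) still route correctly. A small illustrative parser — not part of the library:

```typescript
// Split "provider/model" on the first slash only; model IDs may contain further slashes.
function parseModelRef(ref: string): { provider: string; model: string } {
  const i = ref.indexOf("/");
  if (i === -1) throw new Error(`expected "provider/model", got "${ref}"`);
  return { provider: ref.slice(0, i), model: ref.slice(i + 1) };
}

const a = parseModelRef("openai/gpt-4o");
const b = parseModelRef("together/meta-llama/Meta-Llama-3.1-70B-Instruct-Turbo");
```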

Proxy Server

liter-llm also ships as an OpenAI-compatible proxy server with Docker support:

docker run -p 4000:4000 -e LITER_LLM_MASTER_KEY=sk-your-key ghcr.io/kreuzberg-dev/liter-llm

See the proxy server documentation for configuration, CLI usage, and MCP integration.
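
Once the container is up, any OpenAI-compatible client can target it. A sketch using Node's built-in `fetch`; the `/v1/chat/completions` path and bearer-token header follow the usual OpenAI convention and are assumptions here:

```typescript
// Build an OpenAI-style chat request for the proxy (pure, so it is easy to inspect).
function buildChatRequest(model: string, content: string, key: string) {
  return {
    method: "POST",
    headers: { "Content-Type": "application/json", Authorization: `Bearer ${key}` },
    body: JSON.stringify({ model, messages: [{ role: "user", content }] }),
  };
}

const req = buildChatRequest("openai/gpt-4o", "Hello!", "sk-your-key");
// await fetch("http://localhost:4000/v1/chat/completions", req);
```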

Documentation

Part of kreuzberg.dev.

Contributing

Contributions are welcome! See CONTRIBUTING.md for guidelines.

Join our Discord community for questions and discussion.

License

MIT -- see LICENSE for details.