Shuntly

| | CI | Package |
| --- | --- | --- |
| Python | CI | PyPI |
| TypeScript | CI | NPM |

A lightweight wiretap for LLM SDKs: capture all requests and responses with a single line of code.

Shuntly wraps LLM SDKs to record every request and response as JSON. Calling shunt() returns a wrapped client with its original interface and types preserved, so IDE autocomplete and type checking keep working. Shuntly provides a collection of configurable "sinks" that write records to stderr, files, named pipes, or any combination.

While debugging LLM tooling, you may want to see exactly what is being sent and returned. When launching an agent, you may want to record every call to the LLM. Shuntly captures it all without TLS interception, a proxy or web-based platform, or complicated logging infrastructure.
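The wrap-and-record idea can be illustrated with a plain JavaScript Proxy. This is a minimal sketch of the technique, tapping only top-level synchronous methods; it is not Shuntly's actual implementation:

```typescript
// Minimal sketch of a method wiretap using a Proxy. Every method call
// on the wrapped object is recorded, while the object's interface is
// preserved. Illustrative only; not Shuntly's implementation.
type CallRecord = { method: string; args: unknown[]; durationMs: number };

function tap<T extends object>(target: T, log: CallRecord[]): T {
  return new Proxy(target, {
    get(obj, prop, receiver) {
      const value = Reflect.get(obj, prop, receiver);
      if (typeof value !== "function") return value;
      return (...args: unknown[]) => {
        const start = Date.now();
        const result = (value as Function).apply(obj, args);
        log.push({ method: String(prop), args, durationMs: Date.now() - start });
        return result;
      };
    },
  });
}

// Usage: the tapped object is called exactly like the original.
const log: CallRecord[] = [];
const greeter = tap({ greet: (name: string) => `Hi, ${name}!` }, log);
greeter.greet("alice"); // the call is now recorded in `log`
```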

Install

npm install shuntly

Integrate

Given an LLM SDK (e.g. @anthropic-ai/sdk, openai, @google/genai), simply call shunt() with an instantiated SDK client. The returned object has the same type and interface.

import Anthropic from "@anthropic-ai/sdk";
import { shunt } from "shuntly";

// Without providing a sink Shuntly output goes to stderr
const client = shunt(new Anthropic({ apiKey: API_KEY }));

// Now use the client as before
const message = await client.messages.create({
  model: "claude-sonnet-4-20250514",
  max_tokens: 1024,
  messages: [{ role: "user", content: "Hello" }],
});

Each call to messages.create() writes a complete JSON record:

{
  "timestamp": "2025-01-15T12:00:00.000Z",
  "hostname": "dev1",
  "user": "alice",
  "pid": 42,
  "client": "Anthropic",
  "method": "messages.create",
  "request": {
    "model": "claude-sonnet-4-20250514",
    "max_tokens": 1024,
    "messages": [{ "role": "user", "content": "Hello" }]
  },
  "response": {
    "id": "msg_...",
    "content": [{ "type": "text", "text": "Hi!" }]
  },
  "durationMs": 823.4,
  "error": null
}
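Read off the example above, the record's shape corresponds roughly to the following interface (a sketch derived from the JSON fields; not necessarily identical to the package's exported ShuntlyRecord type):

```typescript
// Field names and types read off the example record above; a sketch,
// not necessarily the package's exported ShuntlyRecord type.
interface RecordSketch {
  timestamp: string;  // ISO 8601 UTC timestamp
  hostname: string;
  user: string;
  pid: number;
  client: string;     // wrapped client's name, e.g. "Anthropic"
  method: string;     // dotted method path, e.g. "messages.create"
  request: unknown;   // request payload as sent
  response: unknown;  // response payload as received
  durationMs: number; // wall-clock duration of the call
  error: unknown;     // null on success
}
```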

Diversify

Shuntly presently supports the following SDKs and clients:

| Client | Package | Methods |
| --- | --- | --- |
| Anthropic | npm | messages.create, messages.stream |
| OpenAI | npm | chat.completions.create |
| GoogleGenAI | npm | models.generateContent, models.generateContentStream |
| Ollama | npm | chat, generate |

Shuntly also supports wrapping standalone functions, such as those from @mariozechner/pi-ai:

| Function | Description |
| --- | --- |
| complete | Non-streaming completion (returns Promise) |
| completeSimple | Non-streaming with reasoning options |
| stream | Streaming completion (returns async iterable) |
| streamSimple | Streaming with reasoning options |

import { shunt } from "shuntly";
import {
  complete as rawComplete,
  completeSimple as rawCompleteSimple,
  stream as rawStream,
  streamSimple as rawStreamSimple,
} from "@mariozechner/pi-ai";

// Rebind the familiar names to their wiretapped versions
// (`sink` is any Sink instance, e.g. new SinkStream()).
const complete = shunt(rawComplete, sink);
const completeSimple = shunt(rawCompleteSimple, sink);
const stream = shunt(rawStream, sink);
const streamSimple = shunt(rawStreamSimple, sink);

For anything else, method paths can be explicitly provided:

const client = shunt(myClient, null, ["chat.send", "embeddings.create"]);
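A method path such as "chat.send" names a nested method on the client. What resolving such a path involves can be sketched as follows (illustrative only; the helper and the stand-in client are hypothetical, not part of Shuntly):

```typescript
// Sketch: walk a dotted method path like "chat.send" down a client
// object to find the method and the object it is bound to.
// Illustrative only; not Shuntly's implementation.
function resolvePath(obj: any, path: string): { parent: any; method: string } {
  const parts = path.split(".");
  const method = parts.pop()!;
  const parent = parts.reduce((o, key) => o[key], obj);
  return { parent, method };
}

// Example with a stand-in client:
const demoClient = { chat: { send: (msg: string) => `sent:${msg}` } };
const { parent, method } = resolvePath(demoClient, "chat.send");
parent[method]("hello"); // calls demoClient.chat.send
```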

View

Shuntly JSON output can be streamed into or read with a JSON viewer such as fx, which provides syntax highlighting and collapsible sections.

View Realtime Shuntly from stderr

Shuntly output, by default, goes to stderr; this is equivalent to providing a SinkStream to shunt():

import { shunt, SinkStream } from "shuntly";
const client = shunt(new Anthropic({ apiKey: API_KEY }), new SinkStream());

Given a command, you can view Shuntly stderr output in fx with the following:

$ command 2>&1 >/dev/null | fx

View Realtime Shuntly via a Pipe

To view Shuntly output via a named pipe in another terminal, the SinkPipe sink can be used. First, name the pipe when providing SinkPipe to shunt():

import { shunt, SinkPipe } from "shuntly";
const client = shunt(
  new Anthropic({ apiKey: API_KEY }),
  new SinkPipe("/tmp/shuntly.fifo"),
);

Then, in the terminal where you want to view Shuntly output, create the named pipe and pass it to fx:

$ mkfifo /tmp/shuntly.fifo; fx < /tmp/shuntly.fifo

Then, in another terminal, launch your command.

View Shuntly from a File

To store Shuntly output in a file, the SinkFile sink can be used. Name the file when providing SinkFile to shunt():

import { shunt, SinkFile } from "shuntly";
const client = shunt(
  new Anthropic({ apiKey: API_KEY }),
  new SinkFile("/tmp/shuntly.jsonl"),
);

Then, after your command is complete, view the file:

$ fx /tmp/shuntly.jsonl

Store Shuntly Output with File Rotation

For long-running applications, SinkRotating writes JSONL records to a directory with automatic file rotation and cleanup. Files are named with UTC timestamps (e.g. 2025-02-15T210530.482Z.jsonl).

import { shunt, SinkRotating } from "shuntly";
const client = shunt(new Anthropic({ apiKey: API_KEY }), new SinkRotating("/tmp/shuntly"));

When a file exceeds maxBytesFile (default 10 MB), a new file is created. When the directory exceeds maxBytesDir (default 100 MB), the oldest files are pruned. Set maxBytesDir: 0 to disable pruning and retain all files. Both limits are configurable:

const client = shunt(
  new Anthropic({ apiKey: API_KEY }),
  new SinkRotating("/tmp/shuntly", {
    maxBytesFile: 50 * 1024 * 1024, // 50 MB per file
    maxBytesDir: 500 * 1024 * 1024, // 500 MB total
  }),
);
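The pruning policy described above amounts to summing the directory's file sizes and deleting oldest-first until the total fits the cap. A sketch of that policy (illustrative, not the package's implementation):

```typescript
// Sketch of the oldest-first pruning policy described above.
// `files` is sorted oldest-first; returns the names to delete so the
// remaining total fits under maxBytesDir. A cap of 0 disables pruning.
function filesToPrune(
  files: { name: string; bytes: number }[],
  maxBytesDir: number,
): string[] {
  if (maxBytesDir === 0) return [];
  let total = files.reduce((sum, f) => sum + f.bytes, 0);
  const doomed: string[] = [];
  for (const f of files) {
    if (total <= maxBytesDir) break;
    doomed.push(f.name);
    total -= f.bytes;
  }
  return doomed;
}
```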

Send Shuntly Output to Multiple Sinks

Using SinkMany, multiple sinks can be written to simultaneously.

import { shunt, SinkStream, SinkFile, SinkMany } from "shuntly";

const client = shunt(
  new Anthropic(),
  new SinkMany([new SinkStream(), new SinkFile("/tmp/shuntly.jsonl")]),
);
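SinkMany is a fan-out: every record is forwarded to each child sink, and closing the combined sink closes them all. The pattern can be sketched with stand-in types (illustrative, not Shuntly's implementation):

```typescript
// Sketch of the fan-out pattern behind a multi-sink: forward every
// record to each child. Stand-in types; not Shuntly's implementation.
interface DemoSink {
  write(record: object): void;
  close(): void;
}

class FanOut implements DemoSink {
  constructor(private sinks: DemoSink[]) {}
  write(record: object): void {
    for (const s of this.sinks) s.write(record);
  }
  close(): void {
    for (const s of this.sinks) s.close();
  }
}
```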

Custom Sinks

A custom sink is any class that implements the Sink interface:

import { Sink, ShuntlyRecord } from "shuntly";

class SinkConsole implements Sink {
  write(record: ShuntlyRecord): void {
    console.log(record.client, record.method, record.durationMs);
  }
  close(): void {}
}

What is New in Shuntly

0.8.0

Added support for Ollama interfaces chat and generate.

Added SinkRotating for rotating log handling.

Improved implementation of wrapAsyncIterable.

0.7.0

Added support for pi-ai interfaces completeSimple and streamSimple.

0.6.0

Added support for pi-ai interfaces complete and stream.

0.5.0

Corrected interleaved writes in SinkPipe.

0.4.0

Added README.md, ci.yml, and additional configuration.

0.3.0

Initial release.