
@node-llm/monitor (v0.3.0)

Production-grade observability and monitoring for NodeLLM.

NodeLLM Monitor 🛰️

Advanced, infrastructure-first monitoring for NodeLLM.

[Screenshots: dashboard metrics view, token analytics, and dashboard traces view]

Features

  • 📊 Real-time Metrics - Track requests, costs, latency, and error rates
  • 🔍 Request Tracing - Detailed execution flow with tool calls
  • 💰 Cost Analysis - Per-provider and per-model cost breakdown
  • 📈 Time Series Charts - Visualize trends over time
  • 🔌 Pluggable Storage - Memory, File, or Prisma adapters
  • 🛡️ Privacy First - Content scrubbing and PII protection

Setup

1. Database Schema (Prisma)

Add the following model to your schema.prisma:

model monitoring_events {
  id             String   @id @default(uuid())
  eventType      String   // request.start, request.end, tool.start, etc.
  requestId      String
  sessionId      String?
  transactionId  String?
  time           DateTime @default(now())
  duration       Int?     // duration in ms
  cost           Float?
  cpuTime        Float?
  gcTime         Float?
  allocations    Int?
  payload        Json     // Stores metadata, tokens, and optional content
  createdAt      DateTime @default(now())
  provider       String
  model          String

  @@index([requestId])
  @@index([sessionId])
  @@index([transactionId])
}

Then run the migration to create the table:

npx prisma migrate dev --name add_monitoring_events

Note: For non-Prisma users, a raw SQL migration is available at migrations/001_create_monitoring_events.sql.
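For orientation, a migration equivalent to the Prisma model above might look roughly like the following. This is an illustrative sketch in PostgreSQL syntax; the shipped migrations/001_create_monitoring_events.sql is authoritative and may differ in types and index names.

```sql
-- Illustrative only; see migrations/001_create_monitoring_events.sql
CREATE TABLE monitoring_events (
  id              UUID PRIMARY KEY,
  "eventType"     TEXT NOT NULL,
  "requestId"     TEXT NOT NULL,
  "sessionId"     TEXT,
  "transactionId" TEXT,
  time            TIMESTAMP NOT NULL DEFAULT now(),
  duration        INTEGER,
  cost            DOUBLE PRECISION,
  "cpuTime"       DOUBLE PRECISION,
  "gcTime"        DOUBLE PRECISION,
  allocations     INTEGER,
  payload         JSONB NOT NULL,
  "createdAt"     TIMESTAMP NOT NULL DEFAULT now(),
  provider        TEXT NOT NULL,
  model           TEXT NOT NULL
);

CREATE INDEX monitoring_events_request_idx ON monitoring_events ("requestId");
CREATE INDEX monitoring_events_session_idx ON monitoring_events ("sessionId");
```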

2. Integration

import { createLLM } from "@node-llm/core";
import { createPrismaMonitor } from "@node-llm/monitor";
import { prisma } from "./db";

// Create monitor with Prisma storage
const monitor = createPrismaMonitor(prisma, {
  captureContent: true // Optional: capture prompts/responses (scrubbed by default)
});

// Attach monitor as middleware - it automatically tracks all requests
const llm = createLLM({
  provider: "openai",
  model: "gpt-4o-mini",
  middlewares: [monitor]
});

3. Adapters (Memory / File)

NodeLLM Monitor includes built-in adapters for development and logging.

import { Monitor, createFileMonitor } from "@node-llm/monitor";

// 1. In-Memory (Great for Dev/CI)
const memoryMonitor = Monitor.memory();

// 2. File-based (Persistent JSON log)
const fileMonitor = createFileMonitor("./monitoring.log");

Pluggable Storage (Non-Prisma)

While @node-llm/monitor provides a first-class Prisma adapter, it is designed with a pluggable architecture. You can use any database (PostgreSQL, SQLite, Redis, etc.) by implementing the MonitoringStore interface.

1. Manual Table Creation

If you aren't using Prisma, use our raw SQL migration: migrations/001_create_monitoring_events.sql

2. Implement Custom Store

import { Monitor, MonitoringStore, MonitoringEvent, Stats } from "@node-llm/monitor";

class CustomStore implements MonitoringStore {
  async saveEvent(event: MonitoringEvent): Promise<void> {
    // Your DB logic here: INSERT INTO monitoring_events ...
  }

  async getStats(filter?: { from?: Date }): Promise<Stats> {
    // Return aggregated stats for the dashboard
    return {
      totalRequests: 0,
      totalCost: 0,
      avgDuration: 0,
      errorRate: 0
    };
  }
}

const monitor = new Monitor({ store: new CustomStore() });
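To make the store contract concrete, here is a self-contained in-memory implementation. The types are re-declared locally for illustration (the real package exports MonitoringStore, MonitoringEvent, and Stats), and the aggregation logic is an assumption about how the dashboard derives its numbers, not a copy of the package's internals:

```typescript
// Local stand-ins for the package's exported types (illustrative only).
interface MonitoringEvent {
  eventType: string;          // e.g. "request.end", "request.error"
  requestId: string;
  time: Date;
  duration?: number;          // ms
  cost?: number;
  provider: string;
  model: string;
  payload: Record<string, unknown>;
}

interface Stats {
  totalRequests: number;
  totalCost: number;
  avgDuration: number;
  errorRate: number;
}

class InMemoryStore {
  private events: MonitoringEvent[] = [];

  async saveEvent(event: MonitoringEvent): Promise<void> {
    this.events.push(event);
  }

  async getStats(filter?: { from?: Date }): Promise<Stats> {
    const events = filter?.from
      ? this.events.filter((e) => e.time >= filter.from!)
      : this.events;
    // Assumption: each request emits exactly one terminal event,
    // either "request.end" or "request.error".
    const finished = events.filter(
      (e) => e.eventType === "request.end" || e.eventType === "request.error"
    );
    const errors = finished.filter((e) => e.eventType === "request.error");
    const durations = finished.map((e) => e.duration ?? 0);
    return {
      totalRequests: finished.length,
      totalCost: finished.reduce((sum, e) => sum + (e.cost ?? 0), 0),
      avgDuration: durations.length
        ? durations.reduce((a, b) => a + b, 0) / durations.length
        : 0,
      errorRate: finished.length ? errors.length / finished.length : 0,
    };
  }
}
```

The same shape works for any backend: replace the array with an INSERT in saveEvent and an aggregate query in getStats.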

Dashboard

NodeLLM Monitor includes a high-performance built-in dashboard for real-time observability.

Metrics View

Track total requests, costs, response times, and error rates at a glance. View usage breakdown by provider and model with interactive time-series charts.


Traces View

Inspect individual requests with full execution flow, including tool calls, timing, and request/response content.


Launch the Dashboard

import express from "express";
import { PrismaClient } from "@prisma/client";
import { createMonitorMiddleware } from "@node-llm/monitor/ui";

const prisma = new PrismaClient();
const app = express();

// Create dashboard - pass Prisma client or any MonitoringStore
// Dashboard handles its own routing under basePath
app.use(
  createMonitorMiddleware(prisma, {
    basePath: "/monitor",
    cors: false
  })
);

// OR, given an existing monitor instance, use the ergonomic shorthand:
app.use(monitor.api({ basePath: "/monitor" }));

app.listen(3000, () => {
  console.log("Dashboard available at http://localhost:3000/monitor");
});

Operational Metadata

Capture granular operational metrics without changing execution semantics:

import { Monitor, createPrismaMonitor } from "@node-llm/monitor";

const monitor = createPrismaMonitor(prisma);

// Enrich with environment context
let payload = monitor.enrichWithEnvironment(
  {},
  {
    serviceName: "hr-api",
    environment: "production"
  }
);

// Add timing breakdown for debugging
payload = monitor.enrichWithTiming(payload, {
  queueTime: 5,
  networkTime: 45,
  providerLatency: 850
});

// Track retries for reliability analysis
payload = monitor.enrichWithRetry(payload, {
  retryCount: 2,
  retryReason: "rate_limit"
});

Generic Usage (Non-NodeLLM)

While optimized as a native middleware for NodeLLM, the monitor is a generic telemetry engine. You can use it manually with any library (Vercel AI SDK, LangChain, or raw OpenAI):

import { Monitor } from "@node-llm/monitor";
import type { MinimalContext } from "@node-llm/monitor";

const monitor = Monitor.memory();

// Create context object (implements MinimalContext interface)
const ctx: MinimalContext = {
  requestId: "req_123",
  provider: "openai",
  model: "gpt-4o",
  state: {} // Required for metrics tracking
};

// 1. Start tracking
await monitor.onRequest(ctx);

// 2. Track tool calls (optional)
await monitor.onToolCallStart(ctx, { id: "call_1", function: { name: "get_weather" } });
await monitor.onToolCallEnd(ctx, { id: "call_1" }, "22°C");

// 3. Finalize with result
await monitor.onResponse(ctx, {
  toString: () => "The weather is 22°C",
  usage: { input_tokens: 100, output_tokens: 50, cost: 0.002 }
});

OpenTelemetry Support

For zero-code instrumentation of libraries like Vercel AI SDK, use our OpenTelemetry bridge:

import { NodeTracerProvider } from "@opentelemetry/sdk-trace-node";
import { NodeLLMSpanProcessor } from "@node-llm/monitor-otel";
import { Monitor } from "@node-llm/monitor";

const monitor = Monitor.memory();
const provider = new NodeTracerProvider();
provider.addSpanProcessor(new NodeLLMSpanProcessor(monitor.getStore()));
provider.register();

See @node-llm/monitor-otel for more details.

Privacy

By default, captureContent is false. This ensures that personally identifiable information (PII) is not persisted in your monitoring logs unless content capture is explicitly enabled for debugging.
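When content capture is enabled, prompts and responses pass through a scrubber before they are persisted. As a generic sketch of what such scrubbing involves (the package's actual redaction rules are not specified here; the patterns below are illustrative assumptions, not the library's implementation):

```typescript
// Illustrative PII scrubber: redacts email addresses and US SSN-shaped
// strings before content is written to storage. Real scrubbing rules
// in @node-llm/monitor may cover more categories and differ in detail.
const EMAIL = /[\w.+-]+@[\w-]+\.[\w.]+/g;
const SSN = /\b\d{3}-\d{2}-\d{4}\b/g;

function scrubContent(text: string): string {
  return text.replace(EMAIL, "[EMAIL]").replace(SSN, "[SSN]");
}
```

Regex-based scrubbing is a baseline, not a guarantee; treat captured content as sensitive regardless.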