cogslayer

v0.1.1

Published

Lightweight TypeScript SDK for LLM cost tracking and attribution.

Readme

CogsLayer TypeScript SDK

Know the cost of every LLM call your Node app makes. CogsLayer tracks tokens, estimated cost, latency, provider, model, feature, team, user, and session context.

The SDK is ESM-only and requires Node 20 or newer.

Install

npm install cogslayer

Install the provider SDKs you use:

npm install openai
npm install @anthropic-ai/sdk
npm install @google/genai

Quick Start

import { init, track } from "cogslayer";
import { OpenAI } from "cogslayer/openai";

await init({
  apiKey: process.env.COGSLAYER_API_KEY!,
  service: "my-api",
  environment: "production",
});

const client = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

export async function ask(prompt: string): Promise<string> {
  return track({ feature: "chat", team: "growth" }, async () => {
    const response = await client.chat.completions.create({
      model: "gpt-4o",
      messages: [{ role: "user", content: prompt }],
    });

    return response.choices[0]?.message?.content ?? "";
  });
}

Public API

The TypeScript SDK uses camelCase names only.

import {
  checkAlerts,
  estimate,
  exportCsv,
  getContext,
  getCostSummary,
  getInsights,
  getTimeseries,
  init,
  registerModel,
  resetContext,
  session,
  setContext,
  shutdown,
  track,
} from "cogslayer";

Use apiKey, baseUrl, flushInterval, userId, requestId, sessionId, and groupBy. There are no snake_case aliases in the TypeScript package.
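As a rough sketch, the camelCase option shapes described above might look like the following. These interfaces are illustrative assumptions for demonstration, not the SDK's published type definitions:

```typescript
// Illustrative sketch of the camelCase option names listed above.
// These interfaces are assumptions, not the SDK's real type definitions.
interface InitOptions {
  apiKey: string;
  baseUrl?: string;        // override the default API endpoint
  flushInterval?: number;  // how often queued events are flushed, in ms
  service?: string;
  environment?: string;
}

interface TrackContext {
  feature?: string;
  team?: string;
  userId?: string;
  requestId?: string;
  sessionId?: string;
}

const options: InitOptions = {
  apiKey: "ck_example",
  baseUrl: "https://api.cogslayer.example/v1",
  flushInterval: 5_000,
};

console.log(options.flushInterval); // 5000
```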

Provider Clients

Provider wrappers live in separate entry points, so your app only loads the provider SDKs it imports.

import { OpenAI } from "cogslayer/openai";
import { Anthropic } from "cogslayer/anthropic";
import { Client as GeminiClient } from "cogslayer/gemini";

The wrappers subclass the provider SDKs and preserve provider method signatures, including request options such as abort signals, headers, timeouts, and retries.

OpenAI

import { OpenAI } from "cogslayer/openai";

const client = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

const response = await client.chat.completions.create({
  model: "gpt-4o",
  messages: [{ role: "user", content: "Hello" }],
});

Tracks chat.completions.create(), chat.completions.parse(), chat.completions.stream(), responses.create(), and supported audio APIs.

OpenAI-Compatible Providers

const groq = new OpenAI({
  baseURL: "https://api.groq.com/openai/v1",
  apiKey: process.env.GROQ_API_KEY,
  provider: "groq",
});

Use provider when an OpenAI-compatible API should be reported under its own provider name.

Anthropic

import { Anthropic } from "cogslayer/anthropic";

const client = new Anthropic({ apiKey: process.env.ANTHROPIC_API_KEY });

const message = await client.messages.create({
  model: "claude-sonnet-4-5",
  max_tokens: 1024,
  messages: [{ role: "user", content: "Hello" }],
});

Tracks messages.create() and messages.stream(). Cache read and cache creation tokens are captured when Anthropic returns them.

Google Gemini

import { Client as GeminiClient } from "cogslayer/gemini";

const client = new GeminiClient({ apiKey: process.env.GEMINI_API_KEY });

const response = await client.models.generateContent({
  model: "gemini-2.0-flash",
  contents: "Hello",
});

Tracks models.generateContent() and stream responses that expose usage metadata.

Attribution

Use track(context, fn) to attach feature, team, user, tenant, service, request, or custom metadata to every provider call made inside a scoped block.

export async function summarize(text: string) {
  return track({
    feature: "summaries",
    team: "growth",
    userId: "u_123",
    tenant: "acme",
  }, async () => {
    return client.chat.completions.create({
      model: "gpt-4o-mini",
      messages: [{ role: "user", content: text }],
    });
  });
}

If your project uses TypeScript 5 standard decorators, track(context) also works as a method decorator:

class Summarizer {
  @track({ feature: "summaries", team: "growth" })
  async summarize(text: string) {
    return client.chat.completions.create({
      model: "gpt-4o-mini",
      messages: [{ role: "user", content: text }],
    });
  }
}

Context is propagated with Node AsyncLocalStorage, so attribution follows async work created inside the tracked scope.
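The propagation mechanism can be seen with Node's AsyncLocalStorage directly. This standalone sketch mimics how a tracked scope's attribution could follow awaited work; it does not use CogsLayer itself:

```typescript
import { AsyncLocalStorage } from "node:async_hooks";

// A minimal stand-in for a context store like the one CogsLayer uses.
const store = new AsyncLocalStorage<{ feature: string }>();

function currentFeature(): string | undefined {
  return store.getStore()?.feature;
}

async function fakeProviderCall(): Promise<string> {
  // Even after crossing an await boundary, the scoped context is visible.
  await new Promise((resolve) => setTimeout(resolve, 10));
  return currentFeature() ?? "none";
}

async function main() {
  const result = await store.run({ feature: "summaries" }, () =>
    fakeProviderCall(),
  );
  console.log(result); // "summaries"
}

main();
```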

Sessions

Use session() to group multi-step agent work under one session id.

await session("support-agent").run(async () => {
  await client.chat.completions.create({
    model: "gpt-4o",
    messages: [{ role: "user", content: "Plan the answer" }],
  });

  await client.chat.completions.create({
    model: "gpt-4o-mini",
    messages: [{ role: "user", content: "Draft the reply" }],
  });
});

Streaming

Streaming is supported for OpenAI, Anthropic, and Gemini surfaces that expose final usage metadata. For OpenAI streaming calls, CogsLayer adds stream_options.include_usage when needed.

const stream = await client.chat.completions.create({
  model: "gpt-4o",
  messages: [{ role: "user", content: "Hello" }],
  stream: true,
});

for await (const chunk of stream) {
  process.stdout.write(chunk.choices[0]?.delta?.content ?? "");
}

What Gets Tracked

| Field | Description |
|---|---|
| provider | Provider name, such as openai, anthropic, gemini, or a custom OpenAI-compatible provider |
| model | Model name returned by the provider |
| prompt_tokens | Input tokens |
| completion_tokens | Output tokens |
| total_tokens | Input plus output tokens |
| cost_usd | Estimated cost from the local or platform pricing registry |
| latency_ms | Provider call duration |
| is_stream | Whether the call used streaming |
| reasoning_tokens | Reasoning tokens when returned by the provider |
| cached_tokens | Cached input tokens when returned by the provider |
| has_tool_calls | Whether the response included tool calls |
| api_type | API surface, such as chat, responses, messages, generate, or audio types |

Attribution fields from track(), setContext(), init(), and session() are sent with each event.
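Putting the usage fields and attribution fields together, a tracked event might look like the following sketch. The interface is a hypothetical illustration based on the table above; the SDK's actual wire format is not documented here:

```typescript
// Hypothetical shape of a tracked event combining the usage fields from the
// table above with attribution context. This is an illustration, not the
// SDK's actual event schema.
interface CostEvent {
  provider: string;
  model: string;
  prompt_tokens: number;
  completion_tokens: number;
  total_tokens: number;
  cost_usd: number;
  latency_ms: number;
  is_stream: boolean;
  // Attribution from track(), setContext(), init(), and session().
  feature?: string;
  team?: string;
  userId?: string;
  sessionId?: string;
}

const event: CostEvent = {
  provider: "openai",
  model: "gpt-4o",
  prompt_tokens: 12,
  completion_tokens: 40,
  total_tokens: 52,
  cost_usd: 0.00043,
  latency_ms: 820,
  is_stream: false,
  feature: "chat",
};

console.log(event.total_tokens); // 52
```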

Platform Helpers

After init(), helper methods query the CogsLayer API with your SDK key.

const summary = await getCostSummary({ groupBy: "model" });
const timeseries = await getTimeseries({ interval: "day", groupBy: "feature" });
const insights = await getInsights({ days: 30 });
const alerts = await checkAlerts();
const csv = await exportCsv({ limit: 10_000 });

Custom Pricing

Register prices for fine-tuned or internal models. Prices are per 1,000 tokens.

import { estimate, registerModel } from "cogslayer";

registerModel("my-fine-tuned-model", 0.006, 0.012);

const estimateResult = estimate({
  model: "my-fine-tuned-model",
  prompt: "Summarize this support ticket",
  maxTokens: 300,
});
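For intuition, per-1,000-token pricing implies cost arithmetic like the sketch below. computeCost is a hypothetical helper for illustration, not part of the SDK:

```typescript
// Hypothetical cost math for per-1,000-token prices; an illustration of the
// arithmetic, not the SDK's internals.
function computeCost(
  promptTokens: number,
  completionTokens: number,
  inputPricePer1k: number,
  outputPricePer1k: number,
): number {
  return (
    (promptTokens / 1000) * inputPricePer1k +
    (completionTokens / 1000) * outputPricePer1k
  );
}

// With the registerModel prices above ($0.006 in / $0.012 out per 1k tokens):
const cost = computeCost(1000, 500, 0.006, 0.012);
console.log(cost.toFixed(4)); // "0.0120"
```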

Shutdown

Call shutdown() before the process exits to flush any queued events.

await shutdown();

Running Locally

npm install
npm test

The test command typechecks, builds, and runs a runtime smoke test against the compiled package output.

Publishing

The package includes a publish guard:

npm publish --access public

prepublishOnly runs npm test, so publish fails if typechecking, build, or the smoke test fails.