
@corelay/agent

Zero-dependency telemetry SDK for AI products. Reports health, errors, custom metrics, LLM usage (OpenAI, Anthropic, Bedrock), and HTTP request metrics to a Corelay Command Centre over plain HTTP. Designed to survive transport failures without blocking the main thread.

npm install @corelay/agent

Requires Node.js 18+. TypeScript-native. No runtime dependencies.


Why it exists

Production AI services accumulate observability requirements quickly: health checks, cost tracking per provider and feature, structured errors, request-level metrics, LLM call latency. Most existing tools solve one of these well and the rest badly, and charge per seat per month.

@corelay/agent is the opposite. One init() call, five overlapping concerns covered, batched transport with cap-and-drop on failure, zero dependencies to audit.


Quick start

import { init } from '@corelay/agent';

const agent = init({
  commandCenterUrl: 'https://your-command-centre.example.com',
  productId: 'your-service-name',
  apiKey: process.env.CORELAY_API_KEY,
});

agent.start();

That's it. By default:

  • Heartbeats flush every 30s with uptime + memory
  • Uncaught exceptions and unhandled promise rejections are captured
  • Metrics and telemetry batches flush every 10s over HTTP
  • A graceful stop() drains the final batch

Express middleware

app.use(agent.expressMiddleware());

Tracks method, path, status code, and duration of every request. Aggregated at the Command Centre into P50 / P95 / P99 latency per route.
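
For a fuller picture, here is a minimal end-to-end wiring sketch; the /health route and port are illustrative, not part of the SDK:

import express from 'express';
import { init } from '@corelay/agent';

const agent = init({
  commandCenterUrl: 'https://your-command-centre.example.com',
  productId: 'your-service-name',
  apiKey: process.env.CORELAY_API_KEY,
});
agent.start();

const app = express();

// Register the middleware before your routes so every request is timed.
app.use(agent.expressMiddleware());

app.get('/health', (_req, res) => res.json({ ok: true }));

app.listen(3000);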


LLM tracking

Bedrock (auto-tracking)

import { BedrockRuntimeClient } from '@aws-sdk/client-bedrock-runtime';

const raw = new BedrockRuntimeClient({ region: 'eu-west-2' });
const bedrock = agent.wrapBedrock(raw);

// Every InvokeModel / Converse call is now tracked automatically.
// The wrapper preserves the original client's type signature.

OpenAI (auto-tracking)

import OpenAI from 'openai';

const raw = new OpenAI();
const openai = agent.wrapOpenAI(raw);

// chat.completions.create calls are tracked automatically.

Manual tracking

agent.trackLLM(
  'anthropic',      // provider
  'claude-sonnet-4', // model
  inputTokens,
  outputTokens,
  costUsd,
  durationMs,
  'summarise-case'  // feature tag (optional)
);

Costs accumulate at the Command Centre per product, per provider, per feature — useful for attributing LLM spend to individual product features.
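
As an illustration, a sketch of manual tracking around an Anthropic SDK call. The token rates below are placeholders; substitute your provider's actual pricing:

import Anthropic from '@anthropic-ai/sdk';

const anthropic = new Anthropic();

const RATE_IN = 3 / 1_000_000;   // placeholder $ per input token
const RATE_OUT = 15 / 1_000_000; // placeholder $ per output token

const started = Date.now();
const response = await anthropic.messages.create({
  model: 'claude-sonnet-4',
  max_tokens: 1024,
  messages: [{ role: 'user', content: 'Summarise this case.' }],
});

agent.trackLLM(
  'anthropic',
  'claude-sonnet-4',
  response.usage.input_tokens,
  response.usage.output_tokens,
  response.usage.input_tokens * RATE_IN + response.usage.output_tokens * RATE_OUT,
  Date.now() - started,
  'summarise-case'
);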


Error tracking

Uncaught exceptions and unhandled rejections are captured automatically. For manual tracking:

agent.trackError(new Error('payment webhook failed'), { userId: 'u_123' });

Errors are batched with a local cap so a flood of errors can't exhaust memory.
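
A typical manual-capture pattern, sketched below; processPayment and the context fields are illustrative, not part of the SDK:

async function handlePaymentWebhook(payload: { userId: string }) {
  try {
    await processPayment(payload); // your own business logic
  } catch (err) {
    agent.trackError(err instanceof Error ? err : new Error(String(err)), {
      userId: payload.userId,
      source: 'payment-webhook',
    });
    throw err; // tracking is non-blocking; rethrow so callers still see the failure
  }
}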


Custom metrics

agent.trackMetric('queue.depth', 42, { queue: 'emails' });
agent.trackMetric('safevoice.triage.latency_ms', 1234, { tenant: 'uk' });

Any numeric value with arbitrary tags. The Command Centre aggregates by name and tags.
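
One common pattern is a small timing helper built on trackMetric. This helper is a sketch, not part of the SDK; runTriage and caseId stand in for your own async work:

// Hypothetical helper: time any async operation and report its
// duration as a custom metric with the given tags.
async function timed<T>(
  name: string,
  tags: Record<string, string>,
  fn: () => Promise<T>
): Promise<T> {
  const started = Date.now();
  try {
    return await fn();
  } finally {
    agent.trackMetric(`${name}.latency_ms`, Date.now() - started, tags);
  }
}

// runTriage / caseId are stand-ins for your own code.
const result = await timed('triage', { tenant: 'uk' }, () => runTriage(caseId));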


Configuration

| Option            | Type    | Default  | Description                                    |
|-------------------|---------|----------|------------------------------------------------|
| commandCenterUrl  | string  | required | Base URL of your Command Centre deployment     |
| productId         | string  | required | Unique identifier for this service             |
| apiKey            | string  | —        | Bearer token for authenticated Command Centre  |
| heartbeatInterval | number  | 30000    | Health report cadence in ms                    |
| flushInterval     | number  | 10000    | Metrics/errors/LLM batch flush cadence in ms   |
| enabled           | boolean | true     | Set false to noop everything (useful in tests) |
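
Putting it together, a fully specified init call; values shown are the documented defaults where a default exists:

const agent = init({
  commandCenterUrl: 'https://your-command-centre.example.com', // required
  productId: 'your-service-name',                              // required
  apiKey: process.env.CORELAY_API_KEY,      // omit if unauthenticated
  heartbeatInterval: 30_000,                // default: 30s
  flushInterval: 10_000,                    // default: 10s
  enabled: process.env.NODE_ENV !== 'test', // noop everything in tests
});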


Design choices

  • Zero runtime dependencies. Nothing to audit, no supply-chain surprises. Pure TypeScript, compiled to dual ESM + CJS output.
  • Batched HTTP transport with drop-on-failure. If the Command Centre is down, metrics are dropped rather than queued unboundedly. An in-memory cap prevents error floods from exhausting heap.
  • Non-blocking timers. Flush timers use .unref() so they don't keep the process alive on shutdown.
  • Type-preserving LLM wrappers. wrapBedrock<T extends object>(client: T): T returns the same type as the input. Instrumentation is invisible at the type level; a sketch of the pattern follows this list.
  • Product-agnostic. One SDK, any Node.js service. Used today in five production AI products (SafeVoice, Endorsd, Keepa, Corelay VIP, BuildWithAI).
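
To make the type-preserving claim concrete, here is a generic sketch of the wrapper pattern using a Proxy. It illustrates the technique rather than the library's actual implementation, and it assumes the wrapped methods are async:

// Sketch only: intercept method calls, time them, and forward
// everything else untouched. The Proxy keeps the input type T.
function wrapWithTiming<T extends object>(client: T, onCall: (ms: number) => void): T {
  return new Proxy(client, {
    get(target, prop, receiver) {
      const value = Reflect.get(target, prop, receiver);
      if (typeof value !== 'function') return value;
      return async (...args: unknown[]) => {
        const started = Date.now();
        try {
          return await value.apply(target, args);
        } finally {
          onCall(Date.now() - started);
        }
      };
    },
  }) as T;
}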

Architecture

┌────────────────────────────────────────────┐
│            Your Node.js Service            │
│                                            │
│  ┌──────────────────────────────────────┐  │
│  │            @corelay/agent            │  │
│  │                                      │  │
│  │  ┌────────────────────────────────┐  │  │
│  │  │ HealthCollector (30s timer)    │  │  │
│  │  │ ErrorCollector (hooks proc)    │  │  │
│  │  │ LLMTracker (wraps clients)     │  │  │
│  │  │ RequestTracker (middleware)    │  │  │
│  │  └────────────────────────────────┘  │  │
│  │                  │                   │  │
│  │                  ▼                   │  │
│  │  ┌────────────────────────────────┐  │  │
│  │  │ HttpTransport (batched POST)   │  │  │
│  │  └────────────────────────────────┘  │  │
│  └──────────────────────────────────────┘  │
└─────────────────────┬──────────────────────┘
                      │  POST /telemetry (every flushInterval)
                      ▼
      ┌──────────────────────────────┐
      │    Corelay Command Centre    │
      │  (aggregation, dashboards,   │
      │   AI-assisted diagnosis)     │
      └──────────────────────────────┘

Source layout:

src/
├── collectors/
│   ├── health.ts        # Heartbeats
│   ├── errors.ts        # Uncaught + manual errors
│   ├── llm.ts           # Bedrock + OpenAI wrappers, manual tracking
│   └── requests.ts      # Express middleware
├── transports/
│   └── http.ts          # Batched HTTP POST with retry backoff
├── types.ts             # All public types
├── register.ts          # One-time registration handshake
└── index.ts             # Public API

Graceful shutdown

process.on('SIGTERM', async () => {
  await agent.stop(); // Flushes final batch, clears timers
  process.exit(0);
});

Production use

@corelay/agent is embedded in every Corelay-managed product:

  • SafeVoice — AI crisis response for DA/GBV survivors (UK + Nigeria)
  • Endorsd — AI advisor for UK Global Talent Visa applicants
  • Keepa — ML credit scoring for informal-economy traders
  • Corelay VIP — vehicle intelligence for Nigeria (Apache Flink CEP)
  • BuildWithAI — AI school for non-technical Nigerian learners

Each product sends telemetry to a shared Corelay Command Centre, which provides cross-product observability and AI-assisted incident diagnosis.


Contributing

Contributions welcome. See CONTRIBUTING.md.

License

MIT — see LICENSE.

© 2026 Corelay Ltd (UK).