cenglu

v2.2.0

Fast, zero-dependencies, secure logger for Node.js that doesn't suck. Built for production, not resumes.

npm install cenglu

import { createLogger } from "cenglu";

const logger = createLogger({
  service: "my-app",
  level: "info",
});
logger.info("server started", { port: 3000 });

Why another logger?

  • You're tired of heavy, leaky, or hard-to-configure loggers.
  • cenglu focuses on security (built-in redaction), performance, and a small, predictable API.

Quick start

Basic usage

import { createLogger } from "cenglu";

// Create a logger
const logger = createLogger({
  service: "my-app",
  level: "info",
});

// Basic logging
logger.info("Application started", { port: 3000 });
logger.warn("Deprecated API called", { endpoint: "/v1/users" });
logger.error("Failed to connect", new Error("Connection refused"));

// Child loggers for request context (shares transports/config)
const requestLogger = logger.child({ requestId: "abc-123" });
requestLogger.info("Processing request");

// Lightweight bound logger for temporary bindings
logger.with({ userId: 123 }).info("User action", { action: "login" });

// Timer for measuring durations
const done = logger.time("database-query");
await db.query("SELECT * FROM users");
done(); // Logs: "database-query completed" { durationMs: 42 }

// Timer helpers (alternatives to calling done() directly)
const ms = done.elapsed(); // elapsed milliseconds, without logging
done.endWithContext({ rowCount: 10 }); // end and log with merged context

Pretty logs for development

const logger = createLogger({
  pretty: { enabled: process.env.NODE_ENV !== "production" },
});

// Outputs colored, formatted logs in dev; JSON in production

Stop leaking secrets

const logger = createLogger({
  redaction: { enabled: true },
});

logger.info("user registered", {
  email: "[email protected]",
  password: "super-secret", // -> redacted
  creditCard: "4242-4242-4242", // -> redacted
  apiKey: "sk_live_abc123", // -> redacted
});

Real-world examples

Express app with request tracking

import { createLogger, expressMiddleware } from "cenglu";

const logger = createLogger({ service: "api" });

app.use(
  expressMiddleware(logger, {
    logRequests: true,
    logResponses: true,
  })
);

app.post("/payment", (req, res) => {
  // req.logger is bound to the request (includes requestId)
  req.logger.info("processing payment", { amount: req.body.amount });

  try {
    const result = processPayment(req.body);
    req.logger.info("payment successful", { transactionId: result.id });
    res.json(result);
  } catch (error) {
    req.logger.error("payment failed", error);
    res.status(500).json({ error: "Payment failed" });
  }
});

Microservice with distributed tracing

const logger = createLogger({
  service: "order-service",
  correlationId: () => crypto.randomUUID(),
  traceProvider: () => ({ traceId: /* from tracer */ "", spanId: "" }),
});

// All logs include correlation ID automatically
async function processOrder(order) {
  const orderLogger = logger.child({ orderId: order.id });

  orderLogger.info("processing order");
  await validateInventory(order);
  orderLogger.info("inventory validated");

  await chargePayment(order);
  orderLogger.info("payment processed");

  return order;
}

High-volume service with batching (adapter example)

import { createLogger, createHttpTransport, createBufferedTransport } from "cenglu";

// Build a buffered transport to send batches to Datadog
const transport = createBufferedTransport(
  createHttpTransport("datadog", {
    apiKey: process.env.DD_API_KEY,
  }),
  {
    bufferSize: 1000,
    flushInterval: 5000,
    maxBatchSize: 100,
  }
);

const logger = createLogger({
  level: "info",
  adapters: [
    {
      name: "datadog-adapter",
      // adapters can be sync or async (return Promise)
      handle: async (record) => transport.write(record, JSON.stringify(record), false),
    },
  ],
});

Features that matter

Security-first redaction

  • Built-in patterns for common secrets (credit cards, emails, JWTs, API keys, passwords).
  • Redaction applies to msg, context, and err via the Redactor.
  • redaction options support paths, patterns, and a customRedactor function.
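To make the interplay of those options concrete, here is an illustrative sketch (not cenglu's internal Redactor — `redactPaths` and `applyPatterns` are hypothetical helpers) of how path-based and pattern-based rules from the `redaction` options might compose:

```typescript
// Hypothetical helpers sketching path- and pattern-based redaction.
type PatternRule = { pattern: RegExp; replacement: string };

// Replace values at the named context keys (top-level only in this sketch).
function redactPaths(
  ctx: Record<string, unknown>,
  paths: string[]
): Record<string, unknown> {
  const out: Record<string, unknown> = { ...ctx };
  for (const p of paths) {
    if (p in out) out[p] = "[REDACTED]";
  }
  return out;
}

// Apply each regex rule to the message string in order.
function applyPatterns(msg: string, rules: PatternRule[]): string {
  return rules.reduce((s, r) => s.replace(r.pattern, r.replacement), msg);
}
```

A `customRedactor`, when provided, would run in addition to these built-in rules; check the redaction options for the exact precedence.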

Fast and predictable

  • Designed for high throughput and low overhead.
  • Sampling support to reduce verbose log volume.
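A per-level sampling gate shaped like the documented `sampling` options (`{ rates, defaultRate }`) can be sketched as follows; this is a conceptual model, not cenglu's actual implementation:

```typescript
// Hypothetical sampling gate: each level keeps roughly `rate` of its records.
type Sampling = { rates: Record<string, number>; defaultRate: number };

function shouldSample(
  level: string,
  sampling: Sampling,
  rng: () => number = Math.random
): boolean {
  // Fall back to defaultRate for levels without an explicit rate.
  const rate = sampling.rates[level] ?? sampling.defaultRate;
  return rng() < rate;
}
```

With `rates: { trace: 0.1 }` and `defaultRate: 1`, roughly one in ten trace records passes while info and above always pass.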

Change log level without restart

  • Programmatically: logger.setLevel("debug") and logger.getLevel().
  • logger.isLevelEnabled(level) is provided to guard expensive computations.
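The threshold check behind setLevel()/isLevelEnabled() reduces to an ordering over the documented levels (trace|debug|info|warn|error|fatal); a minimal sketch:

```typescript
// Level ordering, lowest to highest severity, as documented.
const LEVELS = ["trace", "debug", "info", "warn", "error", "fatal"] as const;
type Level = (typeof LEVELS)[number];

// A record is emitted when its level is at or above the current minimum.
function isEnabled(current: Level, candidate: Level): boolean {
  return LEVELS.indexOf(candidate) >= LEVELS.indexOf(current);
}
```

The same ordering is why `logger.isLevelEnabled("debug")` is the right guard around expensive context-building before a `logger.debug(...)` call.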

Ship to any backend

  • Adapters and transports let you forward logs to custom destinations.
  • Adapters may include an optional level to filter which records they receive.

Plugin system

  • Plugins are initialized in order and may implement hooks:
    • onInit(logger)
    • onRecord(record) — can return null to drop a record, return a transformed record, or return undefined to leave it unchanged.
    • onFormat(record, formatted) — may replace formatted output
    • onWrite(record, formatted)
    • onFlush()
    • onClose()
  • Plugin errors are caught and written to stderr; they don't crash the process.

File transport configuration & rotation

  • File transport is disabled by default (enable with options or env).
  • Rotation options can be set via environment variables:
    • LOG_ROTATE_DAYS — rotation interval in days
    • LOG_MAX_BYTES — maximum bytes before rotation
    • LOG_MAX_FILES — number of rotated files to keep
    • LOG_COMPRESS — "gzip" to compress rotated files, "false"/"0" to disable compression
    • LOG_RETENTION_DAYS — how long to keep rotated logs
    • LOG_DIR — directory to write logs
  • The file transport supports writing a separate errors file when configured.
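As a sketch, an environment configuring rotation with the variables listed above might look like this (the values shown are arbitrary examples, not defaults):

```shell
# Example file-transport environment; variable names from the list above.
export LOG_DIR=/var/log/my-app
export LOG_MAX_BYTES=10485760   # rotate after ~10 MB
export LOG_MAX_FILES=5
export LOG_ROTATE_DAYS=1
export LOG_COMPRESS=gzip        # compress rotated files
export LOG_RETENTION_DAYS=14
# then start the app, e.g. node server.js
```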

API reference

createLogger(options?)

const logger = createLogger({
  level: "info", // trace|debug|info|warn|error|fatal
  service: "my-app",
  version: "1.0.0",
  env: "production",

  // Redaction
  redaction: {
    enabled: true,
    paths: ["password"],
    patterns: [{ pattern: /secret/gi, replacement: "[SECRET]" }],
  },

  // Structured output
  structured: {
    type: "json", // json|ecs|datadog|splunk|logfmt
    transform: (record) => ({ ...record, extra: true }), // optional transform before stringifying
  },

  // Pretty printing
  pretty: {
    enabled: true,
    theme: {},
    formatter: (record) => String(record.msg),
  },

  // Sampling
  sampling: {
    rates: { trace: 0.1, debug: 0.5 },
    defaultRate: 1,
  },

  correlationId: () => crypto.randomUUID(),
  traceProvider: () => ({ traceId: "abc", spanId: "def" }),

  // Test helpers
  now: Date.now,
  random: Math.random,
  useAsyncContext: true,

  adapters: [{ name: "my-adapter", handle: (record) => {/* ... */} }],
  transports: [/* Transport instances */],
  plugins: [/* LoggerPlugin instances */],
});

Logger instance methods

  • logger.trace(message, [context], [error])

  • logger.debug(message, [context], [error])

  • logger.info(message, [context], [error])

  • logger.warn(message, [context], [error])

  • logger.error(message, [context], [error])

  • logger.fatal(message, [context], [error])

  • logger.with(context) — returns a lightweight BoundLogger that binds context for single-call convenience.

  • logger.child(bindings) — creates a child Logger that shares transports/config but merges new bindings into the logger state.

    • Child loggers share resources; do not close() child loggers directly — close the parent.
  • logger.logAt(level, msg, context?) — dynamic-level logging.

  • logger.ifTrace(fn), logger.ifDebug(fn), logger.ifInfo(fn) — conditional helpers that run fn only if that level is enabled. fn should return [msg, context?].

  • logger.time(label, context?) — returns a callable timer that logs the completed duration when called. The timer also exposes:

    • .end() — same as calling the timer
    • .elapsed() — returns elapsed milliseconds
    • .endWithContext(extraContext) — ends and logs with merged context
  • logger.setLevel(level) — validate and set a new minimum level

  • logger.getLevel() — read the current level

  • logger.isLevelEnabled(level) — true if logs at level would be emitted

  • await logger.flush() — flush plugins and transports

  • await logger.close() — flush, then close plugins and transports (parent only)

Advanced patterns

Testing

const logs = [];
const logger = createLogger({
  adapters: [
    {
      name: "test",
      handle: (record) => logs.push(record),
    },
  ],
});

expect(logs).toContainEqual(expect.objectContaining({ level: "error", msg: "payment failed" }));

Plugins

const auditPlugin = {
  name: "audit",
  order: 50,
  onInit(logger) { /* called once */ },
  onRecord(record) {
    // Return null to drop, return transformed record or undefined to keep
    if (record.context?.sensitive) return null;
    return record;
  },
  onFormat(record, formatted) { return formatted; },
  onWrite(record, formatted) { /* called after write */ },
  onFlush() { /* optional async */ },
  onClose() { /* optional async */ },
};

Adapters

const myAdapter = {
  name: "metrics",
  level: "info", // optional level threshold
  handle(record) {
    metricsClient.send(record);
  },
};

Troubleshooting

Logs not showing?

  • Confirm the logger level: make sure logger.getLevel() isn't higher than the level of the messages you expect to see.
  • Ensure transports and adapters are configured and not closed.
  • Call await logger.flush() before process exit.

Memory leaks?

  • Ensure you call await logger.close() on shutdown to close transports and plugins.
  • Avoid very large buffers in buffered transports; reduce bufferSize or flush interval if needed.
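A graceful-shutdown hook for the flush/close sequence can be sketched like this; it works against any logger exposing the documented flush()/close() methods (`makeShutdown` is a hypothetical helper, not part of cenglu):

```typescript
// Any logger with the documented flush()/close() shape.
type Closeable = { flush(): Promise<void>; close(): Promise<void> };

// Build a shutdown function: drain buffered records, then close
// transports and plugins. The caller decides when to exit.
function makeShutdown(logger: Closeable): () => Promise<void> {
  return async () => {
    await logger.flush(); // drain buffers first
    await logger.close(); // then release transports/plugins
  };
}

// Typical wiring (left commented; assumes a `logger` in scope):
// process.once("SIGTERM", () => makeShutdown(logger)().then(() => process.exit(0)));
```

Calling close() on the parent logger is enough; per the notes above, child loggers share the parent's resources and must not be closed directly.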

Logs too big?

  • Use redaction and plugins/adapters to truncate or transform large payloads before forwarding.

Production checklist

  • [ ] Enable redaction for PII
  • [ ] Configure batching/adapters for high-volume ingestion
  • [ ] Add correlation IDs for distributed traces
  • [ ] Test logger.flush() and logger.close() during graceful shutdown
  • [ ] Monitor memory if using buffered transports
  • [ ] Set appropriate sampling rates for verbose logs

Contributing

git clone https://github.com/yourusername/cenglu
cd cenglu
bun install
bun run test

License

MIT

Support

If this saves you time, consider:

  • ⭐ Starring the repo
  • 🐛 Reporting bugs
  • 💡 Suggesting features
  • 🍺 Buying me a beer

Built with ❤️ and ☕ by developers who were tired of bad loggers.