@emeryld/lllogger
v0.1.1
Typed logger fanout with optional caller metadata, redaction, and batching.
@emeryld/lllogs
Install

```sh
npm i @emeryld/lllogs
```
Usage
```ts
import { createDocsLogger } from '@emeryld/lllogs';

type Defs = {
  info: { msg: string };
  error: { msg: string; err?: unknown };
};

const log = createDocsLogger<Defs>({
  enabled: true,
  callerSkip: ['node_modules', '/dist/'],
  batching: { enabled: true, maxSize: 100, intervalMs: 200 },
  loggers: {
    info: {
      enabled: true,
      logger: (x) => console.log(x),
      withCaller: (x, caller) => ({ ...x, caller })
    },
    error: {
      enabled: true,
      logger: (x) => console.error(x)
    }
  }
});

log.info({ msg: 'hello' });
log.flush();
log.shutdown();
```

Sampling
Each logger can declare `sampleRate` as either a static number or a function `(log) => number`. The value is treated as a drop probability: 0 keeps every event, 1 drops them all, and intermediate values probabilistically drop that fraction of events before batching. The function form lets you recompute the rate per log, for example to disable sampling on production paths. Omitting `sampleRate` logs everything.
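Independently of the library, the drop decision described above can be modeled in a few lines. This is a sketch of the semantics only, not the package's internals; `shouldDrop` is a hypothetical helper name:

```ts
// Hypothetical model of the sampleRate semantics (not the library's code).
type SampleRate<T> = number | ((log: T) => number);

function shouldDrop<T>(
  log: T,
  rate?: SampleRate<T>,
  rand: () => number = Math.random
): boolean {
  if (rate === undefined) return false; // omitted => log everything
  const p = typeof rate === 'function' ? rate(log) : rate;
  return rand() < p; // p is a drop probability: 0 keeps all, 1 drops all
}

console.log(shouldDrop({ level: 'debug' }, 0)); // false: rate 0 keeps every event
console.log(shouldDrop({ level: 'debug' }, 1)); // true: rate 1 drops every event
```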
```ts
// drop most debug noise, but keep errors
{
  sampleRate: (log) => (log.level === 'debug' ? 0.9 : 0)
}
```

Notes on batching behavior
- Buffers are per logger key (e.g., `info` and `error` batch independently).
- Flush triggers:
  - buffer reaches `maxSize`
  - interval timer (`intervalMs`)
  - manual `flush()`
- `shutdown()` stops the timer and flushes remaining logs.
- If a flush throws while emitting, items are re-queued (best-effort) and `onFlushError` is called.
