@onezlinks/session-logger v1.0.1
@onezlinks/session-logger
📁 Session-based file logger with AsyncLocalStorage context propagation for Node.js microservices.
📑 Table of Contents
- Features
- Installation
- Quick Start
- How It Works
- API Reference
- Log Output
- Configuration
- Log Rotation & Cleanup
- Docker
- TypeScript Support
- Testing
- License
✨ Features
| Feature | Description |
| ---------------------------- | -------------------------------------------------------------- |
| 🔄 Hybrid API | Wrapper (withSession) + Manual (startSession/endSession) |
| 🧵 AsyncLocalStorage | Implicit context propagation — no parameter drilling |
| 📝 Dual Output | Simultaneous file + stdout/stderr logging |
| 📂 Category-based | Logs organized into {category}/YYYY-MM-DD/ subdirectories |
| 🧹 Log Rotation | cleanOldLogs() with configurable retention period |
| 💰 Structured Formatters | Cost, duration, and custom metric formatters |
| 🛡️ Graceful Fallback | Falls back to console.* when called outside a session |
| 📦 Zero Dependencies | Uses only Node.js built-in modules |
| 🔤 TypeScript Support | Full .d.ts type definitions included |
📦 Installation
```bash
# Published package (after npm publish)
npm install @onezlinks/session-logger
```

For local development (monorepo or pre-publish):

```bash
# Install from sibling directory
npm install ../session-logger

# Or use npm link
cd /path/to/session-logger && npm link
cd /path/to/your-service && npm link @onezlinks/session-logger
```

🚀 Quick Start
Wrapper Pattern (Recommended)
withSession creates a session context, runs your function, and automatically writes a summary footer and closes file streams when done.
```js
const { withSession, log, logCost } = require("@onezlinks/session-logger");

async function processOrder(order) {
  await withSession(
    {
      sessionId: order._id.toString(),
      category: "order-processing",
      metadata: { orderId: order._id, store: order.storeId },
    },
    async () => {
      log("Pipeline", "🚀 Processing started");
      // ... your business logic ...
      // All log/error/warn calls within this async scope
      // automatically write to the session log file.
      logCost("Billing", { totalUSD: 0.001, totalTHB: 0.035 }, 100, 50);
      log("Pipeline", "✅ Processing complete");
    },
  );
}
```

Manual Pattern
Use startSession / endSession when you need finer control over the session lifecycle (e.g., event handlers, iterative flows). You must call endSession() in a finally block — otherwise the file streams will leak.
```js
const {
  startSession,
  endSession,
  log,
  error,
} = require("@onezlinks/session-logger");

async function handleWebhook(data) {
  startSession({
    sessionId: data.id,
    category: "webhook",
  });
  try {
    log("Webhook", "Received payload");
    // ... processing ...
  } catch (err) {
    error("Webhook", `Failed: ${err.message}`);
    throw err;
  } finally {
    await endSession(); // ⚠️ MUST be in finally block
  }
}
```

🔍 How It Works
```
┌───────────────────────────────────────────────────────────┐
│ withSession(options, asyncFn)                             │
│  ┌─────────────────────────────────────────────────────┐  │
│  │ 1. Create date folder logs/{category}/YYYY-MM-DD    │  │
│  │ 2. Open write streams .log + .err.log               │  │
│  │ 3. Write session header                             │  │
│  │ 4. Store context in AsyncLocalStorage               │  │
│  │ 5. Execute asyncFn()                                │  │
│  │    ┌──────────────────────────────────────────┐     │  │
│  │    │ log()   → .log file + stdout             │     │  │
│  │    │ warn()  → .log file + stdout             │     │  │
│  │    │ error() → .err.log file + stderr         │     │  │
│  │    └──────────────────────────────────────────┘     │  │
│  │ 6. Write session summary (always "SUCCESS")         │  │
│  │ 7. Close file streams                               │  │
│  └─────────────────────────────────────────────────────┘  │
└───────────────────────────────────────────────────────────┘
```

Key behaviors:

- `log()` and `warn()` write to the `.log` file via the `stdout` stream of a `console.Console`; `error()` writes to a separate `.err.log` file via the `stderr` stream.
- When called outside a session context, all functions fall back to the global `console` object (`console.log`, `console.error`, `console.warn`).
- The session summary always records status `SUCCESS`. Error tracking is available through the `stats.errors` count.
📖 API Reference
Session Management
withSession(options, asyncFn) → Promise<T>
Run an async function within a session context. The session is automatically finalized in a finally block, ensuring streams are always closed — even if asyncFn throws.
| Parameter | Type | Required | Default | Description |
| ---------------------- | ------------------ | -------- | ---------------------- | -------------------------------------------- |
| options.sessionId | string | ✅ | — | Unique session identifier |
| options.category | string | ✅ | — | Log category — used as subdirectory name |
| options.metadata | object | ❌ | {} | Key-value pairs added to the session header |
| options.logDir | string | ❌ | CONFIG.LOG_BASE_DIR | Override base log directory |
| options.enableFile | boolean | ❌ | CONFIG.ENABLE_FILE | Enable/disable file logging |
| options.enableStdout | boolean | ❌ | CONFIG.ENABLE_STDOUT | Enable/disable stdout/stderr output |
| asyncFn | () => Promise<T> | ✅ | — | Async function to execute within the session |
Returns: The return value of asyncFn.
```js
const result = await withSession(
  { sessionId: "abc123", category: "ai-pipeline" },
  async () => {
    log("Pipeline", "Starting...");
    return { success: true };
  },
);
// result === { success: true }
```

startSession(options) → void
Start a session context using the manual pattern. Accepts the same options as withSession (without asyncFn).
> ⚠️ You must call `endSession()` in a `finally` block. Failing to do so will leave file streams open (resource leak).
endSession() → Promise<void>
End the current session — writes the summary footer and closes file streams. If called outside a session context, this is a no-op.
getSessionContext() → SessionContext | null
Returns the current session context object, or null if not inside a session. Useful for conditional logic based on session state.
```js
const ctx = getSessionContext();
if (ctx) {
  console.log(`Session ${ctx.sessionId} has ${ctx.stats.errors} errors`);
}
```

SessionContext properties:
| Property | Type | Description |
| ------------ | ---------------- | ---------------------------------------------------- |
| sessionId | string | The session identifier |
| category | string | The log category |
| metadata | object | Metadata from options |
| logPath | string \| null | Absolute path to the .log file |
| errLogPath | string \| null | Absolute path to the .err.log file |
| startTime | number | Date.now() when the session started |
| stats | object | { logs: number, errors: number, warnings: number } |
Logging Functions
All logging functions follow this signature:
```js
functionName(tag, message, ...args);
```

- `tag` — A short label for the source module/component (padded to 12 characters in output).
- `message` — The log message string.
- `...args` — Additional values (objects, arrays, etc.) appended after the message.
Output behavior:
| Context | enableStdout | File output | Console output |
| --------------- | -------------- | ----------- | ---------------------------- |
| Inside session | true | ✅ Writes | ✅ Writes |
| Inside session | false | ✅ Writes | ❌ Suppressed |
| Outside session | — | ❌ No file | ✅ Falls back to console.* |
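Assuming the 12-character tag padding and `HH:mm:ss.SSS` timestamp described in this section, a line formatter could look like the following (hypothetical helper, not the library's internals):

```javascript
// Sketch: pad the tag to a fixed 12-character column and prefix a
// millisecond-precision local timestamp.
function formatLine(tag, message) {
  const d = new Date();
  const pad = (n, w = 2) => String(n).padStart(w, "0");
  const ts = `${pad(d.getHours())}:${pad(d.getMinutes())}:${pad(
    d.getSeconds(),
  )}.${pad(d.getMilliseconds(), 3)}`;
  return `[${ts}] [${tag.padEnd(12)}] ${message}`;
}

const line = formatLine("Pipeline", "started");
// e.g. "[12:00:00.001] [Pipeline    ] started"
```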
Output format:
```
[HH:mm:ss.SSS] [Tag         ] message
```

log(tag, message, ...args)
Log an info-level message. Writes to the .log file and console.log.
```js
log("Pipeline", "🚀 Processing started");
log("OCR", "Result:", { confidence: 0.95 });
```

info(tag, message, ...args)
Alias for log(). Use whichever reads better in your code.
warn(tag, message, ...args)
Log a warning message. Writes to the .log file (same as log) and console.warn. Increments stats.warnings.
error(tag, message, ...args)
Log an error message. Writes to a separate .err.log file (not the main .log) and console.error. Increments stats.errors.
Formatters
Convenience functions that format data into a consistent string and log it via log().
logCost(tag, costData, inputUnits?, outputUnits?, cachedUnits?)
Log cost/billing data.
| Parameter | Type | Default | Description |
| ------------- | -------- | ------- | ---------------------------------------- |
| tag | string | — | Module/component tag |
| costData | object | — | { totalUSD: number, totalTHB: number } |
| inputUnits | number | 0 | Input units (tokens, API calls) |
| outputUnits | number | 0 | Output units |
| cachedUnits | number | 0 | Cached units |
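Given these parameters, the formatted string could be produced along these lines (an assumed sketch of the format shown in this section, not the library's code):

```javascript
// Sketch: USD to 6 decimals, THB to 4, units as "input+output", with the
// cached count appended only when non-zero.
function formatCost(costData, inputUnits = 0, outputUnits = 0, cachedUnits = 0) {
  const cached = cachedUnits ? ` (cached: ${cachedUnits})` : "";
  return (
    `💰 Cost: $${costData.totalUSD.toFixed(6)} ` +
    `(~฿${costData.totalTHB.toFixed(4)}) | ` +
    `Units: ${inputUnits}+${outputUnits}${cached}`
  );
}

const a = formatCost({ totalUSD: 0.001, totalTHB: 0.035 }, 100, 50);
const b = formatCost({ totalUSD: 0.002, totalTHB: 0.07 }, 200, 80, 50);
```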
```js
logCost("Quality", { totalUSD: 0.001, totalTHB: 0.035 }, 100, 50);
// → [12:00:00.500] [Quality     ] 💰 Cost: $0.001000 (~฿0.0350) | Units: 100+50

logCost("Quality", { totalUSD: 0.002, totalTHB: 0.07 }, 200, 80, 50);
// → [12:00:00.500] [Quality     ] 💰 Cost: $0.002000 (~฿0.0700) | Units: 200+80 (cached: 50)
```

logDuration(tag, durationMs, label?)
Log a duration metric.
| Parameter | Type | Default | Description |
| ------------ | -------- | ------------ | ------------------------ |
| tag | string | — | Module/component tag |
| durationMs | number | — | Duration in milliseconds |
| label | string | 'Duration' | Metric label |
```js
logDuration("Pipeline", 500);
// → [12:00:01.200] [Pipeline    ] ⏱️ Duration: 500ms

logDuration("Pipeline", 2500);
// → [12:00:01.200] [Pipeline    ] ⏱️ Duration: 2500ms (2.50s)

logDuration("OCR", 3200, "OCR Time");
// → [12:00:01.200] [OCR         ] ⏱️ OCR Time: 3200ms (3.20s)
```

Note: The seconds conversion `(X.XXs)` only appears when the duration is ≥ 1000 ms.
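The ≥ 1000 ms threshold rule can be sketched as follows (assumed formatting logic matching the examples above, not the library's source):

```javascript
// Sketch: append a two-decimal seconds conversion only for durations of
// one second or more.
function formatDuration(ms, label = "Duration") {
  const secs = ms >= 1000 ? ` (${(ms / 1000).toFixed(2)}s)` : "";
  return `⏱️ ${label}: ${ms}ms${secs}`;
}

const short = formatDuration(500);
const long = formatDuration(2500);
const labeled = formatDuration(3200, "OCR Time");
```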
logMetric(tag, metricName, value, unit?)
Log a custom metric.
```js
logMetric("OCR", "Confidence", 95.5, "%");
// → [12:00:01.200] [OCR         ] Confidence: 95.5 %

logMetric("Pipeline", "Items Processed", 42);
// → [12:00:01.200] [Pipeline    ] Items Processed: 42
```

Utilities
cleanOldLogs(category?, daysToKeep?) → Promise<CleanupResult>
Remove date-based log folders older than the retention period. Scans {LOG_BASE_DIR}/{category}/YYYY-MM-DD/ and removes folders where the date is older than daysToKeep.
| Parameter | Type | Default | Description |
| ------------ | -------- | --------------------------- | -------------------------- |
| category | string | all categories | Specific category to clean |
| daysToKeep | number | CONFIG.LOG_RETENTION_DAYS | Number of days to retain |
Returns: { deleted: string[], errors: string[] }
```js
const { cleanOldLogs } = require("@onezlinks/session-logger");

// Clean all categories using default retention (30 days)
const result = await cleanOldLogs();
console.log(`Deleted ${result.deleted.length} folders`);
// deleted: ['ai-pipeline/2026-01-01', 'webhook/2026-01-02', ...]

// Clean a specific category, keep only the last 7 days
const recent = await cleanOldLogs("ai-pipeline", 7);
```

If the base log directory does not exist, the function returns `{ deleted: [], errors: [] }` without throwing.
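The retention check on `YYYY-MM-DD` folder names can be sketched like this (a hypothetical helper illustrating the cutoff arithmetic, not the library's real logic):

```javascript
// Sketch: a folder is expired when its date is more than daysToKeep days
// before "now".
function isExpired(folderName, daysToKeep, now = new Date()) {
  const folderDate = new Date(`${folderName}T00:00:00Z`);
  const ageMs = now - folderDate;
  return ageMs > daysToKeep * 24 * 60 * 60 * 1000;
}

const now = new Date("2026-02-07T00:00:00Z");
const old37days = isExpired("2026-01-01", 30, now); // 37 days old → true
const fresh6days = isExpired("2026-02-01", 30, now); // 6 days old → false
```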
Constants
CONFIG
Configuration object with lazy evaluation — environment variables are read at access time (not at import time), making them testable and overridable at runtime.
| Property | Env Variable | Default | Description |
| -------------------- | ---------------------------- | -------- | ------------------------------------------------ |
| LOG_BASE_DIR | SESSION_LOG_DIR | 'logs' | Base directory for log files (relative to cwd) |
| LOG_RETENTION_DAYS | SESSION_LOG_RETENTION_DAYS | 30 | Days to retain logs before cleanup |
| ENABLE_FILE | SESSION_LOG_TO_FILE | true | Set to 'false' to disable file logging |
| ENABLE_STDOUT | SESSION_LOG_TO_STDOUT | true | Set to 'false' to disable stdout/stderr output |
> Note: `ENABLE_FILE` and `ENABLE_STDOUT` are only disabled when the env var is exactly the string `'false'`. Any other value (including unset) keeps them enabled.
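The lazy-evaluation behavior can be sketched with property getters (an assumed shape, not the library's exact source): each access re-reads `process.env`, so changes made at runtime or in tests take effect immediately.

```javascript
// Sketch: getters read process.env on every access rather than caching a
// value at import time.
const CONFIG = {
  get LOG_BASE_DIR() {
    return process.env.SESSION_LOG_DIR || "logs";
  },
  get ENABLE_FILE() {
    // Only the exact string 'false' disables the feature.
    return process.env.SESSION_LOG_TO_FILE !== "false";
  },
};

process.env.SESSION_LOG_DIR = "/tmp/app-logs";
const dir = CONFIG.LOG_BASE_DIR; // picked up at access time

process.env.SESSION_LOG_TO_FILE = "no"; // anything but 'false' keeps it on
const fileOn = CONFIG.ENABLE_FILE; // still true
```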
COMMON_TAGS
Predefined tag string constants for consistency across services. These are suggestions — you can use any string as a tag.
| Group | Tags |
| --------------- | ---------------------------------------------------- |
| AI Pipeline | PIPELINE, QUALITY, OCR, OPENAI, CLIENT |
| Rewards | REDEMPTION, POINTS, VALIDATION, NOTIFICATION |
| Survey | SURVEY, STORAGE, ANALYTICS |
| Order | ORDER, PAYMENT, INVENTORY, SHIPPING |
| Messaging | MESSAGE, WEBHOOK, RESPONSE, BROADCAST |
| Generic | DATABASE, CACHE, WORKER |
```js
const { COMMON_TAGS, log } = require("@onezlinks/session-logger");

log(COMMON_TAGS.PIPELINE, "Processing started"); // tag = 'Pipeline'
log(COMMON_TAGS.OCR, "Recognition complete");    // tag = 'OCR'
log(COMMON_TAGS.DATABASE, "Query executed");     // tag = 'Database'
```

📂 Log Output
Directory Structure
```
logs/                          ← CONFIG.LOG_BASE_DIR
├── ai-pipeline/               ← category
│   └── 2026-02-07/            ← date folder (YYYY-MM-DD)
│       ├── 6789abc_20260207T120000_001.log
│       └── 6789abc_20260207T120000_001.err.log
└── order-processing/
    └── 2026-02-07/
        ├── def5678_20260207T130000_042.log
        └── def5678_20260207T130000_042.err.log
```

File Naming Convention

Each session creates a unique filename:

```
{sessionId7}_{timestamp}_{random}.log
```

| Component | Description | Example |
| ------------ | ---------------------------------------------- | ----------------- |
| sessionId7 | First 7 characters of sessionId | 6789abc |
| timestamp | ISO 8601 compact format (no dashes, no colons) | 20260207T120000 |
| random | 3-digit random number (avoids collisions) | 001 |
This produces filenames like 6789abc_20260207T120000_001.log and a matching .err.log.
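A generator following this convention could look like the sketch below (hypothetical — the library's actual generator may differ, e.g. in how it seeds the random suffix):

```javascript
// Sketch: first 7 chars of the session ID, a compact ISO timestamp with
// dashes/colons stripped, and a zero-padded 3-digit random suffix.
function makeLogFilename(sessionId, date = new Date()) {
  const id7 = sessionId.slice(0, 7);
  const ts = date.toISOString().slice(0, 19).replace(/[-:]/g, "");
  const rand = String(Math.floor(Math.random() * 1000)).padStart(3, "0");
  return `${id7}_${ts}_${rand}.log`;
}

const name = makeLogFilename("6789abcdef", new Date("2026-02-07T12:00:00Z"));
// e.g. "6789abc_20260207T120000_042.log"
```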
Log File Content
```
═══════════════════════════════════════════════════════════════
                    Ai Pipeline Session
═══════════════════════════════════════════════════════════════
Session ID : 6789abc...
Category   : ai-pipeline
Order Id   : order-001
Started    : 2026-02-07T12:00:00.000Z
═══════════════════════════════════════════════════════════════
[12:00:00.001] [Pipeline    ] 🚀 Processing started
[12:00:00.500] [Quality     ] 💰 Cost: $0.001000 (~฿0.0350) | Units: 100+50
[12:00:01.200] [Pipeline    ] ✅ Processing complete
═══════════════════════════════════════════════════════════════
                    Session Summary
═══════════════════════════════════════════════════════════════
Status     : SUCCESS
Duration   : 1200ms (1.20s)
Log Stats  : 3 info, 0 errors, 0 warnings
═══════════════════════════════════════════════════════════════
```

Notes:

- The session title is derived from the `category` by converting `kebab-case` to `Title Case` (e.g., `ai-pipeline` → `Ai Pipeline Session`).
- Metadata keys are formatted from `camelCase` to `Title Case` (e.g., `orderId` → `Order Id`).
- The summary status is always `SUCCESS`. To detect errors, check the `stats.errors` count via `getSessionContext()` or inspect the `.err.log` file.
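The two header-formatting rules above can be sketched as small string transforms (hypothetical helpers, not the library's code):

```javascript
// Sketch: "ai-pipeline" → "Ai Pipeline" (kebab-case → Title Case).
const kebabToTitle = (s) =>
  s
    .split("-")
    .map((w) => w[0].toUpperCase() + w.slice(1))
    .join(" ");

// Sketch: "orderId" → "Order Id" (camelCase → Title Case).
const camelToTitle = (s) =>
  s
    .replace(/([A-Z])/g, " $1")
    .replace(/^./, (c) => c.toUpperCase())
    .trim();

const title = kebabToTitle("ai-pipeline"); // "Ai Pipeline"
const key = camelToTitle("orderId");       // "Order Id"
```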
⚙️ Configuration
All configuration is through environment variables. No config files needed.
```bash
# Change log directory (default: 'logs')
SESSION_LOG_DIR=./output/logs

# Change retention period (default: 30 days)
SESSION_LOG_RETENTION_DAYS=14

# Disable file logging (only stdout)
SESSION_LOG_TO_FILE=false

# Disable stdout (only file logging — useful in production)
SESSION_LOG_TO_STDOUT=false
```

You can also override per-session:
```js
await withSession(
  {
    sessionId: "test-123",
    category: "debug",
    logDir: "./debug-logs", // Override LOG_BASE_DIR for this session
    enableFile: false, // Skip file logging for this session
    enableStdout: true, // Still print to console
  },
  async () => {
    log("Debug", "This only prints to console, no file created");
  },
);
```

🧹 Log Rotation & Cleanup
The library provides a cleanOldLogs() function that removes date folders older than the retention period. Scheduling is your responsibility — the library intentionally does not include a built-in scheduler to remain zero-dependency.
Startup Cleanup (Simplest)
```js
const { cleanOldLogs } = require("@onezlinks/session-logger");

// Run once at application startup
cleanOldLogs().then(({ deleted, errors }) => {
  if (deleted.length)
    console.log(`[Cleanup] Removed ${deleted.length} old log folders`);
  if (errors.length) console.error("[Cleanup] Errors:", errors);
});
```

Scheduled Cleanup with node-cron
```bash
npm install node-cron
```

```js
const cron = require("node-cron");
const { cleanOldLogs } = require("@onezlinks/session-logger");

// Run daily at 3:00 AM
cron.schedule("0 3 * * *", async () => {
  const { deleted, errors } = await cleanOldLogs();
  if (deleted.length)
    console.log(`[Cron] Deleted ${deleted.length} old log folders`);
  if (errors.length) console.error("[Cron] Cleanup errors:", errors);
});
```

Scheduled Cleanup with setInterval (No Extra Dependencies)
```js
const { cleanOldLogs } = require("@onezlinks/session-logger");

// Run every 24 hours
setInterval(
  async () => {
    const { deleted } = await cleanOldLogs();
    if (deleted.length)
      console.log(`[Cleanup] Deleted ${deleted.length} folders`);
  },
  24 * 60 * 60 * 1000,
);
```

Cluster-Safe Pattern (PM2 / Docker Swarm)
When running multiple instances, schedule cleanup on one instance only:
```js
const cluster = require("cluster");
const cron = require("node-cron");
const { cleanOldLogs } = require("@onezlinks/session-logger");

if (cluster.isPrimary || process.env.NODE_APP_INSTANCE === "0") {
  cron.schedule("0 3 * * *", () => cleanOldLogs());
}
```

Cron Schedule Reference
| Pattern | Description |
| ------------- | ----------------------------- |
| 0 3 * * * | Every day at 3:00 AM |
| 0 */6 * * * | Every 6 hours |
| 0 0 * * 0 | Every Sunday at midnight |
| 0 2 1 * * | First day of month at 2:00 AM |
🐳 Docker
Add a volume mount to persist logs outside the container:
```yaml
services:
  your-service:
    volumes:
      - ./logs:/app/logs
    environment:
      - SESSION_LOG_DIR=logs # relative to /app (working dir)
      - SESSION_LOG_RETENTION_DAYS=14
```

> Ensure the `SESSION_LOG_DIR` path matches the volume mount target. The default value `logs` works with `/app/logs` when the working directory is `/app`.
🔤 TypeScript Support
Full type definitions are included at types/index.d.ts. No additional @types/ package is needed.
```ts
import {
  withSession,
  log,
  error,
  logCost,
  cleanOldLogs,
  CONFIG,
  COMMON_TAGS,
  type SessionOptions,
  type SessionContext,
  type CostData,
  type CleanupResult,
} from "@onezlinks/session-logger";
```

🧪 Testing
```bash
npm test              # Run all tests with coverage
npm run test:watch    # Watch mode for development
npm run lint          # ESLint check
npm run lint:fix      # ESLint auto-fix
```

📄 License
MIT © Onez (onezlinks)
