@bernierllc/logger

Production-ready structured logging with multiple transports, log levels, correlation tracking, and performance monitoring for the BernierLLC ecosystem.

Features

  • Structured Logging: JSON-based structured log output with metadata
  • Multiple Transports: Console, file, HTTP, database, and custom transports
  • Log Levels: Configurable log levels with filtering (ERROR, WARN, INFO, HTTP, VERBOSE, DEBUG, SILLY)
  • Correlation Tracking: Request/trace ID correlation across async operations
  • Performance Tracking: Duration tracking, timers, and profiling
  • Data Sanitization: Automatic PII protection and sensitive data masking
  • Context Management: Rich context and metadata support
  • Child Loggers: Contextual logger inheritance
  • Graceful Error Handling: Transport failure recovery and error callbacks

Installation

npm install @bernierllc/logger

Quick Start

import { Logger, LogLevel, ConsoleTransport, FileTransport } from '@bernierllc/logger';

const logger = new Logger({
  level: LogLevel.INFO,
  context: {
    service: 'user-service',
    version: '1.0.0',
    environment: process.env.NODE_ENV || 'development'
  },
  transports: [
    new ConsoleTransport({ level: LogLevel.DEBUG }),
    new FileTransport({ 
      filename: 'app.log',
      level: LogLevel.INFO 
    })
  ],
  sanitize: true,
  enableCorrelation: true
});

logger.info('Application started');
logger.error('Database connection failed', new Error('Connection timeout'));

Usage Examples

Basic Logging

import { createLogger } from '@bernierllc/logger';

const logger = createLogger({
  context: {
    service: 'my-service',
    version: '1.0.0',
    environment: 'production'
  }
});

logger.info('User logged in', { userId: '123', email: 'user@example.com' });
logger.warn('High memory usage', { memoryUsage: '85%' });
logger.error('Payment failed', new Error('Invalid card'));

Performance Tracking

// Using timers
const timer = logger.startTimer('database-query');
await performDatabaseQuery();
timer.end('Query completed successfully');

// Using time wrapper
const result = await logger.time('cache-operation', async () => {
  return await cacheData(key, value);
});

// Using profiling
logger.profile('data-processing');
await processLargeDataset();
logger.profile('data-processing'); // Logs duration

Correlation Tracking

import { Correlation, CorrelationManager } from '@bernierllc/logger';

// Express middleware
app.use((req, res, next) => {
  const manager = CorrelationManager.getInstance();
  const context = manager.fromHeaders(req.headers);
  
  Correlation.run(context, () => {
    req.logger = logger.child({
      requestId: context.correlationId,
      method: req.method,
      url: req.url
    });
    next();
  });
});

// In route handlers
app.get('/users/:id', async (req, res) => {
  req.logger.info('Fetching user', { userId: req.params.id });
  // All logs will include correlation ID automatically
});

Child Loggers

const parentLogger = logger.child({
  component: 'auth-service'
});

const childLogger = parentLogger.child({
  subcomponent: 'token-validator'
});

childLogger.info('Token validated'); 
// Includes both component and subcomponent in context

Data Sanitization

const logger = new Logger({
  sanitize: true,
  sanitizeFields: ['password', 'token', 'creditCard', 'ssn'],
  transports: [new ConsoleTransport()]
});

logger.info('User data', {
  username: 'user@example.com',
  password: 'secret123',    // Will be [REDACTED]
  age: 30                   // Will be preserved
});

Custom Transports

import { LogTransport, LogEntry, LogLevel } from '@bernierllc/logger';

class SlackTransport implements LogTransport {
  name = 'slack';
  level = LogLevel.ERROR;

  async write(entry: LogEntry): Promise<void> {
    if (entry.level <= LogLevel.ERROR) {
      await this.sendToSlack(entry.message, entry.metadata);
    }
  }

  private async sendToSlack(message: string, metadata: any) {
    // Send to Slack webhook
  }
}

const logger = new Logger({
  transports: [
    new ConsoleTransport(),
    new SlackTransport()
  ]
});

Multiple Transports

const logger = new Logger({
  level: LogLevel.INFO,
  transports: [
    new ConsoleTransport({
      level: LogLevel.DEBUG,
      formatter: new TextFormatter()
    }),
    new FileTransport({
      filename: 'app.log',
      level: LogLevel.INFO,
      maxSize: 10 * 1024 * 1024, // 10MB
      maxFiles: 5
    }),
    new HTTPTransport({
      url: 'https://logs.example.com/api',
      level: LogLevel.WARN,
      headers: { 'Authorization': 'Bearer token' },
      batchSize: 100
    })
  ]
});

API Reference

Logger

Constructor

new Logger(options: LoggerOptions)

Methods

  • info(message: string, metadata?: object) - Log info message
  • warn(message: string, metadata?: object) - Log warning message
  • error(message: string, error?: Error, metadata?: object) - Log error message
  • debug(message: string, metadata?: object) - Log debug message
  • verbose(message: string, metadata?: object) - Log verbose message
  • silly(message: string, metadata?: object) - Log silly message
  • startTimer(label: string): LogTimer - Start performance timer
  • time<T>(label: string, fn: () => Promise<T>): Promise<T> - Time an async operation
  • profile(label: string) - Profile code section
  • child(context: Partial<LogContext>): Logger - Create child logger
  • setCorrelationId(id: string) - Set correlation ID
  • setUserId(id: string) - Set user ID
  • flush(): Promise<void> - Flush all transports
  • close(): Promise<void> - Close logger and transports
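
A brief sketch exercising a few of these methods, reusing the logger from the Quick Start example (the awaits assume an async context; the identifiers are illustrative):

logger.setCorrelationId('req-42');
logger.error('Payment failed', new Error('Invalid card'), { orderId: 'o-1' });

const billingLogger = logger.child({ component: 'billing' });
billingLogger.debug('Retrying charge', { attempt: 2 });

// On shutdown, flush pending writes and close all transports
await logger.flush();
await logger.close();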

Log Levels

enum LogLevel {
  ERROR = 0,   // Highest priority
  WARN = 1,
  INFO = 2,
  HTTP = 3,
  VERBOSE = 4,
  DEBUG = 5,
  SILLY = 6    // Lowest priority
}
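
Lower numeric values are higher priority, and a logger's level acts as a threshold. A minimal sketch of the expected filtering, assuming entries at or above the configured severity are emitted (consistent with the level comparison in the SlackTransport example above):

const levelLogger = new Logger({
  level: LogLevel.WARN,
  transports: [new ConsoleTransport()]
});

levelLogger.error('Disk full');        // emitted: ERROR (0) <= WARN (1)
levelLogger.warn('Cache nearly full'); // emitted: WARN (1) <= WARN (1)
levelLogger.info('Heartbeat');         // filtered: INFO (2) > WARN (1)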

Transports

ConsoleTransport

Logs to the console, using the console method that matches the entry's level (error, warn, log, debug).

FileTransport

Logs to files with rotation support.

HTTPTransport

Sends logs to HTTP endpoints with batching and retry logic.

DatabaseTransport

Stores logs in databases with connection pooling.

Formatters

JSONFormatter

Outputs structured JSON logs.

TextFormatter

Outputs human-readable text logs.

StructuredFormatter

Outputs flattened structured logs with prefixed fields.
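
Formatters are attached per transport, as in the Multiple Transports example above. A minimal sketch, assuming JSONFormatter is exported alongside TextFormatter and that FileTransport accepts a formatter option the same way ConsoleTransport does:

import { Logger, LogLevel, ConsoleTransport, FileTransport, JSONFormatter, TextFormatter } from '@bernierllc/logger';

const logger = new Logger({
  level: LogLevel.INFO,
  transports: [
    // Human-readable text for local development
    new ConsoleTransport({ formatter: new TextFormatter() }),
    // Structured JSON for log aggregation (formatter option assumed)
    new FileTransport({ filename: 'app.log', formatter: new JSONFormatter() })
  ]
});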

Runtime Configuration

The logger supports runtime configuration through config files, allowing you to customize logging behavior without modifying code.

Quick Setup

  1. Initialize configuration:

    npm run config:init
  2. Edit the generated logger.config.js:

    module.exports = {
      level: 'info',
      context: {
        service: 'my-service',
        version: '1.0.0'
      },
      console: {
        enabled: true,
        level: 'debug'
      }
    };
  3. Use in your code:

    import { createLoggerFromConfig } from '@bernierllc/logger';
       
    const logger = await createLoggerFromConfig();
    logger.info('Application started');

Configuration Discovery

The logger automatically discovers configuration files in this order:

  1. logger.config.js (JavaScript - recommended)
  2. logger.config.cjs (CommonJS)
  3. logger.config.mjs (ES modules)
  4. logger.config.json (JSON)
  5. A "logger" key in package.json (see the example below)
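
The package.json option is assumed to mirror the shape of logger.config.js, for example:

{
  "name": "my-service",
  "version": "1.0.0",
  "logger": {
    "level": "info",
    "context": {
      "service": "my-service",
      "version": "1.0.0"
    }
  }
}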

CLI Commands

# Initialize configuration
npm run config:init

# Validate configuration
npm run config:validate

# Print resolved configuration
npm run config:print

Environment Variables

All configuration options can be overridden with LOGGER_* environment variables:

LOGGER_LEVEL=debug
LOGGER_SERVICE=my-service
LOGGER_CONSOLE_LEVEL=info
LOGGER_FILE_ENABLED=true
LOGGER_FILE_FILENAME=logs/app.log
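
Because these are ordinary environment variables, they can also be set inline when starting the application (the entry point below is just an example):

LOGGER_LEVEL=debug LOGGER_CONSOLE_LEVEL=info node dist/server.js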

Complete Documentation

For detailed configuration options, examples, and a migration guide, see the Logger Runtime Configuration documentation.

Configuration

LoggerOptions

interface LoggerOptions {
  level: LogLevel;                    // Minimum log level
  transports: LogTransport[];         // Array of transports
  context?: Partial<LogContext>;      // Default context
  enableCorrelation?: boolean;        // Enable correlation tracking
  sanitize?: boolean;                 // Enable data sanitization
  sanitizeFields?: string[];          // Fields to sanitize
  formatter?: LogFormatter;           // Default formatter
  onError?: (error: Error, transport: string) => void; // Error handler
}

LogContext

interface LogContext {
  service: string;        // Service name
  version: string;        // Service version
  environment: string;    // Environment (dev, staging, prod)
  hostname?: string;      // Server hostname
  pid?: number;           // Process ID
  thread?: string;        // Thread identifier
  [key: string]: any;     // Additional context fields
}

Error Handling

The logger includes comprehensive error handling:

const logger = new Logger({
  transports: [transport1, transport2],
  onError: (error, transportName) => {
    console.error(`Transport ${transportName} failed:`, error);
    // Handle transport failures (e.g., fallback logging)
  }
});

Transport failures are isolated and don't affect other transports or application execution.

Performance

The logger is optimized for production use:

  • Minimal logging overhead (<1ms per log entry)
  • Asynchronous transport writing
  • Efficient message and metadata size limiting
  • Smart batching for network transports
  • Memory-conscious correlation tracking

Best Practices

  1. Use appropriate log levels - Reserve ERROR for actual errors
  2. Include context - Add relevant metadata to log entries
  3. Use correlation IDs - Track requests across services
  4. Sanitize sensitive data - Prevent PII leakage
  5. Configure transports appropriately - Different levels for different outputs
  6. Use child loggers - Create context-specific loggers
  7. Handle transport failures - Implement error callbacks
  8. Monitor performance - Use built-in timing functions

Integration Status

Logger Integration

  • Status: N/A - This is the core logging package
  • Usage: Other BernierLLC packages integrate with this logger for consistent logging across the ecosystem

Docs-Suite Integration

  • Status: Ready - Full TypeDoc documentation export support
  • Format: TypeDoc with comprehensive API documentation
  • Features: Automatic documentation generation, code examples, and type definitions

NeverHub Integration

  • Status: Integrated - Full service discovery and event bus support
  • Implementation:
    • Auto-detects NeverHub availability at runtime
    • Graceful degradation when NeverHub is not available
    • Enhanced transport capabilities when NeverHub is present
    • Service registration with logging capabilities
    • Event-driven log aggregation and forwarding
  • Capabilities:
    • Dynamic configuration from NeverHub config service
    • Enhanced observability with distributed tracing
    • Automatic service health reporting via logs
    • Cross-service log correlation

Development

# Install dependencies
npm install

# Build the package
npm run build

# Run tests
npm test

# Run tests with coverage
npm run test:coverage

# Lint code
npm run lint

License

Copyright (c) 2025 Bernier LLC. All rights reserved.

Related Packages