@bernierllc/logger
Production-ready structured logging with multiple transports, log levels, correlation tracking, and performance monitoring for the BernierLLC ecosystem.
Features
- Structured Logging: JSON-based structured log output with metadata
- Multiple Transports: Console, file, HTTP, database, and custom transports
- Log Levels: Configurable log levels with filtering (ERROR, WARN, INFO, HTTP, VERBOSE, DEBUG, SILLY)
- Correlation Tracking: Request/trace ID correlation across async operations
- Performance Tracking: Duration tracking, timers, and profiling
- Data Sanitization: Automatic PII protection and sensitive data masking
- Context Management: Rich context and metadata support
- Child Loggers: Contextual logger inheritance
- Graceful Error Handling: Transport failure recovery and error callbacks
Installation
npm install @bernierllc/logger
Quick Start
import { Logger, LogLevel, ConsoleTransport, FileTransport } from '@bernierllc/logger';
const logger = new Logger({
level: LogLevel.INFO,
context: {
service: 'user-service',
version: '1.0.0',
environment: process.env.NODE_ENV || 'development'
},
transports: [
new ConsoleTransport({ level: LogLevel.DEBUG }),
new FileTransport({
filename: 'app.log',
level: LogLevel.INFO
})
],
sanitize: true,
enableCorrelation: true
});
logger.info('Application started');
logger.error('Database connection failed', new Error('Connection timeout'));
Usage Examples
Basic Logging
import { createLogger } from '@bernierllc/logger';
const logger = createLogger({
context: {
service: 'my-service',
version: '1.0.0',
environment: 'production'
}
});
logger.info('User logged in', { userId: '123', email: '[email protected]' });
logger.warn('High memory usage', { memoryUsage: '85%' });
logger.error('Payment failed', new Error('Invalid card'));
Performance Tracking
// Using timers
const timer = logger.startTimer('database-query');
await performDatabaseQuery();
timer.end('Query completed successfully');
// Using time wrapper
const result = await logger.time('cache-operation', async () => {
return await cacheData(key, value);
});
// Using profiling
logger.profile('data-processing');
await processLargeDataset();
logger.profile('data-processing'); // Logs duration
Correlation Tracking
import { Correlation, CorrelationManager } from '@bernierllc/logger';
// Express middleware
app.use((req, res, next) => {
const manager = CorrelationManager.getInstance();
const context = manager.fromHeaders(req.headers);
Correlation.run(context, () => {
req.logger = logger.child({
requestId: context.correlationId,
method: req.method,
url: req.url
});
next();
});
});
// In route handlers
app.get('/users/:id', async (req, res) => {
req.logger.info('Fetching user', { userId: req.params.id });
// All logs will include correlation ID automatically
});
Child Loggers
const parentLogger = logger.child({
component: 'auth-service'
});
const childLogger = parentLogger.child({
subcomponent: 'token-validator'
});
childLogger.info('Token validated');
// Includes both component and subcomponent in context
Data Sanitization
const logger = new Logger({
sanitize: true,
sanitizeFields: ['password', 'token', 'creditCard', 'ssn'],
transports: [new ConsoleTransport()]
});
logger.info('User data', {
username: '[email protected]',
password: 'secret123', // Will be [REDACTED]
age: 30 // Will be preserved
});
Custom Transports
class SlackTransport implements LogTransport {
name = 'slack';
level = LogLevel.ERROR;
async write(entry: LogEntry): Promise<void> {
if (entry.level <= LogLevel.ERROR) {
await this.sendToSlack(entry.message, entry.metadata);
}
}
private async sendToSlack(message: string, metadata: any) {
// Send to Slack webhook
}
}
const logger = new Logger({
transports: [
new ConsoleTransport(),
new SlackTransport()
]
});
Multiple Transports
const logger = new Logger({
level: LogLevel.INFO,
transports: [
new ConsoleTransport({
level: LogLevel.DEBUG,
formatter: new TextFormatter()
}),
new FileTransport({
filename: 'app.log',
level: LogLevel.INFO,
maxSize: 10 * 1024 * 1024, // 10MB
maxFiles: 5
}),
new HTTPTransport({
url: 'https://logs.example.com/api',
level: LogLevel.WARN,
headers: { 'Authorization': 'Bearer token' },
batchSize: 100
})
]
});
API Reference
Logger
Constructor
new Logger(options: LoggerOptions)
Methods
- info(message: string, metadata?: object) - Log info message
- warn(message: string, metadata?: object) - Log warning message
- error(message: string, error?: Error, metadata?: object) - Log error message
- debug(message: string, metadata?: object) - Log debug message
- verbose(message: string, metadata?: object) - Log verbose message
- silly(message: string, metadata?: object) - Log silly message
- startTimer(label: string): LogTimer - Start performance timer
- time(label: string, fn: () => Promise<T>): Promise<T> - Time async operation
- profile(label: string) - Profile code section
- child(context: Partial<LogContext>): Logger - Create child logger
- setCorrelationId(id: string) - Set correlation ID
- setUserId(id: string) - Set user ID
- flush(): Promise<void> - Flush all transports
- close(): Promise<void> - Close logger and transports
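The correlation and user-ID setters are not used in the examples above. A minimal sketch, assuming the transport setup from Quick Start and using hypothetical identifiers:
import { Logger, LogLevel, ConsoleTransport } from '@bernierllc/logger';

const logger = new Logger({
  level: LogLevel.INFO,
  transports: [new ConsoleTransport()]
});

// Hypothetical identifiers for illustration; real values would come from the
// incoming request and the authenticated session.
logger.setCorrelationId('req-42');
logger.setUserId('user-123');

// Subsequent entries carry both identifiers in their context.
logger.info('Order placed', { orderId: 'A-1001' });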
Log Levels
enum LogLevel {
ERROR = 0, // Highest priority
WARN = 1,
INFO = 2,
HTTP = 3,
VERBOSE = 4,
DEBUG = 5,
SILLY = 6 // Lowest priority
}
Transports
ConsoleTransport
Logs to the console using the matching console method (error, warn, log, debug).
FileTransport
Logs to files with rotation support.
HTTPTransport
Sends logs to HTTP endpoints with batching and retry logic.
DatabaseTransport
Stores logs in databases with connection pooling.
Formatters
JSONFormatter
Outputs structured JSON logs.
TextFormatter
Outputs human-readable text logs.
StructuredFormatter
Outputs flattened structured logs with prefixed fields.
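As in the Multiple Transports example, a formatter is passed per transport. A minimal sketch, assuming FileTransport accepts a formatter option the same way ConsoleTransport does:
import {
  Logger,
  LogLevel,
  ConsoleTransport,
  FileTransport,
  JSONFormatter,
  TextFormatter
} from '@bernierllc/logger';

const logger = new Logger({
  level: LogLevel.INFO,
  transports: [
    // Human-readable text on the console for local development
    new ConsoleTransport({ formatter: new TextFormatter() }),
    // Structured JSON in the file for machine ingestion
    // (formatter option on FileTransport is an assumption)
    new FileTransport({ filename: 'app.log', formatter: new JSONFormatter() })
  ]
});

logger.info('Formatter demo', { requestId: 'req-42' });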
Runtime Configuration
The logger supports runtime configuration through config files, allowing you to customize logging behavior without modifying code.
Quick Setup
Initialize configuration:
npm run config:init
Edit the generated logger.config.js:
module.exports = {
level: 'info',
context: { service: 'my-service', version: '1.0.0' },
console: { enabled: true, level: 'debug' }
};
Use in your code:
import { createLoggerFromConfig } from '@bernierllc/logger';
const logger = await createLoggerFromConfig();
logger.info('Application started');
Configuration Discovery
The logger automatically discovers configuration files in this order:
1. logger.config.js (JavaScript - recommended)
2. logger.config.cjs (CommonJS)
3. logger.config.mjs (ES modules)
4. logger.config.json (JSON)
5. package.json → "logger" key
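For example, the package.json route keeps logger settings alongside other project metadata. A minimal sketch whose nested keys mirror the logger.config.js example above; only the top-level "logger" key comes from the discovery list, so treat the rest as an assumption:
{
  "name": "my-service",
  "version": "1.0.0",
  "logger": {
    "level": "info",
    "context": { "service": "my-service", "version": "1.0.0" },
    "console": { "enabled": true, "level": "debug" }
  }
}
You can check the shape you end up with against npm run config:validate.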
CLI Commands
# Initialize configuration
npm run config:init
# Validate configuration
npm run config:validate
# Print resolved configuration
npm run config:print
Environment Variables
All configuration options can be overridden with LOGGER_* environment variables:
LOGGER_LEVEL=debug
LOGGER_SERVICE=my-service
LOGGER_CONSOLE_LEVEL=info
LOGGER_FILE_ENABLED=true
LOGGER_FILE_FILENAME=logs/app.log
Complete Documentation
For detailed configuration options, examples, and migration guide, see: Logger Runtime Configuration Documentation
Configuration
LoggerOptions
interface LoggerOptions {
level: LogLevel; // Minimum log level
transports: LogTransport[]; // Array of transports
context?: Partial<LogContext>; // Default context
enableCorrelation?: boolean; // Enable correlation tracking
sanitize?: boolean; // Enable data sanitization
sanitizeFields?: string[]; // Fields to sanitize
formatter?: LogFormatter; // Default formatter
onError?: (error: Error, transport: string) => void; // Error handler
}
LogContext
interface LogContext {
service: string; // Service name
version: string; // Service version
environment: string; // Environment (dev, staging, prod)
hostname?: string; // Server hostname
pid?: number; // Process ID
thread?: string; // Thread identifier
[key: string]: any; // Additional context fields
}
Error Handling
The logger includes comprehensive error handling:
const logger = new Logger({
transports: [transport1, transport2],
onError: (error, transportName) => {
console.error(`Transport ${transportName} failed:`, error);
// Handle transport failures (e.g., fallback logging)
}
});
Transport failures are isolated and don't affect other transports or application execution.
Performance
The logger is optimized for production use:
- Minimal logging overhead (<1ms per log entry)
- Asynchronous transport writing
- Efficient message and metadata size limiting
- Smart batching for network transports
- Memory-conscious correlation tracking
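Because transport writes are asynchronous and network transports batch entries, pending logs can still be in memory at shutdown. A minimal sketch of draining them with flush() and close() from the API reference; the signal handling itself is illustrative, not part of the package:
// 'logger' is the instance created earlier in your application.
process.on('SIGTERM', async () => {
  await logger.flush();  // wait for queued/batched writes to complete
  await logger.close();  // then release transport resources (files, HTTP batches)
  process.exit(0);
});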
Best Practices
- Use appropriate log levels - Reserve ERROR for actual errors
- Include context - Add relevant metadata to log entries
- Use correlation IDs - Track requests across services
- Sanitize sensitive data - Prevent PII leakage
- Configure transports appropriately - Different levels for different outputs
- Use child loggers - Create context-specific loggers
- Handle transport failures - Implement error callbacks
- Monitor performance - Use built-in timing functions
Integration Status
Logger Integration
- Status: N/A - This is the core logging package
- Usage: Other BernierLLC packages integrate with this logger for consistent logging across the ecosystem
Docs-Suite Integration
- Status: Ready - Full TypeDoc documentation export support
- Format: TypeDoc with comprehensive API documentation
- Features: Automatic documentation generation, code examples, and type definitions
NeverHub Integration
- Status: Integrated - Full service discovery and event bus support
- Implementation:
- Auto-detects NeverHub availability at runtime
- Graceful degradation when NeverHub is not available
- Enhanced transport capabilities when NeverHub is present
- Service registration with logging capabilities
- Event-driven log aggregation and forwarding
- Capabilities:
- Dynamic configuration from NeverHub config service
- Enhanced observability with distributed tracing
- Automatic service health reporting via logs
- Cross-service log correlation
Development
# Install dependencies
npm install
# Build the package
npm run build
# Run tests
npm test
# Run tests with coverage
npm run test:coverage
# Lint code
npm run lint
License
Copyright (c) 2025 Bernier LLC. All rights reserved.
Related Packages
- @bernierllc/retry-policy - Used for transport retry logic
- @bernierllc/csv-parser - May use this logger for debugging
- @bernierllc/email-sender - May use this logger for operation tracking
