# @bernierllc/logging-utils
Utility functions for log analysis, performance monitoring, and format conversion. This package provides advanced utilities for working with logs beyond basic logging functionality.
## Features
- Performance Monitoring: Timer tracking, metrics collection, and memory usage monitoring
- Log Analysis: Error frequency analysis, performance statistics, and user activity tracking
- Log Aggregation: Batching, deduplication, and rate limiting utilities
- Format Conversion: Convert between JSON, text, CSV, syslog, and logfmt formats
- Type Safety: Full TypeScript support with comprehensive type definitions
- Minimal Dependencies: No third-party dependencies beyond @bernierllc/logging-core
## Installation

```bash
npm install @bernierllc/logging-utils @bernierllc/logging-core
```

## Usage
### Performance Monitoring

```typescript
import { PerformanceLogger, MemoryTracker } from '@bernierllc/logging-utils';

// Performance tracking
const perfLogger = new PerformanceLogger();
perfLogger.startTimer('database-query');
// ... perform database query
const duration = perfLogger.endTimer('database-query');

// Increment counters
perfLogger.incrementCounter('api-requests');
perfLogger.incrementCounter('errors', 1);

// Get metrics
const metrics = perfLogger.getMetrics('database-query');
console.log(`Average duration: ${metrics?.duration}ms`);

// Memory tracking
const memoryTracker = new MemoryTracker();
memoryTracker.takeSnapshot('before-operation');
// ... perform operation
memoryTracker.takeSnapshot('after-operation');
const diff = memoryTracker.compareSnapshots('before-operation', 'after-operation');
console.log(`Memory increase: ${diff.heapUsedDiff} bytes`);
```

### Log Analysis
```typescript
import { LogAnalyzer, AuditLogAnalyzer } from '@bernierllc/logging-utils';
import { LogEntry, AuditLogEntry } from '@bernierllc/logging-core';

// System log analysis
const analyzer = new LogAnalyzer();

// Add logs
analyzer.addLogs(logEntries);

// Get error analysis
const errorAnalysis = analyzer.getErrorAnalysis();
console.log('Most common errors:', errorAnalysis);

// Get performance stats
const stats = analyzer.getPerformanceStats();
console.log(`Total logs: ${stats.totalLogs}, Errors: ${stats.errors}`);

// Get logs by time range
const recentLogs = analyzer.getLogsByTimeRange(
  '2025-01-01T00:00:00Z',
  '2025-01-01T23:59:59Z'
);

// Audit log analysis
const auditAnalyzer = new AuditLogAnalyzer();
auditAnalyzer.addAuditLogs(auditLogEntries);

// Get suspicious activity
const suspicious = auditAnalyzer.getSuspiciousActivity(10);
console.log('Suspicious activity:', suspicious);

// Get user activity summary
const userActivity = auditAnalyzer.getUserActivitySummary();
console.log('User activity:', userActivity);
```

### Log Aggregation
```typescript
import { LogAggregator, LogDeduplicator, LogRateLimiter } from '@bernierllc/logging-utils';

// Batch processing
const aggregator = new LogAggregator(100, 5000); // 100 logs, 5-second flush
aggregator.setFlushCallback((logs) => {
  // Process batch of logs
  console.log(`Processing ${logs.length} logs`);
});
aggregator.addLog(logEntry);

// Deduplication
const deduplicator = new LogDeduplicator(60000); // 1-minute window
if (!deduplicator.isDuplicate(logEntry)) {
  // Process unique log
  console.log('Processing unique log');
}

// Rate limiting
const rateLimiter = new LogRateLimiter(1000); // 1000 logs per minute
if (rateLimiter.shouldAllowLog('error')) {
  // Log allowed
  console.log('Logging error');
} else {
  console.log('Rate limit exceeded');
}
```

### Format Conversion
```typescript
import { LogFormatter, LogParser } from '@bernierllc/logging-utils';
import { LogEntry } from '@bernierllc/logging-core';

const logEntry: LogEntry = {
  level: 'info',
  message: 'Application started',
  timestamp: new Date().toISOString(),
  metadata: { userId: 'user123' }
};

// Convert to different formats
const json = LogFormatter.toJSON(logEntry);
const text = LogFormatter.toText(logEntry);
const csv = LogFormatter.toCSV(logEntry);
const syslog = LogFormatter.toSyslog(logEntry);
const logfmt = LogFormatter.toLogfmt(logEntry);

// Parse from different formats
const parsedFromJson = LogParser.fromJSON(json);
const parsedFromText = LogParser.fromText(text);
const parsedFromCsv = LogParser.fromCSV(csv);

// Convert multiple logs
const logs: LogEntry[] = [logEntry1, logEntry2, logEntry3];
const jsonArray = LogFormatter.formatMultiple(logs, 'json');
const csvTable = LogFormatter.formatMultiple(logs, 'csv');
```

## API Reference
### Performance Monitoring

#### PerformanceLogger

- `startTimer(name: string): void`
- `endTimer(name: string, metadata?: Record<string, unknown>): number`
- `incrementCounter(name: string, value?: number): void`
- `getMetrics(name: string): PerformanceMetrics | undefined`
- `getAllMetrics(): Map<string, PerformanceMetrics>`
- `reset(): void`
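The timer pair can be thought of as simple bookkeeping around `Date.now()`. A minimal sketch of that idea, assuming nothing about the package's actual internals (`SimpleTimer` is a hypothetical name used only for illustration):

```typescript
// Hypothetical sketch of startTimer/endTimer bookkeeping; not the package's code.
class SimpleTimer {
  private starts = new Map<string, number>();

  // Record the start time under the given name.
  startTimer(name: string): void {
    this.starts.set(name, Date.now());
  }

  // Return elapsed milliseconds, or -1 if the timer was never started.
  endTimer(name: string): number {
    const start = this.starts.get(name);
    if (start === undefined) return -1;
    this.starts.delete(name);
    return Date.now() - start;
  }
}

const timer = new SimpleTimer();
timer.startTimer('db-query');
const elapsed = timer.endTimer('db-query'); // non-negative milliseconds
```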
#### MemoryTracker

- `takeSnapshot(name: string): NodeJS.MemoryUsage`
- `getSnapshot(name: string): NodeJS.MemoryUsage | undefined`
- `compareSnapshots(name1: string, name2: string): Record<string, number>`
- `getMemoryUsagePercentage(): number`
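Snapshot comparison boils down to storing `process.memoryUsage()` results and diffing their fields. A hedged, self-contained sketch of that pattern (the helper names here are illustrative, not the package's API):

```typescript
// Hypothetical sketch of snapshot diffing with process.memoryUsage(); Node.js only.
const snapshots = new Map<string, NodeJS.MemoryUsage>();

function takeSnapshot(name: string): NodeJS.MemoryUsage {
  const usage = process.memoryUsage();
  snapshots.set(name, usage);
  return usage;
}

// Difference in heapUsed between two named snapshots (can be negative after GC).
function heapUsedDiff(before: string, after: string): number {
  const first = snapshots.get(before);
  const second = snapshots.get(after);
  if (!first || !second) throw new Error('unknown snapshot name');
  return second.heapUsed - first.heapUsed;
}

takeSnapshot('before-operation');
const filler: number[] = new Array(100_000).fill(0); // allocate some memory
takeSnapshot('after-operation');
const diff = heapUsedDiff('before-operation', 'after-operation');
console.log(`heapUsed changed by ${diff} bytes (filler length: ${filler.length})`);
```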
### Log Analysis

#### LogAnalyzer

- `addLog(log: LogEntry): void`
- `addLogs(logs: LogEntry[]): void`
- `getLogsByLevel(level: LogLevel): LogEntry[]`
- `getLogsByTimeRange(startTime: string, endTime: string): LogEntry[]`
- `getLogsByText(searchText: string): LogEntry[]`
- `getErrorAnalysis(): Record<string, number>`
- `getPerformanceStats(): PerformanceStats`
- `getUniqueUsers(): string[]`
- `getUniqueRequestIds(): string[]`
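Error analysis in the spirit of `getErrorAnalysis()` amounts to counting occurrences of each error message. A minimal self-contained sketch, using a simplified stand-in type rather than the real `LogEntry`:

```typescript
// Hypothetical sketch of error-frequency counting; MiniLog stands in for LogEntry.
interface MiniLog {
  level: string;
  message: string;
}

function errorFrequency(logs: MiniLog[]): Record<string, number> {
  const counts: Record<string, number> = {};
  for (const log of logs) {
    if (log.level !== 'error') continue; // only error-level entries are counted
    counts[log.message] = (counts[log.message] ?? 0) + 1;
  }
  return counts;
}

const freq = errorFrequency([
  { level: 'error', message: 'db timeout' },
  { level: 'info', message: 'started' },
  { level: 'error', message: 'db timeout' },
]);
// freq counts 'db timeout' twice and ignores the info entry
```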
#### AuditLogAnalyzer

- `addAuditLog(log: AuditLogEntry): void`
- `getAuditLogsByActor(actorId: string): AuditLogEntry[]`
- `getAuditLogsByAction(action: string): AuditLogEntry[]`
- `getAuditLogsByTarget(target: string): AuditLogEntry[]`
- `getSuspiciousActivity(threshold?: number): Record<string, number>`
- `getUserActivitySummary(): Record<string, UserActivity>`
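One plausible reading of `getSuspiciousActivity(threshold)` is threshold-based flagging: count actions per actor and report those above the threshold. A hedged sketch under that assumption, with a simplified stand-in type:

```typescript
// Hypothetical sketch of threshold-based flagging; MiniAudit stands in for AuditLogEntry.
interface MiniAudit {
  actorId: string;
  action: string;
}

function suspiciousActors(logs: MiniAudit[], threshold: number): Record<string, number> {
  const perActor: Record<string, number> = {};
  for (const log of logs) {
    perActor[log.actorId] = (perActor[log.actorId] ?? 0) + 1;
  }
  // Keep only actors whose activity exceeds the threshold.
  const flagged: Record<string, number> = {};
  for (const [actor, count] of Object.entries(perActor)) {
    if (count > threshold) flagged[actor] = count;
  }
  return flagged;
}

const flagged = suspiciousActors(
  [
    { actorId: 'u1', action: 'login' },
    { actorId: 'u1', action: 'delete' },
    { actorId: 'u1', action: 'delete' },
    { actorId: 'u2', action: 'login' },
  ],
  2
);
// flagged includes u1 (3 actions > 2) but not u2 (1 action)
```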
### Log Aggregation

#### LogAggregator

- `addLog(log: LogEntry): void`
- `addLogs(logs: LogEntry[]): void`
- `flush(): LogEntry[]`
- `setFlushCallback(callback: (logs: LogEntry[]) => void): void`
- `stop(): void`
- `getBatchSize(): number`
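The size-triggered half of the aggregator's behavior (the constructor also takes a flush interval, omitted here for brevity) can be sketched as a buffer that invokes its callback when full. A hypothetical illustration, not the package's implementation:

```typescript
// Hypothetical sketch of size-triggered batching; the real LogAggregator also
// flushes on a timer, which this sketch omits.
class MiniBatcher<T> {
  private buffer: T[] = [];
  constructor(
    private maxSize: number,
    private onFlush: (items: T[]) => void
  ) {}

  add(item: T): void {
    this.buffer.push(item);
    if (this.buffer.length >= this.maxSize) this.flush();
  }

  // Empty the buffer, invoking the callback if there was anything to flush.
  flush(): T[] {
    const batch = this.buffer;
    this.buffer = [];
    if (batch.length > 0) this.onFlush(batch);
    return batch;
  }
}

const flushed: string[][] = [];
const batcher = new MiniBatcher<string>(2, (items) => flushed.push(items));
batcher.add('a');
batcher.add('b'); // reaching maxSize triggers a flush of ['a', 'b']
batcher.add('c'); // stays buffered until the next flush
```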
#### LogDeduplicator

- `isDuplicate(log: LogEntry): boolean`
- `clear(): void`
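Time-window deduplication typically works by fingerprinting a log (for example by level and message) and treating a repeat within the window as a duplicate. A hedged sketch of that pattern; the fingerprint choice is an assumption, not the package's documented behavior:

```typescript
// Hypothetical sketch of windowed deduplication keyed on level + message.
// An explicit `now` parameter makes the behavior easy to test deterministically.
class MiniDeduplicator {
  private seen = new Map<string, number>();
  constructor(private windowMs: number) {}

  isDuplicate(level: string, message: string, now: number = Date.now()): boolean {
    const key = `${level}:${message}`;
    const last = this.seen.get(key);
    this.seen.set(key, now);
    return last !== undefined && now - last < this.windowMs;
  }
}

const dedup = new MiniDeduplicator(60_000); // 1-minute window
const first = dedup.isDuplicate('error', 'db timeout', 1_000);  // first sighting
const second = dedup.isDuplicate('error', 'db timeout', 2_000); // within the window
const later = dedup.isDuplicate('error', 'db timeout', 70_000); // window has expired
```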
#### LogRateLimiter

- `shouldAllowLog(level: LogLevel): boolean`
- `getLogCounts(): Record<string, number>`
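A per-minute limit like the one in the Usage section can be implemented as a fixed window: reset the count when a minute has elapsed, and allow logs while the count is below the limit. A hypothetical sketch under that assumption (the real class may use a different windowing strategy):

```typescript
// Hypothetical fixed-window rate limiter; `now` is a parameter for testability.
class MiniRateLimiter {
  private windowStart = 0;
  private count = 0;
  constructor(private maxPerMinute: number) {}

  shouldAllow(now: number = Date.now()): boolean {
    if (now - this.windowStart >= 60_000) {
      // Start a fresh one-minute window.
      this.windowStart = now;
      this.count = 0;
    }
    if (this.count >= this.maxPerMinute) return false;
    this.count += 1;
    return true;
  }
}

const limiter = new MiniRateLimiter(2);
const a = limiter.shouldAllow(0);      // allowed
const b = limiter.shouldAllow(1);      // allowed
const c = limiter.shouldAllow(2);      // rejected: limit reached
const d = limiter.shouldAllow(61_000); // allowed: new window
```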
### Format Conversion

#### LogFormatter

- `toJSON(log: LogEntry): string`
- `toText(log: LogEntry): string`
- `toCSV(log: LogEntry): string`
- `toSyslog(log: LogEntry): string`
- `toLogfmt(log: LogEntry): string`
- `formatMultiple(logs: LogEntry[], format: Format): string`
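For a feel of what logfmt output looks like, here is a minimal serializer sketch: key=value pairs, with values containing whitespace wrapped in double quotes. This illustrates the format itself, not the package's exact output:

```typescript
// Hypothetical logfmt serializer; the real toLogfmt may differ in details
// such as escaping and field ordering.
function toLogfmt(fields: Record<string, string>): string {
  return Object.entries(fields)
    .map(([key, value]) => {
      const needsQuotes = /\s/.test(value); // quote values containing whitespace
      return `${key}=${needsQuotes ? `"${value}"` : value}`;
    })
    .join(' ');
}

const line = toLogfmt({
  level: 'info',
  message: 'Application started',
  userId: 'user123',
});
// line: level=info message="Application started" userId=user123
```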
#### LogParser

- `fromJSON(json: string): LogEntry`
- `fromText(text: string): LogEntry`
- `fromCSV(csv: string): LogEntry`
- `parseMultiple(content: string, format: Format): LogEntry[]`
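One plausible multi-entry JSON representation that `parseMultiple(content, 'json')` might accept is newline-delimited JSON, one entry per line. A self-contained sketch of parsing that shape (field names and the NDJSON assumption are illustrative, not confirmed by the package docs):

```typescript
// Hypothetical NDJSON parser; ParsedLog is a simplified stand-in for LogEntry.
interface ParsedLog {
  level: string;
  message: string;
}

function parseJsonLines(content: string): ParsedLog[] {
  return content
    .split('\n')
    .filter((line) => line.trim().length > 0) // skip blank lines
    .map((line) => JSON.parse(line) as ParsedLog);
}

const parsed = parseJsonLines(
  '{"level":"info","message":"started"}\n{"level":"error","message":"db timeout"}\n'
);
// parsed contains two entries
```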
## Architecture

This package extends the logging ecosystem:

```text
@bernierllc/logging-core (core package)
├── Types and interfaces
└── Environment detection

@bernierllc/logging (service package)
├── Winston integration
└── File logging with rotation

@bernierllc/logging-utils (this package)
├── Performance monitoring
├── Log analysis tools
├── Aggregation utilities
└── Format converters
```

## Contributing
1. Fork the repository
2. Create a feature branch
3. Make your changes
4. Add tests for new functionality
5. Ensure all tests pass
6. Submit a pull request
## License
MIT License - see LICENSE file for details.
