@bernierllc/logging-utils

v0.3.2

Utility functions for log analysis, performance monitoring, and format conversion. This package provides advanced utilities for working with logs beyond basic logging functionality.

Features

  • Performance Monitoring: Timer tracking, metrics collection, and memory usage monitoring
  • Log Analysis: Error frequency analysis, performance statistics, and user activity tracking
  • Log Aggregation: Batching, deduplication, and rate limiting utilities
  • Format Conversion: Convert between JSON, text, CSV, syslog, and logfmt formats
  • Type Safety: Full TypeScript support with comprehensive type definitions
  • Minimal Dependencies: Only one runtime dependency, @bernierllc/logging-core

Installation

npm install @bernierllc/logging-utils @bernierllc/logging-core

Usage

Performance Monitoring

import { PerformanceLogger, MemoryTracker } from '@bernierllc/logging-utils';

// Performance tracking
const perfLogger = new PerformanceLogger();

perfLogger.startTimer('database-query');
// ... perform database query
const duration = perfLogger.endTimer('database-query');

// Increment counters
perfLogger.incrementCounter('api-requests');
perfLogger.incrementCounter('errors', 1);

// Get metrics
const metrics = perfLogger.getMetrics('database-query');
console.log(`Average duration: ${metrics?.duration}ms`);

// Memory tracking
const memoryTracker = new MemoryTracker();
memoryTracker.takeSnapshot('before-operation');
// ... perform operation
memoryTracker.takeSnapshot('after-operation');

const diff = memoryTracker.compareSnapshots('before-operation', 'after-operation');
console.log(`Memory increase: ${diff.heapUsedDiff} bytes`);

Log Analysis

import { LogAnalyzer, AuditLogAnalyzer } from '@bernierllc/logging-utils';
import { LogEntry, AuditLogEntry } from '@bernierllc/logging-core';

// System log analysis
const analyzer = new LogAnalyzer();

// Add logs (logEntries is a LogEntry[] collected elsewhere in your application)
analyzer.addLogs(logEntries);

// Get error analysis
const errorAnalysis = analyzer.getErrorAnalysis();
console.log('Most common errors:', errorAnalysis);

// Get performance stats
const stats = analyzer.getPerformanceStats();
console.log(`Total logs: ${stats.totalLogs}, Errors: ${stats.errors}`);

// Get logs by time range
const recentLogs = analyzer.getLogsByTimeRange(
  '2025-01-01T00:00:00Z',
  '2025-01-01T23:59:59Z'
);

// Audit log analysis
const auditAnalyzer = new AuditLogAnalyzer();
auditAnalyzer.addAuditLogs(auditLogEntries); // auditLogEntries: AuditLogEntry[]

// Get suspicious activity
const suspicious = auditAnalyzer.getSuspiciousActivity(10);
console.log('Suspicious activity:', suspicious);

// Get user activity summary
const userActivity = auditAnalyzer.getUserActivitySummary();
console.log('User activity:', userActivity);

Log Aggregation

import { LogAggregator, LogDeduplicator, LogRateLimiter } from '@bernierllc/logging-utils';

// Batch processing
const aggregator = new LogAggregator(100, 5000); // 100 logs, 5 second flush

aggregator.setFlushCallback((logs) => {
  // Process batch of logs
  console.log(`Processing ${logs.length} logs`);
});

aggregator.addLog(logEntry); // logEntry: a LogEntry from your application

// Deduplication
const deduplicator = new LogDeduplicator(60000); // 1 minute window

if (!deduplicator.isDuplicate(logEntry)) {
  // Process unique log
  console.log('Processing unique log');
}

// Rate limiting
const rateLimiter = new LogRateLimiter(1000); // 1000 logs per minute

if (rateLimiter.shouldAllowLog('error')) {
  // Log allowed
  console.log('Logging error');
} else {
  console.log('Rate limit exceeded');
}

Format Conversion

import { LogFormatter, LogParser } from '@bernierllc/logging-utils';
import { LogEntry } from '@bernierllc/logging-core';

// Build a sample log entry
const logEntry: LogEntry = {
  level: 'info',
  message: 'Application started',
  timestamp: new Date().toISOString(),
  metadata: { userId: 'user123' }
};

// Convert to different formats
const json = LogFormatter.toJSON(logEntry);
const text = LogFormatter.toText(logEntry);
const csv = LogFormatter.toCSV(logEntry);
const syslog = LogFormatter.toSyslog(logEntry);
const logfmt = LogFormatter.toLogfmt(logEntry);

// Parse from different formats
const parsedFromJson = LogParser.fromJSON(json);
const parsedFromText = LogParser.fromText(text);
const parsedFromCsv = LogParser.fromCSV(csv);

// Convert multiple logs (logEntry1..logEntry3 stand in for LogEntry values like the one above)
const logs: LogEntry[] = [logEntry1, logEntry2, logEntry3];
const jsonArray = LogFormatter.formatMultiple(logs, 'json');
const csvTable = LogFormatter.formatMultiple(logs, 'csv');

API Reference

Performance Monitoring

PerformanceLogger

  • startTimer(name: string): void
  • endTimer(name: string, metadata?: Record<string, unknown>): number
  • incrementCounter(name: string, value?: number): void
  • getMetrics(name: string): PerformanceMetrics | undefined
  • getAllMetrics(): Map<string, PerformanceMetrics>
  • reset(): void
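
The Usage section above shows the common calls. For intuition, here is a minimal, self-contained sketch of the timer-tracking idea; the class name and behavior are illustrative only, not this package's implementation:

```typescript
// Illustrative sketch only -- not the package's actual implementation.
class TimerSketch {
  private starts = new Map<string, number>();
  private durations = new Map<string, number>();

  startTimer(name: string): void {
    this.starts.set(name, Date.now());
  }

  // Returns the elapsed milliseconds and records it as the latest metric.
  endTimer(name: string): number {
    const start = this.starts.get(name);
    if (start === undefined) throw new Error(`timer "${name}" was never started`);
    const elapsed = Date.now() - start;
    this.durations.set(name, elapsed);
    this.starts.delete(name);
    return elapsed;
  }

  getDuration(name: string): number | undefined {
    return this.durations.get(name);
  }
}

const t = new TimerSketch();
t.startTimer('demo');
const elapsed = t.endTimer('demo');
```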

MemoryTracker

  • takeSnapshot(name: string): NodeJS.MemoryUsage
  • getSnapshot(name: string): NodeJS.MemoryUsage | undefined
  • compareSnapshots(name1: string, name2: string): Record<string, number>
  • getMemoryUsagePercentage(): number
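
As a rough sketch of what comparing two snapshots can look like (a hypothetical stand-in, not the package's code), each numeric field is diffed and reported with a `Diff` suffix, matching the `heapUsedDiff` field shown in Usage:

```typescript
// Illustrative sketch only: diff two memory snapshots field by field.
type Snapshot = Record<string, number>;

function compareSketch(before: Snapshot, after: Snapshot): Record<string, number> {
  const diff: Record<string, number> = {};
  for (const key of Object.keys(before)) {
    // e.g. heapUsedDiff = after.heapUsed - before.heapUsed
    diff[`${key}Diff`] = (after[key] ?? 0) - before[key];
  }
  return diff;
}

const before = { heapUsed: 1000, heapTotal: 4000 };
const after = { heapUsed: 1500, heapTotal: 4000 };
const diff = compareSketch(before, after);
```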

Log Analysis

LogAnalyzer

  • addLog(log: LogEntry): void
  • addLogs(logs: LogEntry[]): void
  • getLogsByLevel(level: LogLevel): LogEntry[]
  • getLogsByTimeRange(startTime: string, endTime: string): LogEntry[]
  • getLogsByText(searchText: string): LogEntry[]
  • getErrorAnalysis(): Record<string, number>
  • getPerformanceStats(): PerformanceStats
  • getUniqueUsers(): string[]
  • getUniqueRequestIds(): string[]
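
To illustrate the kind of result `getErrorAnalysis()` produces (the function below is a hypothetical sketch, not the package's implementation), error-level messages can be tallied into a frequency map:

```typescript
// Illustrative sketch only: count error messages by frequency.
interface Entry {
  level: string;
  message: string;
}

function errorFrequency(logs: Entry[]): Record<string, number> {
  const counts: Record<string, number> = {};
  for (const log of logs) {
    if (log.level !== 'error') continue;
    counts[log.message] = (counts[log.message] ?? 0) + 1;
  }
  return counts;
}

const freq = errorFrequency([
  { level: 'error', message: 'db timeout' },
  { level: 'info', message: 'ok' },
  { level: 'error', message: 'db timeout' },
]);
```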

AuditLogAnalyzer

  • addAuditLog(log: AuditLogEntry): void
  • addAuditLogs(logs: AuditLogEntry[]): void
  • getAuditLogsByActor(actorId: string): AuditLogEntry[]
  • getAuditLogsByAction(action: string): AuditLogEntry[]
  • getAuditLogsByTarget(target: string): AuditLogEntry[]
  • getSuspiciousActivity(threshold?: number): Record<string, number>
  • getUserActivitySummary(): Record<string, UserActivity>
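
A threshold-based query like `getSuspiciousActivity(10)` might conceptually work as below; this is a self-contained sketch under assumed semantics (flag actors whose event count meets the threshold), not the package's actual logic:

```typescript
// Illustrative sketch only: flag actors at or above an event-count threshold.
interface AuditEntrySketch {
  actorId: string;
  action: string;
}

function suspiciousActors(
  logs: AuditEntrySketch[],
  threshold: number
): Record<string, number> {
  const counts: Record<string, number> = {};
  for (const log of logs) counts[log.actorId] = (counts[log.actorId] ?? 0) + 1;
  const flagged: Record<string, number> = {};
  for (const [actor, n] of Object.entries(counts)) {
    if (n >= threshold) flagged[actor] = n;
  }
  return flagged;
}

const flagged = suspiciousActors(
  [
    { actorId: 'alice', action: 'login' },
    { actorId: 'bob', action: 'login' },
    { actorId: 'bob', action: 'delete' },
    { actorId: 'bob', action: 'delete' },
  ],
  3
);
```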

Log Aggregation

LogAggregator

  • addLog(log: LogEntry): void
  • addLogs(logs: LogEntry[]): void
  • flush(): LogEntry[]
  • setFlushCallback(callback: (logs: LogEntry[]) => void): void
  • stop(): void
  • getBatchSize(): number
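
The batching idea behind `LogAggregator` (flush when the batch reaches a size limit, as in `new LogAggregator(100, 5000)` from Usage) can be sketched as follows; the class here is illustrative and omits the time-based flush:

```typescript
// Illustrative sketch only: flush a batch when it reaches a fixed size.
class BatcherSketch<T> {
  private batch: T[] = [];

  constructor(
    private maxSize: number,
    private onFlush: (items: T[]) => void
  ) {}

  add(item: T): void {
    this.batch.push(item);
    if (this.batch.length >= this.maxSize) this.flush();
  }

  flush(): T[] {
    const out = this.batch;
    this.batch = [];
    if (out.length > 0) this.onFlush(out);
    return out;
  }
}

const flushed: number[][] = [];
const batcher = new BatcherSketch<number>(3, (items) => {
  flushed.push(items);
});
[1, 2, 3, 4].forEach((n) => batcher.add(n));
// After adding four items with maxSize 3, one batch of three has flushed.
```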

LogDeduplicator

  • isDuplicate(log: LogEntry): boolean
  • clear(): void

LogRateLimiter

  • shouldAllowLog(level: LogLevel): boolean
  • getLogCounts(): Record<string, number>
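
Per-level rate limiting as in `new LogRateLimiter(1000)` can be sketched with a simple counter per level; this hypothetical version counts within a single window and relies on an external reset, which is a simplification of whatever the package actually does:

```typescript
// Illustrative sketch only: allow at most `limit` logs per level per window.
class RateLimiterSketch {
  private counts = new Map<string, number>();

  constructor(private limit: number) {}

  shouldAllow(level: string): boolean {
    const n = this.counts.get(level) ?? 0;
    if (n >= this.limit) return false;
    this.counts.set(level, n + 1);
    return true;
  }

  // In a real limiter a timer would call this once per window.
  resetWindow(): void {
    this.counts.clear();
  }
}

const limiter = new RateLimiterSketch(2);
const results = ['error', 'error', 'error'].map((lvl) => limiter.shouldAllow(lvl));
```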

Format Conversion

LogFormatter

  • toJSON(log: LogEntry): string
  • toText(log: LogEntry): string
  • toCSV(log: LogEntry): string
  • toSyslog(log: LogEntry): string
  • toLogfmt(log: LogEntry): string
  • formatMultiple(logs: LogEntry[], format: Format): string

LogParser

  • fromJSON(json: string): LogEntry
  • fromText(text: string): LogEntry
  • fromCSV(csv: string): LogEntry
  • parseMultiple(content: string, format: Format): LogEntry[]
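
To show what a logfmt round trip looks like in principle (these helper functions are illustrative sketches for flat string fields, not the package's `toLogfmt`/`fromText` implementations):

```typescript
// Illustrative sketch only: minimal logfmt round trip for flat string fields.
function toLogfmtSketch(fields: Record<string, string>): string {
  return Object.entries(fields)
    .map(([k, v]) => (v.includes(' ') ? `${k}="${v}"` : `${k}=${v}`))
    .join(' ');
}

function fromLogfmtSketch(line: string): Record<string, string> {
  const fields: Record<string, string> = {};
  // Matches key=value or key="quoted value" pairs.
  const re = /(\w+)=(?:"([^"]*)"|(\S+))/g;
  let m: RegExpExecArray | null;
  while ((m = re.exec(line)) !== null) {
    fields[m[1]] = m[2] || m[3];
  }
  return fields;
}

const line = toLogfmtSketch({ level: 'info', msg: 'Application started' });
const parsed = fromLogfmtSketch(line);
```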

Architecture

This package extends the logging ecosystem:

@bernierllc/logging-core (core package)
├── Types and interfaces
└── Environment detection

@bernierllc/logging (service package)
├── Winston integration
└── File logging with rotation

@bernierllc/logging-utils (this package)
├── Performance monitoring
├── Log analysis tools
├── Aggregation utilities
└── Format converters

Contributing

  1. Fork the repository
  2. Create a feature branch
  3. Make your changes
  4. Add tests for new functionality
  5. Ensure all tests pass
  6. Submit a pull request

License

MIT License - see LICENSE file for details.