@heizen-labs/logger

v1.0.0

Lightweight, in-process logging client with async batch processing for TypeScript/Node.js applications

Heizen Logger

A lightweight, in-process TypeScript/Node.js logging client that standardizes log format, buffers logs in memory, batches them based on configurable thresholds, and asynchronously pushes logs to a configured HTTP endpoint using a Heizen Project API key for authentication.

Features

🚀 High Performance: 116,000+ logs/second throughput capability
🏗️ NestJS Integration: Built-in module with automatic request/response logging
🔒 Type Safe: Full TypeScript support with strict typing
Async Batching: Configurable time and size-based batching
🛡️ Reliable: Graceful error handling and memory bounds
🗜️ Compression: Automatic GZIP compression for large payloads
📊 Observability: Built-in metrics and monitoring

Installation

pnpm add @heizen/logger

For NestJS Projects

pnpm add @heizen/logger @nestjs/common @nestjs/core rxjs reflect-metadata

Quick Start

Standalone Usage

For standalone Node.js applications, use the HeizenLogger class directly:

import { HeizenLogger } from '@heizen/logger';

// Initialize the logger
const logger = new HeizenLogger({
  projectId: 'your-project-id',
  projectName: 'My App',
  environment: 'production',
  apiKey: 'your-heizen-api-key',
  endpoint: 'https://api.heizen.com/logs',
  batchingConfig: {
    flushIntervalMs: 30000, // Flush every 30 seconds
    batchSize: 100, // Or when 100 logs accumulated
    maxQueueSize: 10000, // Maximum in-memory logs
  },
});

// Log messages with strict typed signatures
logger.log('INFO', 'Application started');
logger.log('ERROR', 'Failed to process request', {
  userId: 'user-123',
  error: 'Database connection failed',
});

// Convenience methods
logger.info('User logged in', { userId: 'user-123' });
logger.debug('Processing request', { requestId: 'req-456' });
logger.warn('API rate limit approaching', { remaining: 10 });
logger.error('Database error', { code: 'ECONNREFUSED' });
logger.fatal('Application crash', { error: 'Out of memory' });

// Graceful shutdown
process.on('SIGTERM', async () => {
  await logger.shutdown();
  process.exit(0);
});

NestJS Integration

For NestJS applications, use the HeizenLoggerModule with automatic request logging:

// app.module.ts
import { Module } from '@nestjs/common';
import { HeizenLoggerModule } from '@heizen/logger';

@Module({
  imports: [
    HeizenLoggerModule.forRoot(
      {
        projectId: 'your-project-id',
        projectName: 'NestJS API',
        environment: 'production',
        apiKey: 'your-heizen-api-key',
        endpoint: 'https://api.heizen.com/logs',
        batchingConfig: {
          flushIntervalMs: 5000,
          batchSize: 100,
          maxQueueSize: 10000,
        },
      },
      {
        enableInterceptor: true, // Enable automatic HTTP request logging
        skipPaths: ['/health', '/metrics'], // Skip logging for these paths
      }
    ),
  ],
})
export class AppModule {}

// users.service.ts
import { Injectable } from '@nestjs/common';
import { HeizenLoggerService } from '@heizen/logger';

@Injectable()
export class UsersService {
  constructor(private readonly logger: HeizenLoggerService) {}

  async createUser(data: any) {
    // Flexible NestJS-compatible logging with parameter normalization
    this.logger.log('Creating new user', 'UsersService', { email: data.email });

    const user = { id: Date.now(), ...data };

    this.logger.log('User created successfully', 'UsersService', { userId: user.id });
    return user;
  }
}

📚 For complete NestJS integration guide, see NestJS Quick Start

Configuration

LoggerConfig

| Property | Type | Required | Description |
| ---------------- | ---------------- | -------- | ------------------------------------------------ |
| projectId | string | ✓ | Unique identifier for your project |
| projectName | string | ✓ | Human-readable project name |
| environment | string | ✓ | Environment name (e.g., 'production', 'staging') |
| apiKey | string | ✓ | Heizen Project API key for authentication |
| endpoint | string | ✓ | HTTP endpoint for log ingestion |
| batchingConfig | BatchingConfig | ✓ | Batching behavior configuration |

BatchingConfig

| Property | Type | Default | Description |
| ------------------ | -------- | ------- | -------------------------------------- |
| flushIntervalMs | number | 30000 | Maximum time between flushes (ms) |
| batchSize | number | 100 | Maximum logs per batch |
| maxQueueSize | number | 10000 | Maximum in-memory logs before dropping |
| requestTimeoutMs | number | 5000 | HTTP request timeout (ms) |
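
To make the interaction between `batchSize` and `maxQueueSize` concrete, here is a simplified sketch of size-triggered flushing with a bounded queue. This is an illustration of the configuration semantics, not the library's actual implementation; the time-based trigger (`flushIntervalMs` firing on a timer) and the HTTP transport are omitted for brevity.

```typescript
interface BatchingConfig {
  flushIntervalMs: number;
  batchSize: number;
  maxQueueSize: number;
}

class Batcher {
  private queue: string[] = [];
  flushes = 0;
  dropped = 0;

  constructor(private config: BatchingConfig) {}

  push(log: string): void {
    if (this.queue.length >= this.config.maxQueueSize) {
      this.dropped++; // queue is bounded: overflow is dropped, not buffered
      return;
    }
    this.queue.push(log);
    if (this.queue.length >= this.config.batchSize) {
      this.flush(); // size threshold reached before the interval timer fired
    }
  }

  flush(): void {
    if (this.queue.length === 0) return;
    this.queue = []; // in the real client, this batch is POSTed to the endpoint
    this.flushes++;
  }
}
```

Whichever threshold is hit first wins: a busy service flushes on `batchSize`, a quiet one on `flushIntervalMs`, and `maxQueueSize` caps memory if the endpoint is unreachable.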

NestJSLoggerOptions

| Property | Type | Default | Description |
| ------------------- | ---------- | ------- | ------------------------------------- |
| enableInterceptor | boolean | true | Enable automatic HTTP request logging |
| skipPaths | string[] | [] | Paths to skip logging (glob patterns) |

API Reference

The library provides two distinct APIs depending on your use case:

| API | Use Case | Signatures | Parameter Handling |
| ----------------------- | ----------------------- | ---------------------------------------------- | ------------------ |
| HeizenLogger | Standalone applications | Strict: method(message, metadata?) | Typed, strict |
| HeizenLoggerService | NestJS applications | Flexible: method(message, ...optionalParams) | Normalized |


Standalone API: HeizenLogger

Core logger class for direct usage in any TypeScript/Node.js application with strict type signatures.

Constructor

constructor(config: LoggerConfig)

Logging Methods

All methods follow a strict signature pattern:

method(message: string, metadata?: Record<string, unknown>): void

  • message must be a string
  • metadata must be an object (if provided)
  • No automatic parameter normalization

log(level, message, metadata?)

log(level: LogLevel, message: string, metadata?: Record<string, unknown>): void

Core logging method with explicit level.

Parameters:

  • level: 'DEBUG' | 'INFO' | 'WARN' | 'ERROR' | 'FATAL'
  • message: Log message string
  • metadata: Optional structured data

Example:

logger.log('INFO', 'User login successful', {
  userId: 'user-123',
  ip: '192.168.1.1',
  duration: 245,
});

Convenience Methods

debug(message, metadata?)

debug(message: string, metadata?: Record<string, unknown>): void

Logs a DEBUG level message.

logger.debug('Processing request', { requestId: 'req-123' });

info(message, metadata?)

info(message: string, metadata?: Record<string, unknown>): void

Logs an INFO level message.

logger.info('User logged in', { userId: 'user-123' });

warn(message, metadata?)

warn(message: string, metadata?: Record<string, unknown>): void

Logs a WARN level message.

logger.warn('API rate limit approaching', { remaining: 10 });

error(message, metadata?)

error(message: string, metadata?: Record<string, unknown>): void

Logs an ERROR level message.

logger.error('Database connection failed', { host: 'localhost', port: 5432 });

fatal(message, metadata?)

fatal(message: string, metadata?: Record<string, unknown>): void

Logs a FATAL level message for critical errors.

logger.fatal('Application crash', { error: 'Out of memory', pid: process.pid });

Control Methods

flush()

async flush(): Promise<void>

Manually triggers a flush of buffered logs to the endpoint.

await logger.flush();

shutdown(timeoutMs?)

async shutdown(timeoutMs?: number): Promise<void>

Gracefully shuts down the logger, flushing all remaining logs.

Parameters:

  • timeoutMs: Maximum time to wait for final flush (default: 5000ms)

await logger.shutdown(10000); // Wait up to 10 seconds

Information Methods

getMetrics()

getMetrics(): LoggerMetrics

Returns current logger metrics and statistics.

Returns:

interface LoggerMetrics {
  totalLogsProcessed: number;
  batchesSent: number;
  batchesFailed: number;
  currentQueueSize: number;
  lastSuccessfulFlush: string | null; // ISO timestamp
  lastFailedFlush: string | null; // ISO timestamp
  droppedLogsCount: number;
}

getBufferStats()

getBufferStats(): RingBufferStats

Returns current ring buffer statistics.

Returns:

interface RingBufferStats {
  size: number; // Current number of logs in buffer
  capacity: number; // Maximum buffer capacity
  isEmpty: boolean;
  isFull: boolean;
  availableSpace: number;
  droppedLogsCount: number; // Total logs dropped due to overflow
}

getState()

getState(): LoggerState

Returns current logger state.

Returns:

type LoggerState = 'INITIALIZING' | 'READY' | 'FLUSHING' | 'SHUTTING_DOWN' | 'SHUTDOWN';

getConfig()

getConfig(): Omit<LoggerConfig, 'apiKey'>

Returns logger configuration with API key removed for security.


NestJS API: HeizenLoggerService

NestJS-compatible logger service with flexible parameter handling and automatic normalization.

HeizenLoggerModule

Global module providing logger services across your NestJS application.

forRoot(config, options?)

static forRoot(
  config: LoggerConfig,
  options?: NestJSLoggerOptions
): DynamicModule

Example:

@Module({
  imports: [
    HeizenLoggerModule.forRoot(
      {
        projectId: 'your-project-id',
        projectName: 'NestJS API',
        environment: 'production',
        apiKey: 'your-heizen-api-key',
        endpoint: 'https://api.heizen.com/logs',
        batchingConfig: {
          flushIntervalMs: 5000,
          batchSize: 100,
          maxQueueSize: 10000,
        },
      },
      {
        enableInterceptor: true,
        skipPaths: ['/health', '/metrics'],
      }
    ),
  ],
})
export class AppModule {}

HeizenLoggerService Methods

All methods follow NestJS standard signature with automatic parameter normalization:

method(message: any, ...optionalParams: any[]): void

Parameter Normalization

The service automatically normalizes optional parameters:

  1. Object after message → Treated as metadata

    this.logger.log('User created', { userId: 123 });
    // Result: { metadata: { userId: 123 } }

  2. String after message → Treated as context (service/controller name)

    this.logger.log('User created', 'UsersService');
    // Result: { metadata: { context: 'UsersService' } }

  3. String + Object → String becomes context, object becomes metadata

    this.logger.log('User created', 'UsersService', { userId: 123 });
    // Result: { metadata: { context: 'UsersService', userId: 123 } }

  4. Additional parameters → Added to metadata as extra array

    this.logger.log('User created', 'UsersService', { userId: 123 }, 'extra-data');
    // Result: { metadata: { context: 'UsersService', userId: 123, extra: ['extra-data'] } }
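
The four rules above can be sketched as a single normalization function. This is a hypothetical re-implementation for illustration only; the service's internals may differ, and `normalizeParams` is not an exported API.

```typescript
type Metadata = Record<string, unknown>;

// Hypothetical sketch of the normalization rules; not the library's actual code.
function normalizeParams(...optionalParams: unknown[]): Metadata {
  const metadata: Metadata = {};
  const extra: unknown[] = [];

  for (const param of optionalParams) {
    if (typeof param === 'string' && !('context' in metadata)) {
      metadata.context = param; // rule 2/3: first string becomes context
    } else if (param !== null && typeof param === 'object' && !Array.isArray(param)) {
      Object.assign(metadata, param); // rule 1/3: plain objects merge into metadata
    } else {
      extra.push(param); // rule 4: anything else lands in the extra array
    }
  }
  if (extra.length > 0) metadata.extra = extra;
  return metadata;
}
```

For example, `normalizeParams('UsersService', { userId: 123 }, 'extra-data')` yields `{ context: 'UsersService', userId: 123, extra: ['extra-data'] }`, matching rule 4 above.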

Available Methods

log(message, ...optionalParams)

log(message: any, ...optionalParams: any[]): void

Logs at INFO level with flexible parameters.

this.logger.log('User logged in', 'AuthService', { userId: 123 });

error(message, trace?, ...optionalParams)

error(message: any, trace?: string, ...optionalParams: any[]): void

Logs at ERROR level with optional stack trace.

this.logger.error('Database connection failed', error.stack, 'DatabaseService');

warn(message, ...optionalParams)

warn(message: any, ...optionalParams: any[]): void

Logs at WARN level.

this.logger.warn('API rate limit approaching', { remaining: 10, limit: 100 });

debug(message, ...optionalParams)

debug(message: any, ...optionalParams: any[]): void

Logs at DEBUG level.

this.logger.debug('Processing request', { requestId: 'req-123', method: 'GET' });

verbose(message, ...optionalParams)

verbose(message: any, ...optionalParams: any[]): void

Logs at VERBOSE level (mapped to DEBUG in core logger).

this.logger.verbose('Query executed', { sql: 'SELECT * FROM users', duration: 45 });

http(message, ...optionalParams)

http(message: any, ...optionalParams: any[]): void

Logs HTTP requests (mapped to INFO in core logger). Used by HeizenLoggerInterceptor.

this.logger.http('GET /api/users', { statusCode: 200, responseTime: 123 });

fatal(message, ...optionalParams)

fatal(message: any, ...optionalParams: any[]): void

Logs FATAL level messages for critical errors.

this.logger.fatal('Application crash', { error: 'Out of memory', pid: process.pid });

customLog(level, message, ...optionalParams)

customLog(level: LogLevel, message: any, ...optionalParams: any[]): void

Logs at a custom level with flexible parameters.

this.logger.customLog('WARN', 'Payment processing delayed', { amount: 99.99 });

Level Mapping

| NestJS Method | Core Logger Level |
| ------------- | ----------------- |
| log() | INFO |
| error() | ERROR |
| warn() | WARN |
| debug() | DEBUG |
| verbose() | DEBUG |
| http() | INFO |
| fatal() | FATAL |

HeizenLoggerInterceptor

Automatically logs HTTP requests and responses when enabled.

Features:

  • Only intercepts HTTP contexts (skips WebSocket, gRPC, etc.)
  • Logs request duration in milliseconds
  • Captures errors with stack traces
  • Path filtering with wildcard support

Logged Metadata:

  • Success: { statusCode, durationMs }
  • Error: { method, url, statusCode, durationMs, stack }

Path Filtering Examples:

skipPaths: ['/health', '/metrics', '/api/*/internal/*', '/debug/*'];
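
One way such wildcard filtering could work is sketched below. This is an assumption for illustration: here `*` matches any run of characters (so `/debug/*` also matches nested paths), which may not match the interceptor's exact semantics, and `matchesSkipPath` is not an exported API.

```typescript
// Illustrative sketch of wildcard path matching; the interceptor's actual
// matcher may differ.
function matchesSkipPath(path: string, patterns: string[]): boolean {
  return patterns.some((pattern) => {
    // Escape regex metacharacters, then turn each '*' into '.*'.
    const regex = new RegExp(
      '^' +
        pattern
          .split('*')
          .map((s) => s.replace(/[.+?^${}()|[\]\\]/g, '\\$&'))
          .join('.*') +
        '$'
    );
    return regex.test(path);
  });
}
```

Under these assumptions, `matchesSkipPath('/api/v1/internal/cache', ['/api/*/internal/*'])` is true, while `/api/users` is not skipped by `['/health', '/metrics']`.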

API Comparison

| Feature | HeizenLogger (Standalone) | HeizenLoggerService (NestJS) |
| ------------------------ | ------------------------------ | ------------------------------------------ |
| Message parameter | string only | any (auto-converted) |
| Metadata parameter | Record<string, unknown> only | Flexible: string (context) or object |
| Parameter handling | Strict, typed | Flexible, normalized |
| Signature | method(message, metadata?) | method(message, ...optionalParams) |
| Context support | ❌ Manual (add to metadata) | ✅ Built-in via parameter normalization |
| NestJS compatibility | ❌ | ✅ Implements LoggerService |
| Level mapping | Direct (1:1) | verbose → DEBUG, http → INFO |
| Error stack trace | Manual (add to metadata) | Built-in via error(message, trace?, ...) |
| Use case | Standalone apps, libraries | NestJS applications with DI |

When to use which API:

  • Use HeizenLogger when you want strict typing and control over log structure
  • Use HeizenLoggerService when you need NestJS compatibility and flexible parameter patterns

Advanced Usage

Monitoring Logger Health

setInterval(() => {
  const metrics = logger.getMetrics();
  const bufferStats = logger.getBufferStats();
  const state = logger.getState();

  if (state !== 'READY') {
    console.warn(`Logger state: ${state}`);
  }

  if (bufferStats.isFull) {
    console.warn('Logger buffer is full!', {
      droppedLogs: bufferStats.droppedLogsCount,
      currentSize: bufferStats.size,
      capacity: bufferStats.capacity,
    });
  }

  if (metrics.batchesFailed > 0) {
    console.error('Logger has failed batches:', {
      failed: metrics.batchesFailed,
      lastFailure: metrics.lastFailedFlush,
    });
  }
}, 60000); // Check every minute

Graceful Shutdown

async function gracefulShutdown() {
  console.log('Shutting down gracefully...');

  try {
    // Flush remaining logs with 10 second timeout
    await logger.shutdown(10000);
    console.log('Logger shutdown complete');
  } catch (error) {
    console.error('Logger shutdown failed:', error);
  }

  process.exit(0);
}

process.on('SIGTERM', gracefulShutdown);
process.on('SIGINT', gracefulShutdown);

Performance

The Heizen Logger is designed for high-performance logging with minimal overhead:

  • Throughput: 116,000+ logs/second capability
  • Batching: Time and size-based batching reduces HTTP overhead
  • Ring Buffer: Lock-free circular buffer prevents memory leaks
  • Async I/O: HTTP transport runs asynchronously, so flushes never block application logic
  • Memory Bounds: Configurable max queue size with automatic overflow handling
  • Compression: Automatic GZIP compression for payloads > 1KB
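
The bounded-memory behavior described above can be sketched with a minimal fixed-capacity ring buffer that counts dropped logs. This is an illustration of the idea behind `maxQueueSize` and `droppedLogsCount`, not the library's actual (lock-free) implementation.

```typescript
// Minimal fixed-capacity ring buffer with overflow counting -- a sketch, not
// the library's code. New logs are dropped (and counted) when the buffer is full.
class RingBuffer<T> {
  private items: (T | undefined)[];
  private head = 0; // index of the oldest item
  private count = 0;
  droppedLogsCount = 0;

  constructor(private capacity: number) {
    this.items = new Array(capacity);
  }

  push(item: T): boolean {
    if (this.count === this.capacity) {
      this.droppedLogsCount++; // buffer full: the new log is dropped
      return false;
    }
    this.items[(this.head + this.count) % this.capacity] = item;
    this.count++;
    return true;
  }

  drain(): T[] {
    // Remove all buffered items in insertion order (what a flush would send).
    const out: T[] = [];
    while (this.count > 0) {
      out.push(this.items[this.head] as T);
      this.head = (this.head + 1) % this.capacity;
      this.count--;
    }
    return out;
  }

  get size(): number {
    return this.count;
  }
}
```

Because capacity is fixed at construction, memory use is bounded no matter how fast the application logs or how slowly the endpoint responds.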

Stress Testing

Run performance benchmarks after starting the mock-log-server and mock-nestjs-server:

# Run logger stress tests
pnpm run stress:logger

# Run demo API stress tests
pnpm run stress:demo

# Run all stress tests
pnpm run stress:all

📊 For complete stress testing documentation, see STRESS_TESTING.md

Examples

Example 1: Standalone Node.js Application

import { HeizenLogger } from '@heizen/logger';

const logger = new HeizenLogger({
  projectId: 'my-app-001',
  projectName: 'My Application',
  environment: process.env.NODE_ENV || 'development',
  apiKey: process.env.HEIZEN_API_KEY!,
  endpoint: 'https://api.heizen.com/logs',
  batchingConfig: {
    flushIntervalMs: 5000,
    batchSize: 100,
    maxQueueSize: 10000,
  },
});

// Log application startup
logger.info('Application starting', { version: '1.0.0' });

// Example: Processing user requests
async function handleUserRequest(userId: string) {
  logger.debug('Processing user request', { userId });

  try {
    const result = await processUser(userId);
    logger.info('User request processed successfully', { userId, result });
    return result;
  } catch (error) {
    logger.error('Failed to process user request', {
      userId,
      error: error.message,
      stack: error.stack,
    });
    throw error;
  }
}

// Graceful shutdown
async function shutdown() {
  logger.info('Application shutting down');
  await logger.shutdown(10000);
  process.exit(0);
}

process.on('SIGTERM', shutdown);
process.on('SIGINT', shutdown);

async function processUser(userId: string) {
  // Your business logic here
  return { success: true };
}

Example 2: NestJS Application

// app.module.ts
import { Module } from '@nestjs/common';
import { HeizenLoggerModule } from '@heizen/logger';
import { UsersModule } from './users/users.module';

@Module({
  imports: [
    HeizenLoggerModule.forRoot(
      {
        projectId: 'microservice-001',
        projectName: 'User Microservice',
        environment: process.env.NODE_ENV || 'development',
        apiKey: process.env.HEIZEN_API_KEY!,
        endpoint: 'https://api.heizen.com/logs',
        batchingConfig: {
          flushIntervalMs: 3000,
          batchSize: 200,
          maxQueueSize: 20000,
        },
      },
      {
        enableInterceptor: true,
        skipPaths: ['/health', '/metrics'],
      }
    ),
    UsersModule,
  ],
})
export class AppModule {}

// users.service.ts
import { Injectable } from '@nestjs/common';
import { HeizenLoggerService } from '@heizen/logger';

@Injectable()
export class UsersService {
  constructor(private readonly logger: HeizenLoggerService) {}

  async findAll() {
    this.logger.debug('Fetching all users', 'UsersService');
    const users = []; // Your database query
    this.logger.log('Found users', 'UsersService', { count: users.length });
    return users;
  }

  async findOne(id: string) {
    this.logger.debug('Fetching user by ID', 'UsersService', { userId: id });

    try {
      const user = { id, name: 'John Doe' }; // Your database query
      this.logger.log('User found', 'UsersService', { userId: id });
      return user;
    } catch (error) {
      this.logger.error('Failed to fetch user', error.stack, 'UsersService', { userId: id });
      throw error;
    }
  }

  async create(createUserDto: any) {
    this.logger.log('Creating new user', 'UsersService', {
      email: createUserDto.email,
    });

    try {
      const user = { id: Date.now(), ...createUserDto }; // Your database insert
      this.logger.log('User created successfully', 'UsersService', {
        userId: user.id,
      });
      return user;
    } catch (error) {
      this.logger.error('Failed to create user', error.stack, 'UsersService', {
        email: createUserDto.email,
      });
      throw error;
    }
  }
}

// users.controller.ts
import { Controller, Get, Post, Body, Param } from '@nestjs/common';
import { UsersService } from './users.service';

@Controller('users')
export class UsersController {
  constructor(private readonly usersService: UsersService) {}

  @Get()
  findAll() {
    return this.usersService.findAll();
  }

  @Get(':id')
  findOne(@Param('id') id: string) {
    return this.usersService.findOne(id);
  }

  @Post()
  create(@Body() createUserDto: any) {
    return this.usersService.create(createUserDto);
  }
}

Example 3: Background Job Processor

import { HeizenLogger } from '@heizen/logger';

class JobProcessor {
  private logger: HeizenLogger;

  constructor() {
    this.logger = new HeizenLogger({
      projectId: 'job-processor-001',
      projectName: 'Background Jobs',
      environment: 'production',
      apiKey: process.env.HEIZEN_API_KEY!,
      endpoint: 'https://api.heizen.com/logs',
      batchingConfig: {
        flushIntervalMs: 2000,
        batchSize: 50,
        maxQueueSize: 5000,
      },
    });
  }

  async processJob(job: any) {
    const startTime = Date.now();

    this.logger.info('Job processing started', {
      jobId: job.id,
      jobType: job.type,
      priority: job.priority,
    });

    try {
      await this.simulateWork(job);

      const duration = Date.now() - startTime;
      this.logger.info('Job completed successfully', {
        jobId: job.id,
        duration,
        result: 'success',
      });
    } catch (error) {
      const duration = Date.now() - startTime;
      this.logger.error('Job failed', {
        jobId: job.id,
        duration,
        error: error.message,
        stack: error.stack,
      });
      throw error;
    }
  }

  private async simulateWork(job: any): Promise<void> {
    const workTime = Math.random() * 1000 + 500;
    await new Promise((resolve) => setTimeout(resolve, workTime));

    if (Math.random() < 0.1) {
      throw new Error('Random job failure');
    }
  }

  async shutdown() {
    this.logger.info('Job processor shutting down');
    await this.logger.shutdown();
  }
}

// Usage
const processor = new JobProcessor();

setInterval(async () => {
  const job = {
    id: Date.now(),
    type: 'email',
    priority: Math.floor(Math.random() * 5) + 1,
  };

  try {
    await processor.processJob(job);
  } catch (error) {
    console.error('Job processing failed:', error.message);
  }
}, 1000);

process.on('SIGTERM', async () => {
  await processor.shutdown();
  process.exit(0);
});

Contributing

We welcome contributions! Please see our Contributing Guide for details.

Development Setup

# Clone the repository
git clone https://github.com/heizen-team/heizen-logger.git
cd heizen-logger

# Install dependencies
pnpm install

# Run tests
pnpm test

# Run type checking
pnpm run type-check

# Build the project
pnpm run build

# Start mock server for testing
pnpm run mock-server

Running Tests

# Run all tests
pnpm test

# Run tests with coverage
pnpm run test:coverage

# Run tests in watch mode
pnpm run test:watch

License

MIT License. See LICENSE for details.


Made with ❤️ by the Heizen Team