
Winston Batch Transport


A robust Winston transport that batches logs and sends them to a specified API endpoint with support for retries, compression, and concurrent batch processing.

Table of Contents

  • Features
  • Installation
  • Quick Start
  • Configuration Options
  • Advanced Usage
  • API Reference
  • Error Handling
  • Best Practices
  • Troubleshooting
  • Contributing
  • License

Features

  • Batch log processing with a configurable batch size
  • Automatic retry mechanism with exponential backoff
  • Concurrent batch processing support
  • Compression support for reduced network bandwidth
  • Local backup for failed log entries
  • Configurable request timeouts
  • Log validation and sanitization

Installation

npm install winston-batch-transport
# or
yarn add winston-batch-transport
# or
pnpm add winston-batch-transport

Quick Start

import winston from 'winston';
import BatchTransport from 'winston-batch-transport';

const logger = winston.createLogger({
  transports: [
    new BatchTransport({
      batchSize: 100,
      flushInterval: 5000,
      apiUrl: 'https://your-logging-api.com/logs'
    })
  ]
});

logger.info('Hello, World!');

Configuration Options

| Option | Type | Default | Description |
|--------|------|---------|-------------|
| batchSize | number | Required | Maximum number of logs to batch before sending |
| flushInterval | number | Required | Interval in milliseconds to flush logs regardless of batch size |
| apiUrl | string | Required | The endpoint where logs will be sent |
| apiKey | string | Optional | API key for authentication (sent as Bearer token) |
| retryLimit | number | 3 | Maximum number of retry attempts for failed requests |
| backoffFactor | number | 1000 | Base milliseconds for exponential backoff between retries |
| backupFilePath | string | './unsent-logs.json' | Path to store logs that failed to send after all retries |
| requestTimeout | number | 5000 | Timeout in milliseconds for API requests |
| maxConcurrentBatches | number | 3 | Maximum number of batches that can be sent simultaneously |
| useCompression | boolean | false | Enable gzip compression for log batches |
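
For reference, here is a transport with every option from the table set explicitly. The values are the documented defaults (or placeholders for the required and optional fields), not tuning recommendations.

const transport = new BatchTransport({
  batchSize: 100,                               // required
  flushInterval: 5000,                          // required, in milliseconds
  apiUrl: 'https://your-logging-api.com/logs',  // required
  apiKey: 'your-api-key-here',                  // optional, sent as a Bearer token
  retryLimit: 3,
  backoffFactor: 1000,
  backupFilePath: './unsent-logs.json',
  requestTimeout: 5000,
  maxConcurrentBatches: 3,
  useCompression: false
});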

Advanced Usage

With Compression

const transport = new BatchTransport({
  batchSize: 100,
  flushInterval: 5000,
  apiUrl: 'https://your-logging-api.com/logs',
  useCompression: true
});

With API Key Authentication

const transport = new BatchTransport({
  batchSize: 100,
  flushInterval: 5000,
  apiUrl: 'https://your-logging-api.com/logs',
  apiKey: 'your-api-key-here'
});

With Custom Retry Settings

const transport = new BatchTransport({
  batchSize: 100,
  flushInterval: 5000,
  apiUrl: 'https://your-logging-api.com/logs',
  retryLimit: 5,
  backoffFactor: 2000
});

With Concurrent Batch Processing

const transport = new BatchTransport({
  batchSize: 100,
  flushInterval: 5000,
  apiUrl: 'https://your-logging-api.com/logs',
  maxConcurrentBatches: 5
});

API Reference

Log Format

Each log entry follows this structure:

interface LogEntry {
  level: string;      // Log level (e.g., 'info', 'error')
  message: string;    // Log message
  timestamp: string;  // ISO 8601 timestamp
}
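
For example, the entry produced by logger.info('Hello, World!') takes roughly this shape (the timestamp shown is illustrative):

{
  "level": "info",
  "message": "Hello, World!",
  "timestamp": "2024-01-15T12:34:56.789Z"
}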

Methods

constructor(opts: BatchTransportOptions)

Initializes a new BatchTransport instance with the specified options. Note that asynchronous initialization is handled by the init method.

init(): Promise<void>

Asynchronously initializes the transport, loading any backed-up logs. This method must be called after the constructor.

Example Usage:

const transport = new BatchTransport({
  batchSize: 100,
  flushInterval: 5000,
  apiUrl: 'https://your-logging-api.com/logs'
});

async function initialize() {
  await transport.init();
  console.log('Batch transport initialized.');
}

initialize();

log(info: any, callback: () => void)

Adds a log entry to the queue. Called internally by Winston.

close(): Promise<void>

Asynchronously cleans up resources and ensures all pending logs are processed before shutdown.

Example Usage:

// In your application shutdown logic.
// batchTransportInstance is the BatchTransport created when configuring your logger.
let closed = false;
process.on('beforeExit', async () => {
  if (closed) return; // 'beforeExit' can fire again once the async flush completes
  closed = true;
  console.log('Application is shutting down. Flushing remaining logs...');
  await batchTransportInstance.close();
  console.log('All logs flushed. Goodbye!');
});

Error Handling

The transport handles errors in multiple ways:

  1. Retry Mechanism: Failed requests are retried with exponential backoff (see the sketch after this list)
  2. Backup Storage: Logs that fail after all retries are stored locally
  3. Validation: Logs are validated and sanitized before sending
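
The exact backoff formula is internal to the transport; as an illustration of the behavior described in point 1, a typical exponential scheme driven by the retryLimit and backoffFactor options looks like this (a sketch only, not the library's source):

// Illustrative sketch: the delay grows exponentially with each attempt.
// With backoffFactor = 1000, retries wait roughly 1s, 2s, 4s, ...
function backoffDelay(attempt, backoffFactor = 1000) {
  return backoffFactor * 2 ** attempt;
}

async function sendWithRetries(sendBatch, retryLimit = 3, backoffFactor = 1000) {
  for (let attempt = 0; attempt <= retryLimit; attempt++) {
    try {
      return await sendBatch();
    } catch (err) {
      // After the final retry the caller would write the batch to the backup file.
      if (attempt === retryLimit) throw err;
      await new Promise((resolve) => setTimeout(resolve, backoffDelay(attempt, backoffFactor)));
    }
  }
}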

Best Practices

  1. Batch Size: Choose a batch size that balances between latency and throughput
  2. Flush Interval: Set based on your application's log volume and latency requirements
  3. Compression: Enable for large log volumes or bandwidth-constrained environments
  4. Concurrent Batches: Adjust based on your API endpoint's capacity
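
Putting these practices together, a hypothetical configuration for a high-volume, bandwidth-constrained service might look like the following; treat the numbers as a starting point to tune against your own workload and endpoint:

const transport = new BatchTransport({
  batchSize: 500,              // larger batches trade a little latency for throughput
  flushInterval: 2000,         // keeps worst-case delivery delay bounded at low volume
  apiUrl: 'https://your-logging-api.com/logs',
  useCompression: true,        // worthwhile once batches are large
  maxConcurrentBatches: 2      // stay within what the endpoint can absorb
});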

Troubleshooting

Common Issues

  1. High Memory Usage

    • Reduce batch size
    • Decrease flush interval
    • Enable compression
  2. Lost Logs

    • Check backup file location
    • Increase retry limit
    • Verify API endpoint stability
  3. Poor Performance

    • Adjust concurrent batches
    • Optimize batch size
    • Enable compression

Contributing

We welcome contributions! Please see our Contributing Guide for more details.

License

This project is licensed under the MIT License - see the LICENSE file for details.