streamroller

v3.1.5

node.js file streams that roll over when they reach a maximum size or a date/time.

npm install streamroller

usage

const streamroller = require('streamroller');

// Roll 'myfile' to 'myfile.1' (and so on) once it reaches 1024 bytes,
// keeping at most 3 backup files.
const stream = new streamroller.RollingFileStream('myfile', 1024, 3);
stream.write('stuff');
stream.end();

The streams behave the same as standard node.js streams, except that when certain conditions are met they will rename the current file to a backup and start writing to a new file.
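
Because the result is a standard writable stream, the usual stream APIs (piping, events) apply as well. A small sketch, assuming only standard Node.js stream behavior; the file names are illustrative:

const fs = require('fs');
const { RollingFileStream } = require('streamroller');

const stream = new RollingFileStream('piped.log', 1024, 3);

// Pipe another readable stream into the rolling file stream, then
// wait for the standard 'finish' event once all data is written.
fs.createReadStream('input.txt').pipe(stream);
stream.on('finish', () => console.log('all data written'));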

new RollingFileStream(filename [, maxSize, numBackups, options])

  • filename <string>
  • maxSize <integer> - defaults to 0 - the size in bytes to trigger a rollover. If not specified or 0, then no log rolling will happen.
  • numBackups <integer> - defaults to 1 - the number of old files to keep (excluding the hot file)
  • options <Object>
    • encoding <string> - defaults to 'utf8'
    • mode <integer> - defaults to 0o600 (see node.js file modes)
    • flags <string> - defaults to 'a' (see node.js file flags)
    • compress <boolean> - defaults to false - compress the backup files using gzip (backup files will have .gz extension)
    • keepFileExt <boolean> - defaults to false - preserve the file extension when rotating log files (file.log becomes file.1.log instead of file.log.1).
    • fileNameSep <string> - defaults to '.' - the filename separator when rolling. e.g.: abc.log.1 or abc.1.log (keepFileExt)

This returns a WritableStream. When the current file being written to (given by filename) gets up to or larger than maxSize, the current file is renamed to filename.1 and a new file is started. Up to numBackups old files are maintained, so if numBackups is 3 there will be 4 files:

  • filename
  • filename.1
  • filename.2
  • filename.3

When filename size >= maxSize, the files are shifted:

  • filename.3 is removed
  • filename.2 is renamed to filename.3
  • filename.1 is renamed to filename.2
  • filename is renamed to filename.1, and a new filename is started
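
For example, a sketch combining the options documented above; the file name, size threshold, and resulting backup names (app.1.log.gz and so on, assuming compress and keepFileExt combine as described) are illustrative:

const { RollingFileStream } = require('streamroller');

// Roll app.log once it reaches ~1 MB, keep 3 gzipped backups,
// and preserve the .log extension when renaming.
const stream = new RollingFileStream('app.log', 1024 * 1024, 3, {
  compress: true,    // backup files get a .gz extension
  keepFileExt: true, // app.1.log rather than app.log.1
});

stream.write('a log line\n');
stream.end();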

new DateRollingFileStream(filename [, pattern, options])

  • filename <string>
  • pattern <string> - defaults to yyyy-MM-dd - the date pattern to trigger rolling (see below)
  • options <Object>
    • encoding <string> - defaults to 'utf8'
    • mode <integer> - defaults to 0o600 (see node.js file modes)
    • flags <string> - defaults to 'a' (see node.js file flags)
    • compress <boolean> - defaults to false - compress the backup files using gzip (backup files will have .gz extension)
    • keepFileExt <boolean> - defaults to false - preserve the file extension when rotating log files (file.log becomes file.2017-05-30.log instead of file.log.2017-05-30).
    • fileNameSep <string> - defaults to '.' - the filename separator when rolling. e.g.: abc.log.2013-08-30 or abc.2013-08-30.log (keepFileExt)
    • alwaysIncludePattern <boolean> - defaults to false - include the formatted date pattern in the name of the initial (hot) file as well, not just the backups
    • numBackups <integer> (formerly daysToKeep) - defaults to 1 - the number of old files that match the pattern to keep (excluding the hot file)
    • maxSize <integer> - defaults to 0 - the size in bytes to trigger a rollover. If not specified or 0, then no log rolling will happen.

This returns a WritableStream. When the current time, formatted as pattern, changes, the current file is renamed to filename.formattedDate, where formattedDate is the result of processing the date through the pattern, and a new file begins to be written. Streamroller uses date-format to format dates, so the pattern should use the date-format syntax.

For example, with a pattern of "yyyy-MM-dd" and assuming today is August 29, 2013, writing to the stream today will just write to filename. At midnight (or, more precisely, on the first file write after midnight), filename will be renamed to filename.2013-08-29 and a new filename will be created. If options.alwaysIncludePattern is true, the initial file will instead be filename.2013-08-29 and no renaming will occur at midnight; a new file named filename.2013-08-30 will be written to instead.

If maxSize is set, then when the current file being written to (given by filename) gets up to or larger than maxSize, it is renamed to filename.pattern.1 and a new file is started. Up to numBackups old files are maintained, so if numBackups is 3 there will be 4 files:

  • filename
  • filename.pattern.1
  • filename.pattern.2
  • filename.pattern.3
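
To make the date-based behavior concrete, a short sketch using the options documented above; the file name, pattern, and retention count are illustrative:

const { DateRollingFileStream } = require('streamroller');

// Roll daily. With alwaysIncludePattern the hot file already carries
// the date (e.g. audit.log.2013-08-29), and a new file is started on
// the first write after midnight.
const stream = new DateRollingFileStream('audit.log', 'yyyy-MM-dd', {
  alwaysIncludePattern: true,
  numBackups: 7, // keep a week of old files
});

stream.write('something happened\n');
stream.end();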