
knox-mpu-alt

v0.2.2

Provides multipart upload functionality for Amazon S3 using the knox library. Forked from knox-mpu.


knox-mpu

Forked from knox-mpu.

A Node.js client designed to make large file uploads to Amazon S3 via the MultiPartUpload API simple and easy. It's built on top of the excellent Knox library from the guys over at LearnBoost.

Features

  • Simple and easy to use
  • Pipe either a file or a stream directly to S3 (no need to know the content length first!)
  • Automatically separates a file/stream into appropriately sized segments for upload
  • Asynchronous uploading of segments
  • Handy events to track your upload progress

Planned

  • Better error handling (re-uploading failed parts, etc.)

Installing

Installation is done via npm by running npm install knox-mpu-alt

Examples

Uploading a stream

To upload a stream, simply pass the stream when constructing the MultiPartUpload. The upload will then listen to the stream and create parts from the incoming data. When a part reaches the minimum part size, the upload will attempt to send it to S3.


// Load knox and knox-mpu-alt, then create a Knox client first
var knox = require('knox'),
    MultiPartUpload = require('knox-mpu-alt'),
    client = knox.createClient({ ... }),
    upload = null;


upload = new MultiPartUpload(
            {
                client: client,
                objectName: 'destination.txt', // Amazon S3 object name
                stream: stream // any readable stream to upload
            },
            // Callback handler
            function(err, body) {
                // If successful, will return body, containing Location, Bucket, Key, ETag and size of the object
                /*
                  {
                      Location: 'http://Example-Bucket.s3.amazonaws.com/destination.txt',
                      Bucket: 'Example-Bucket',
                      Key: 'destination.txt',
                      ETag: '"3858f62230ac3c915f300c664312c11f-9"',
                      size: 7242880
                  }
                */
            }
        );
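
For illustration only, here is one way the stream above might be produced. The HTTP server and object name below are hypothetical, and the client and MultiPartUpload are assumed to be the ones required in the snippet above.


// Hypothetical sketch: pipe an incoming HTTP request body straight to S3.
// Assumes `client` and `MultiPartUpload` from the example above.
var http = require('http');

http.createServer(function(req, res) {
    new MultiPartUpload(
        {
            client: client,
            objectName: 'uploads/incoming.txt', // hypothetical destination key
            stream: req // the request is a readable stream
        },
        function(err, body) {
            if (err) {
                res.statusCode = 500;
                return res.end('Upload failed');
            }
            res.end('Uploaded to ' + body.Location);
        }
    );
}).listen(3000);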

Uploading a file

To upload a file, pass the path to the file in the constructor. Knox-mpu will split the file into parts and upload them.


// Load knox and knox-mpu-alt, then create a Knox client first
var knox = require('knox'),
    MultiPartUpload = require('knox-mpu-alt'),
    client = knox.createClient({ ... }),
    upload = null;


upload = new MultiPartUpload(
            {
                client: client,
                objectName: 'destination.txt', // Amazon S3 object name
                file: ... // path to the file
            },
            // Callback handler
            function(err, body) {
                // If successful, will return body, containing Location, Bucket, Key, ETag and size of the object
                /*
                  {
                      Location: 'http://Example-Bucket.s3.amazonaws.com/destination.txt',
                      Bucket: 'Example-Bucket',
                      Key: 'destination.txt',
                      ETag: '"3858f62230ac3c915f300c664312c11f-9"',
                      size: 7242880
                  }
                */
            }
        );

Options

The following options can be passed to the MultiPartUpload constructor (a combined example sketch follows the list) -

  • client Required The knox client to use for this upload request
  • objectName Required The destination object name/path on S3 for this upload
  • stream The stream to upload (required if file is not being supplied)
  • file The path to the file (required if stream is not being supplied)
  • headers Any additional headers to include on the requests
  • partSize The minimum size of the parts to upload (defaults to 5MB).
  • batchSize The maximum number of concurrent parts that can be uploading at any one time (default is 4)
  • maxUploadSize The maximum size of the file to upload (defaults to infinity). Useful when uploading a stream of unknown length.
  • noDisk If true, parts will be kept in memory instead of written to temp files (defaults to false).
  • maxRetries Number of times to retry a failed part upload (defaults to 0, i.e. no retries).
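
As a sketch of how several of these options can be combined in one constructor call, the values below are arbitrary placeholders rather than recommendations, and the file path and header are hypothetical.


// Sketch combining several options (arbitrary example values, hypothetical paths).
// Assumes `client` and `MultiPartUpload` from the earlier examples.
var upload = new MultiPartUpload(
    {
        client: client,                      // knox client
        objectName: 'backups/archive.tar',   // destination key on S3
        file: '/tmp/archive.tar',            // local file to upload
        headers: { 'x-amz-acl': 'private' }, // extra headers sent with each request
        partSize: 10 * 1024 * 1024,          // 10MB parts instead of the 5MB default
        batchSize: 2,                        // at most 2 parts uploading at once
        maxRetries: 3,                       // retry a failed part up to 3 times
        noDisk: true                         // keep parts in memory, skip temp files
    },
    function(err, body) {
        if (err) throw err;
        console.log('Uploaded to', body.Location);
    }
);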

Events

The MultiPartUpload will emit a number of events (a listener sketch follows the list) -

  • initiated Emitted when the multipart upload has been initiated and an upload ID has been received. The upload ID is passed as the first argument to the event
  • uploading Emitted each time a part starts uploading. The part id is passed as the first argument.
  • uploaded Emitted each time a part finishes uploading. Passes through an object containing the part id and Amazon ETag for the uploaded part.
  • error Emitted each time a part upload fails. Passes an object containing the part id and error message.
  • completed Emitted when the upload has completed successfully. Contains the object information from Amazon S3 (location, bucket, key and ETag).
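
Assuming MultiPartUpload behaves as a standard Node.js EventEmitter (which the list above implies), progress could be tracked along these lines, reusing the upload object from the earlier examples.


// Sketch of listening to the documented events; assumes MultiPartUpload is an
// EventEmitter and that `upload` is the object created in the earlier examples.
upload.on('initiated', function(uploadId) {
    console.log('Multipart upload initiated, id:', uploadId);
});

upload.on('uploading', function(partId) {
    console.log('Uploading part', partId);
});

upload.on('uploaded', function(part) {
    console.log('Uploaded part', part); // contains the part id and its ETag
});

upload.on('error', function(err) {
    console.error('Part upload failed:', err); // contains the part id and error message
});

upload.on('completed', function(info) {
    console.log('Upload completed:', info); // location, bucket, key and ETag
});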