@eyevinn/shaka-packager-s3

v0.8.0

Shaka packager with S3

shaka-packager-s3

CLI and library for creating a streaming bundle from an ABR bundle using shaka-packager. Input and output can be in S3 buckets.

Requirements

The shaka-packager executable must be available in the PATH under the name packager. When using S3 for input or output, the AWS CLI must be installed and configured.
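
A quick way to confirm both requirements are met is to check that the executables resolve and that AWS credentials are configured (a suggested sanity check, not part of the tool):

# confirm the shaka-packager executable is found in PATH under the name packager
which packager
# confirm the AWS CLI is installed and see which credentials/region are active
aws --version
aws configure list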

Usage

CLI

> npm install -g @eyevinn/shaka-packager-s3
> shaka-packager-s3 -s s3://source-bucket/folder -d s3://output-bucket/folder -i a:1=audio.mp4 -i v:1=video.mp4
> shaka-packager-s3 -s /path/to/source/folder -d /path/to/output/folder -i a:1=audio.mp4 -i v:1=video.mp4

Input format:
    [a|v|t]:<key>=<filename>[:hlsName]
    where a = audio, v = video, t = text (subtitles)
    e.g. t:sv=subs.vtt:Swedish
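
For example, an audio track, a video track and a Swedish subtitle track could be packaged in one run like this (bucket names and filenames are placeholders; -s and -d are described in the option list further down):

> shaka-packager-s3 \
    -i a:1=audio.mp4 \
    -i v:1=video.mp4 \
    -i t:sv=subs.vtt:Swedish \
    -s s3://source-bucket/folder \
    -d s3://output-bucket/folder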

Library
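
To use the package as a library, add it as a dependency in your project (shown here with npm):

npm install @eyevinn/shaka-packager-s3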

import { Input, doPackage } from '@eyevinn/shaka-packager-s3';

// One audio track and one video track, both read from the source folder
const inputs: Input[] = [
  {
    type: 'audio',
    key: '1',
    filename: 'audio.mp4'
  },
  {
    type: 'video',
    key: '1',
    filename: 'video.mp4'
  }
];

// Destination can be a local folder or an S3 URL
const dest = '/my/output/folder';

doPackage({
  dest,
  inputs
})
  .then(() => {
    console.log('done');
  })
  .catch((err) => {
    console.error(err);
  });

Docker

docker build -t shaka-packager-s3:local .

Package an ABR bundle on S3 and upload to another S3 bucket

docker run --rm \
  -e AWS_ACCESS_KEY_ID=<aws-access-key-id> \
  -e AWS_SECRET_ACCESS_KEY=<aws-secret-access-key> \
  shaka-packager-s3:local \
  shaka-packager-s3 -s s3://source/abr -d s3://dest/vod \
  -i a:audio=snaxax_STEREO.mp4 \
  -i v:324=snaxax_x264_324.mp4 \
  -i v:1312=snaxax_x264_1312.mp4 \
  -i v:2069=snaxax_x264_2069.mp4 \
  -i v:3100=snaxax_x264_3100.mp4
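
If source and destination are local folders instead of S3 buckets, they can be mounted into the container (a sketch; the host paths are placeholders):

docker run --rm \
  -v /path/to/source:/input \
  -v /path/to/output:/output \
  shaka-packager-s3:local \
  shaka-packager-s3 -i a:1=audio.mp4 -i v:1=video.mp4 \
  -s /input -d /output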

Development

Prerequisites:

  • shaka-packager
  • AWS CLI

Install Node dependencies

npm install

Build

npm run build

Run script locally

% node dist/cli.js -h
Usage: cli [options]

Run shaka-packager with source on S3 or locally, and output to S3 or local

  Examples:
    $ shaka-packager-s3 -i a:1=audio.mp4 -i v:1=video.mp4 -s s3://source-bucket/folder -d s3://output-bucket/folder
    $ shaka-packager-s3 -i a:1=audio.mp4 -i v:1=video.mp4 -s /path/to/source/folder -d /path/to/output/folder
    $ shaka-packager-s3 -i a:2=audio.mp4 -i v:1=video.mp4 -s /path/to/source/folder -d /path/to/output/folder --segment-single-file --segment-single-file-name 'Container$KEY$.mp4' --segment-duration 3.84



Options:
  -s, --source-folder [sourceFolder]                  Source folder URL, ignored if input uses absolute path (supported protocols: s3, local file)
  -i, --input [inputOptions...]                       Input options on the format: [a|v|t]:<key>=<filename>[:hlsName]
  --staging-dir [stagingDir]                          Staging directory (default: /tmp/data)
  --shaka-executable [shakaExecutable]                Path to shaka-packager executable, defaults to 'packager'. Can also be set with environment variable SHAKA_PACKAGER_EXECUTABLE.
  --no-implicit-audio [noImplicitAudio]               Do not include audio unless audio input specified
  -d, --destination-folder <dest>                     Destination folder URL (supported protocols: s3, local file). Defaults to CWD.
  --endpoint-url [s3EndpointUrl]                      S3 endpoint URL
  --dash-only                                         Package only DASH format
  --hls-only                                          Package only HLS format
  --segment-single-file                               Use byte range addressing and a single segment file per stream
  --segment-single-file-name [segmentSingleFileName]  Template for single segment file name, $KEY$ will be replaced with stream key
  --segment-duration [segmentDuration]                Segment target duration
  --ts-output                                         Output TS (.ts for video, .aac for audio) segments instead of fragmented MP4 (CMAF)
  -h, --help                                          display help for command
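
As an illustration of how these options combine, the run below (paths and endpoint are placeholders) packages HLS only, outputs TS segments and talks to an S3-compatible endpoint:

% node dist/cli.js \
    -i a:1=audio.mp4 -i v:1=video.mp4 \
    -s s3://source-bucket/folder \
    -d s3://output-bucket/folder \
    --hls-only --ts-output \
    --endpoint-url https://s3.example.com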

Support

Join our community on Slack where you can post any questions regarding any of our open source projects. Eyevinn's consulting business can also offer you:

  • Further development of this component
  • Customization and integration of this component into your platform
  • Support and maintenance agreement

Contact [email protected] if you are interested.

About Eyevinn Technology

Eyevinn Technology is an independent consultancy specializing in video and streaming. We are independent in the sense that we are not commercially tied to any platform or technology vendor. As our way of innovating and pushing the industry forward, we develop proof-of-concepts and tools, and we share what we learn and the code we write with the industry through blog posts and by open sourcing our code.

Want to know more about Eyevinn and what it is like to work here? Contact us at [email protected]!