webcodecs-polyfill
A polyfill for the WebCodecs API for use in server-side JavaScript environments such as Node, Deno, and Bun.
If this library provided value to you, please consider sponsoring my work! 💖
What is this?
This project implements a (partial) polyfill for the WebCodecs API for server-side JavaScript environments. It enables access to hardware-accelerated video and audio decoding and encoding through a relatively simple JavaScript API, instead of having to call out to FFmpeg child processes manually.
This opens the possibility of writing isomorphic code, meaning code that functions both in the browser and on the server. As a result, libraries that build on top of the WebCodecs API (such as Mediabunny, please star) can also run on the server.
Does it work?
Seems to work pretty well! Video and audio transcoding have been tested on multiple platforms, using different hardware acceleration backends, and it runs pretty fast, too! One way this was tested was by using the Mediabunny Conversion API, which hits virtually all features of the WebCodecs API.
How it works
This library reimplements the classes and interfaces described in the WebCodecs API spec in pure TypeScript. For the heavy lifting of doing the actual encoding and decoding, node-av is used, which provides access to FFmpeg's C API through N-API. Using the C API directly instead of the CLI has several benefits: there are no additional requirements for running this library other than the JS runtime, and the code can be better optimized; for example, a zero-copy code path is used for transcode (decode + encode) operations.
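To make the transcode case concrete, here is a minimal sketch of such a loop written purely against the (polyfilled) WebCodecs classes; the codec strings are placeholders, and where the encoded chunks come from (e.g. a demuxer such as Mediabunny) and where the re-encoded chunks go are left open:

import { polyfillWebCodecsApi } from 'webcodecs-polyfill';
polyfillWebCodecsApi();

async function transcode(chunks, decoderConfig) {
    const encoder = new VideoEncoder({
        output: (chunk, metadata) => { /* Pass the re-encoded chunk to a muxer */ },
        error: (e) => console.error(e),
    });
    encoder.configure({ codec: 'vp8', width: 1280, height: 720, bitrate: 1_000_000 });

    const decoder = new VideoDecoder({
        output: (frame) => {
            encoder.encode(frame); // Decoded frames go straight back into the encoder
            frame.close(); // Release the frame once it has been handed off
        },
        error: (e) => console.error(e),
    });
    decoder.configure(decoderConfig); // e.g. { codec: 'avc1.42001f', description: ... }

    for (const chunk of chunks) decoder.decode(chunk);
    await decoder.flush();
    await encoder.flush();
    decoder.close();
    encoder.close();
}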
node-av, the only dependency of this library, was essential to its creation and provides an excellent and super modern wrapper for FFmpeg's API - make sure to give them a star!
Shoutouts also go to libavjs-webcodecs-polyfill, which pioneered the buzzing "WebCodecs polyfill scene" and was a useful reference for developing this library.
Reflections
This library is my take on @vjeux's WebCodecs Node.js $10k Challenge. Running Mediabunny on the server was always something in the back of my mind, and this challenge gave me the motivation to finally play around with it. While getting the initial decode/encode cycle working was rather quick, the devil is, as always, in the details, and this is especially true for the WebCodecs API. The spec is very precise about how things should behave (especially with error types, tasks running in the microtask queue, etc.), so I took great care to follow it exactly. Some interfaces, such as VideoFrame, also have a very large set of algorithms that must be implemented for them to behave to-spec, which was annoying to put together. At the end of the day, however, I'd say this library's implementation is, to the best of my ability, spec-compliant.
Connecting Mediabunny with FFmpeg's codec functionality directly is probably the easier and more pragmatic path; you just need to write the code to solve the problem instead of also having to conform to the WebCodecs spec at the same time. However, there is also value in having the base WebCodecs API that Mediabunny uses exposed for everyone; that way, people are free to use it however they wish.
Usage
Install the thing:
npm install webcodecs-polyfill
yarn add webcodecs-polyfill
pnpm add webcodecs-polyfill
bun add webcodecs-polyfill
Then, polyfill the WebCodecs API on the global object by running:
import { polyfillWebCodecsApi } from 'webcodecs-polyfill';
polyfillWebCodecsApi(); // That's it!
This exposes the following objects globally if they didn't already exist:
AudioDecoder
VideoDecoder
VideoEncoder
AudioEncoder
EncodedVideoChunk
EncodedAudioChunk
VideoFrame
VideoColorSpace
AudioData
DOMRectReadOnly
If you don't want to "pollute" the global object, you may also import and use the polyfills directly:
import {
AudioDecoderPolyfill,
VideoDecoderPolyfill,
VideoEncoderPolyfill,
AudioEncoderPolyfill,
EncodedVideoChunkPolyfill,
EncodedAudioChunkPolyfill,
VideoFramePolyfill,
VideoColorSpacePolyfill,
AudioDataPolyfill,
DOMRectReadOnlyPolyfill,
} from 'webcodecs-polyfill';
Additional types beyond the above objects are not provided by this library. For that, use the regular TypeScript DOM typings (recommended) or @types/dom-webcodecs.
For actually using the WebCodecs API, check out MDN's great documentation.
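As a quick taste, here is a minimal sketch (not taken from the library's docs) that polyfills the globals and then encodes one second of silent PCM audio to Opus; the buffer contents and configuration values are just placeholders:

import { polyfillWebCodecsApi } from 'webcodecs-polyfill';
polyfillWebCodecsApi();

// One second of silent mono PCM at 48 kHz, purely illustrative
const samples = new Float32Array(48000);

const audioData = new AudioData({
    format: 'f32',
    sampleRate: 48000,
    numberOfChannels: 1,
    numberOfFrames: samples.length,
    timestamp: 0, // In microseconds
    data: samples,
});

const encoder = new AudioEncoder({
    output: (chunk, metadata) => { /* Hand the encoded chunk to a muxer, write it to disk, etc. */ },
    error: (e) => console.error(e),
});
encoder.configure({ codec: 'opus', sampleRate: 48000, numberOfChannels: 1 });

encoder.encode(audioData);
audioData.close();
await encoder.flush();
encoder.close();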
Feature set
Interfaces
These classes, which are basically the core of how the WebCodecs API is typically used, are fully implemented:
AudioDecoder
VideoDecoder
VideoEncoder
AudioEncoder
EncodedVideoChunk
EncodedAudioChunk
VideoFrame
VideoColorSpace
AudioData
These, however, are not implemented at all:
ImageDecoder
ImageTrackList
ImageTrack
I consider image processing to be a more niche part of this API and didn't want to add unnecessary complexity to this polyfill. Image processing capability is already plentiful in Node.js, for example using sharp.
Codec support
All video and audio codecs listed in the WebCodecs Codec Registry are supported by this library, bidirectionally (encode and decode). Encoded chunk data and description byte formats follow the spec for all codecs.
Support for hardware-accelerated decoding and encoding is automagically probed and enabled if possible.
Codec operations, when they do run on the CPU, are typically multithreaded and therefore don't block the initiator thread (the thread on which the library runs).
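For example, support for a particular configuration can be checked up front with the standard isConfigSupported static methods; the two codec strings below are just sample entries from the registry:

import { polyfillWebCodecsApi } from 'webcodecs-polyfill';
polyfillWebCodecsApi();

// AV1 (Main Profile, level 3.0, 8-bit) video encoding
const videoSupport = await VideoEncoder.isConfigSupported({
    codec: 'av01.0.04M.08',
    width: 1920,
    height: 1080,
    bitrate: 5_000_000,
    framerate: 30,
});
console.log(videoSupport.supported);

// AAC-LC audio decoding
const audioSupport = await AudioDecoder.isConfigSupported({
    codec: 'mp4a.40.2',
    sampleRate: 44100,
    numberOfChannels: 2,
});
console.log(audioSupport.supported);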
Missing features
A couple of features have been omitted:
- scalabilityMode video encoder option; too niche.
- alpha: 'keep' video encoder option; no browser currently supports this option. Functionality can be polyfilled by running two encoders/decoders in parallel, one for the color data and another for transparency data. Mediabunny does it like this.
- bitrateMode: 'quantizer' video encoder option; use variable or constant bitrate for now.
- Initializing a VideoFrame from a canvas, or drawing to a canvas. Canvases don't exist in Node.js, so this functionality is impossible. When including a canvas polyfill library for Node, this functionality can be added by converting the canvas to an RGBA buffer when creating a VideoFrame, and by overwriting the .drawImage() method to add support for VideoFrame arguments (see the sketch after this list).
- transfer option for the VideoFrame and AudioData constructors.
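As a sketch of that canvas-free path: a VideoFrame can be constructed directly from a raw RGBA buffer, which is what a canvas polyfill would hand over after exporting its pixels. The solid-red pixel data here is just a stand-in:

import { polyfillWebCodecsApi } from 'webcodecs-polyfill';
polyfillWebCodecsApi();

const width = 640;
const height = 480;
const rgba = new Uint8Array(width * height * 4);
for (let i = 0; i < rgba.length; i += 4) {
    rgba[i] = 255;     // R
    rgba[i + 3] = 255; // A
}

const frame = new VideoFrame(rgba, {
    format: 'RGBA',
    codedWidth: width,
    codedHeight: height,
    timestamp: 0, // In microseconds
});

// ... encode or inspect the frame ...
frame.close();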
License
Licensed under the Mozilla Public License 2.0. Usage is free, including commercial usage, without any warranty or liability. However, all code of this library must retain the license header when reused elsewhere, and any distributed forks must remain public in the spirit of open-source software.
Development
Nothing crazy:
npm install # For installing dependencies
npm run test # For running tests with Vitest
npm run check # Check TypeScript types
npm run lint # Run ESLint
npm run build # Create the 'dist' folder