# webgpu-waveform

v1.1.0
Render waveforms to `<canvas />` using [WebGPU](https://developer.mozilla.org/en-US/docs/Web/API/WebGPU_API)
## Examples

Visit https://aykev.dev/webgpu-waveform/ for examples.
## Usage

This package is distributed for use with both ESM and UMD, and includes TypeScript types. Install it from the npm registry:

```sh
npm i webgpu-waveform
```
There are three ways to use this library:

- as a class: `GPUWaveformRenderer` (vanilla JS)
- as a hook: `useWaveformRenderer` (React)
- as a component: `GPUWaveform` (React)
The class `GPUWaveformRenderer` is initialized using the static method `.create(...)`. It has the following definition:

```typescript
static async create(
  canvas: HTMLCanvasElement,
  channelData: Float32Array
): Promise<GPUWaveformRenderer>;
```
It takes in the following arguments:

- `canvas: HTMLCanvasElement` — the canvas element to render to
- `channelData: Float32Array` — the array of PCM samples to render
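Since `channelData` is just a `Float32Array` of PCM samples, you can exercise the renderer without decoding a real audio file. A minimal sketch (the `sineChannelData` helper is hypothetical, not part of this library):

```javascript
// Hypothetical helper: synthesize one second of a sine wave as a
// Float32Array of PCM samples in [-1, 1], usable as `channelData`.
function sineChannelData(sampleRate = 44100, frequency = 440) {
  const data = new Float32Array(sampleRate); // one second of samples
  for (let i = 0; i < data.length; i++) {
    data[i] = Math.sin((2 * Math.PI * frequency * i) / sampleRate);
  }
  return data;
}
```

The result can be passed directly as the second argument to `GPUWaveformRenderer.create(canvas, sineChannelData())`.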
Example:

```js
async function example(canvas, audioBuffer) {
  const channelData = audioBuffer.getChannelData(0);
  const renderer = await GPUWaveformRenderer.create(canvas, channelData);
  renderer?.render(800, 0, canvas.width, canvas.height);
}
```
The hook `useWaveformRenderer` has the following signature:

```typescript
function useWaveformRenderer(
  canvasRef: React.RefObject<HTMLCanvasElement>,
  audioBuffer: AudioBuffer
): RendererStatus;
```
It takes in the following arguments:

- `canvasRef: React.RefObject<HTMLCanvasElement>` — a ref to the canvas element to render to
- `audioBuffer: AudioBuffer` — the buffer to render
And returns an object of the following type:

```typescript
type RendererStatus =
  | { status: "initializing" }
  | { status: "error"; error: any }
  | { status: "ready"; instance: GPUWaveformRenderer };
```
The objects are returned during the following stages of initialization:

- `{ status: "initializing" }` — during setup, when connecting to the GPU device
- `{ status: "error"; error: any }` — if an error happens during initialization
- `{ status: "ready"; instance: GPUWaveformRenderer }` — if the WebGPU device could be initialized, setup was successful, and a renderer for `audioBuffer` on `canvas` could be successfully created
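Because `RendererStatus` is a discriminated union, checking the `status` field tells you which other fields are available. A small sketch of handling all three cases (the `statusLabel` helper is illustrative, not part of this library):

```javascript
// Illustrative helper: map a RendererStatus-shaped object to a UI label.
function statusLabel(renderer) {
  switch (renderer.status) {
    case "initializing":
      return "Connecting to the GPU device...";
    case "error":
      return `Failed to initialize: ${renderer.error}`;
    case "ready":
      return "Waveform renderer ready";
  }
}
```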
Example:

```tsx
import { useEffect, useRef } from "react";

function Example({ audioBuffer, width, height }) {
  const canvasRef = useRef<HTMLCanvasElement>(null);
  const renderer = useWaveformRenderer(canvasRef, audioBuffer);

  useEffect(() => {
    if (renderer.status !== "ready") {
      return;
    }
    renderer.instance.render(audioBuffer.length / width, 0, width, height);
  }, [renderer, audioBuffer, width, height]);

  return <canvas ref={canvasRef} width={width} height={height} />;
}
```
The component `GPUWaveform` takes the following properties:

- `audioBuffer: AudioBuffer` — the buffer to render
- `scale?: number` — the "zoom" level; namely, the number of samples per pixel on the x axis
- `offset?: number` — the number of samples to skip from the beginning of the buffer
- ...and all the props of `React.CanvasHTMLAttributes<HTMLCanvasElement>` — these are passed directly to the rendered canvas
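Together, `scale` and `offset` define the visible window of the buffer. A sketch of deriving both from a time range (the `waveformViewport` helper is hypothetical, not part of this library):

```javascript
// Hypothetical helper: derive the `scale` and `offset` props needed to
// show the window [startSec, endSec] of a buffer on a canvas of the
// given pixel width.
function waveformViewport(sampleRate, startSec, endSec, canvasWidth) {
  const offset = Math.round(startSec * sampleRate); // samples skipped from the start
  const windowSamples = (endSec - startSec) * sampleRate; // samples in view
  const scale = windowSamples / canvasWidth; // samples per pixel on the x axis
  return { scale, offset };
}
```

For example, showing one second of 44.1 kHz audio on a 441-pixel-wide canvas yields a `scale` of 100 samples per pixel.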
Example:

```tsx
export function Example({ audioBuffer }) {
  return (
    <GPUWaveform
      audioBuffer={audioBuffer}
      scale={800}
      width={300}
      height={100}
    />
  );
}
```
## Contributions

Feedback and PRs are welcome! If you send a PR, feel free to add yourself to the list of contributors below:

Contributors:
- Kevin Chavez (@aykev)