Audio Mixer Engine
A part-centric JavaScript audio library for mixer applications. Provides individual audio outputs per musical part (soprano, alto, piano, etc.) enabling per-part gain control, level monitoring, and flexible routing.
Features
- Part-centric audio: Individual AudioNode outputs per part with separate gain/analysis routing
- MIDI processing: Parse files, beat/bar mapping, tempo changes, metadata extraction
- Dual audio engines: SpessaSynth (soundfont, ~10-50MB) or Lightweight (samples, ~574KB bundle)
- Playback control: Metronome, lead-in, seeking, speed changes, bar navigation, real-time events
- Mixer-ready: Designed for solo/mute, routing, and level monitoring workflows
Installation
npm install audio-mixer-engine
Audio Engine Options
- SpessaSynthAudioEngine: Soundfont-based synthesis, requires separate soundfont file (~10-50MB)
- LightweightAudioEngine: Sample-based with smaller bundle (~574KB). See LIGHTWEIGHT_ENGINE.md for setup.
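A minimal selection sketch follows, assuming LightweightAudioEngine is exported from the package alongside SpessaSynthAudioEngine, takes an AudioContext, and exposes a comparable initialize() method; see LIGHTWEIGHT_ENGINE.md for its actual setup.
import { SpessaSynthAudioEngine, LightweightAudioEngine } from 'audio-mixer-engine';
// Assumption: both constructors take an AudioContext and expose initialize().
async function createEngine(audioContext, soundfontUrl) {
  if (soundfontUrl) {
    const engine = new SpessaSynthAudioEngine(audioContext);
    await engine.initialize(soundfontUrl); // fetches the ~10-50MB soundfont
    return engine;
  }
  const engine = new LightweightAudioEngine(audioContext);
  await engine.initialize(); // sample-based setup; see LIGHTWEIGHT_ENGINE.md
  return engine;
}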
Quick Start
import { SpessaSynthAudioEngine, PlaybackManager } from 'audio-mixer-engine';
const ac = new AudioContext();
// Initialize audio engine
const audioEngine = new SpessaSynthAudioEngine(ac);
await audioEngine.initialize('/path/to/soundfont.sf2');
// Create PlaybackManager
const manager = new PlaybackManager(audioEngine, {
metronome: { enabled: true, volume: 0.7 },
leadIn: { enabled: true, bars: 1 },
startup: { delayMs: 25 }
});
// Load MIDI file
await manager.load(midiArrayBuffer);
// Set up event listeners
manager.on('timeupdate', ({currentTime}) => console.log(currentTime));
manager.on('beatChanged', ({bar, beat}) => console.log(`Bar ${bar}, Beat ${beat}`));
// Setup audio routing (REQUIRED for audio output)
// Metronome
const metGain = ac.createGain();
metGain.gain.value = 0.5;
manager.getMetronomeOutput().connect(metGain).connect(ac.destination);
// Part outputs
for (const [partName, outputNode] of manager.getPartOutputs()) {
const gain = ac.createGain();
gain.gain.value = 0.75;
outputNode.connect(gain).connect(ac.destination);
}
// Start playback
await manager.play({ leadIn: true, metronome: true });
Key Concepts
Part-centric design: Each musical part gets an independent ChannelHandle with its own AudioNode output. Your application connects these outputs to gain controls, analyzers, and effects.
External routing required: The library provides individual part outputs but doesn't mix them internally. You must connect part outputs to AudioContext.destination (see Quick Start example).
Dual volume control:
- Internal: channelHandle.setVolume() - affects MIDI synthesis
- External: Your gain nodes - affects final output and enables solo/mute
PlaybackManager vs MidiPlayer: PlaybackManager adds metronome, lead-in, and convenience methods. MidiPlayer provides lower-level playback control.
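Because mixing happens in your own audio graph, solo and mute are just operations on the external gain nodes. A minimal sketch, assuming you kept each part's GainNode in a map while wiring up getPartOutputs() (the partGains map and part name below are illustrative, not part of the library API):
const partGains = new Map(); // partName -> GainNode, filled while connecting getPartOutputs()

function soloPart(soloName) {
  for (const [name, gain] of partGains) {
    // The external gain decides what reaches the speakers;
    // channelHandle.setVolume() would instead scale the MIDI synthesis itself.
    gain.gain.value = name === soloName ? 1.0 : 0.0;
  }
}

soloPart('soprano'); // hear only the soprano part; restore gains to un-solo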
API Overview
PlaybackManager
// Lifecycle
await manager.load(midiArrayBuffer, metadata, instrumentMap);
manager.reset();
// Transport
await manager.play({ leadIn: true, metronome: true });
manager.pause();
manager.resume();
manager.stop();
// Configuration
manager.setMetronomeEnabled(true);
manager.setMetronomeSettings({ volume: 0.8 });
manager.setLeadInEnabled(true);
manager.setLeadInBars(2);
manager.setStartupDelay(50);
// Audio routing
const metronomeOutput = manager.getMetronomeOutput();
for (const [partName, outputNode] of manager.getPartOutputs()) { /* ... */ }
const partNames = manager.getPartNames();
// State
const state = manager.getState(); // 'reset'|'ready'|'stopped'|'lead-in'|'playing'|'paused'
const isPlaying = manager.isPlaying();
const currentTime = manager.getCurrentTime();
// Preview next notes (pitch reference for singers)
const result = manager.previewNextNotes({
instrument: 'piano',
delayBetweenParts: 0.4,
duration: 0.6,
velocity: 110
});
// Events
manager.on('timeupdate', ({currentTime}) => {});
manager.on('beatChanged', ({bar, beat, isLeadIn, time}) => {});
manager.on('leadInStarted', ({bars, totalBeats, startupDelayMs}) => {});
manager.on('leadInEnded', () => {});
MidiPlayer
// Transport
player.play();
player.pause();
player.stop();
player.skipToTime(30.5);
player.setPlaybackSpeed(1.5);
// Musical navigation
player.setBar(4, 1); // Jump to bar 4, repeat 1
const time = player.getTimeFromBar(2);
const barInfo = player.getBeatFromTime(15.3);
// Part access
const outputNode = player.getPartOutput('soprano');
const channelHandle = player.getPartChannel('soprano');
// Controls
player.setMasterVolume(0.7);
player.allSoundsOff();
// Events
player.on('timeupdate', ({currentTime}) => {});
player.on('ended', ({finalTime}) => {});
player.on('barChanged', ({bar, beat, repeat, time}) => {});
ChannelHandle
// Output routing
const outputNode = channelHandle.getOutputNode();
// Note control
channelHandle.noteOn(pitch, velocity);
channelHandle.noteOff(pitch);
channelHandle.allNotesOff();
// Scheduled notes
const eventId = channelHandle.playNote(startTime, pitch, velocity, duration);
// Configuration
await channelHandle.setInstrument('choir_aahs'); // String name or MIDI program number
channelHandle.setVolume(0.8); // Internal MIDI volume
// Info
const partId = channelHandle.getPartId();
const isActive = channelHandle.isActive();
AudioEngine
// Initialization
await audioEngine.initialize('/path/to/soundfont.sf2');
const isReady = audioEngine.isInitialized;
// Progress tracking (SpessaSynth only)
audioEngine.on('initProgress', ({stage, message, progress}) => {
// Show progress bar for soundfont downloads (20+ seconds)
});
// Channel creation
const channelHandle = audioEngine.createChannel('partId', {
instrument: 'piano',
initialVolume: 1.0
});
// Controls
audioEngine.setMasterVolume(0.7);
audioEngine.allSoundsOff();
audioEngine.destroy();
Mixer Example
// Typical mixer integration
class AudioMixer {
  constructor(audioContext) {
    this.audioContext = audioContext;
    this.parts = new Map(); // partId -> { gain, analyzer }
    this.masterGain = audioContext.createGain();
    this.masterGain.connect(audioContext.destination);
  }

  async loadMidiFile(manager, midiBuffer) {
    await manager.load(midiBuffer);
    // Set up routing for each part
    for (const [partName, outputNode] of manager.getPartOutputs()) {
      const gain = this.audioContext.createGain();
      const analyzer = this.audioContext.createAnalyser();
      outputNode.connect(gain);
      gain.connect(analyzer);
      analyzer.connect(this.masterGain);
      this.parts.set(partName, { gain, analyzer });
    }
  }

  setPartVolume(partId, volume) {
    this.parts.get(partId).gain.gain.value = volume;
  }

  mutePart(partId) {
    this.parts.get(partId).gain.gain.value = 0;
  }

  getPartLevel(partId) {
    const { analyzer } = this.parts.get(partId);
    const dataArray = new Uint8Array(analyzer.frequencyBinCount);
    analyzer.getByteFrequencyData(dataArray);
    // Calculate RMS level
    let sum = 0;
    for (let i = 0; i < dataArray.length; i++) {
      sum += dataArray[i] * dataArray[i];
    }
    return Math.sqrt(sum / dataArray.length) / 255;
  }
}
Additional Documentation
- LIGHTWEIGHT_ENGINE.md - Lightweight engine setup and sample file configuration
- METADATA.md - Score metadata, part configuration, and playback modifiers
- BEATMAPPING.md - Beat mapping and non-linear playback structures
- INTERFACE.md - Complete interface contract for UI integration
- INIT_PROGRESS.md - Initialization progress tracking and best practices
Development
npm install
npm test
npm run test:coverage
npm run dev
npm run build
Examples
- demo/part-audio-engine-demo.html - Interactive browser demo
- demo/initialization-progress-demo.html - Progress tracking demo
- examples/midi-player-demo.js - Basic usage
License
MIT License - see LICENSE file for details.
Browser Compatibility
- Chromium browsers (Chrome, Edge, Brave): May experience audio distortion due to Web Audio API bugs
- Safari: Should work with standard Web Audio API support
