@viji-dev/core v0.2.20
Viji Core Package (@viji-dev/core)
Universal execution engine for Viji Creative scenes
A powerful, secure, and feature-rich JavaScript/TypeScript library that provides the foundation for creative scene execution across all Viji platform contexts. The core provides the same IFrame + WebWorker execution model in every context, along with comprehensive parameter management, audio/video analysis, user interaction handling, and performance optimization.
🚀 Features
✅ Core Execution Engine
- Secure IFrame + WebWorker Architecture: Complete isolation with controlled communication
- Multi-Instance Support: Concurrent instances for main scenes and previews
- Automatic Resource Management: Memory leak prevention and cleanup
✅ Parameter System
- Declarative Parameter Definition: Define parameters once with automatic UI generation
- Proxy-Based Access: Fast parameter access in render loops
- Category-Based Organization: Audio, video, interaction, and general parameters
- Real-time Validation: Type safety and range checking
- Capability-Aware UI: Parameters shown based on active features
✅ Audio Analysis
- Real-time Audio Processing: Volume, frequency analysis, and beat detection
- Custom Frequency Bands: Bass, mid, treble, and custom band analysis
- Multiple Input Sources: Microphone, audio files, and screen capture
- Audio-Reactive Scenes: Make scenes respond to audio input
✅ Video Analysis
- Real-time Video Processing: Frame analysis in separate WebWorker
- Multiple Input Sources: Camera, video files, and screen capture
- Video-Reactive Scenes: Make scenes respond to video motion and brightness
- Frame Data Access: Raw video frame data for custom analysis
✅ User Interaction
- Mouse Tracking: Position, buttons, movement, and scroll wheel
- Keyboard Input: Key states, modifiers, and event handling
- Touch Support: Multi-touch with gesture detection
- Canvas-Coordinate Mapping: Accurate input positioning
✅ Performance Optimization
- Configurable Frame Rates: Full (60fps) or half (30fps) modes
- Resolution Scaling: Fractional or explicit canvas dimensions
- Adaptive Performance: Automatic optimization based on hardware
- Memory Management: Efficient resource pooling and cleanup
📦 Installation
npm install @viji-dev/core
🎯 Quick Start
Basic Scene Creation
import { VijiCore } from '@viji-dev/core';
// Artist scene code
const sceneCode = `
// Define parameters using helper functions
const color = viji.color('#ff6b6b', {
label: 'Shape Color',
description: 'Color of the animated shape',
group: 'appearance'
});
const size = viji.slider(50, {
min: 10,
max: 150,
step: 5,
label: 'Shape Size',
description: 'Size of the animated shape',
group: 'appearance'
});
const speed = viji.slider(1.0, {
min: 0.1,
max: 3.0,
step: 0.1,
label: 'Animation Speed',
description: 'Speed of the animation',
group: 'animation'
});
// Main render function
function render(viji) {
const ctx = viji.useContext('2d');
// Clear canvas
ctx.fillStyle = '#2c3e50';
ctx.fillRect(0, 0, viji.width, viji.height);
// Animated shape
const time = viji.time * speed.value;
const x = viji.width / 2 + Math.sin(time) * 100;
const y = viji.height / 2 + Math.cos(time) * 100;
ctx.fillStyle = color.value;
ctx.beginPath();
ctx.arc(x, y, size.value / 2, 0, Math.PI * 2);
ctx.fill();
}
`;
// Create core instance
const core = new VijiCore({
hostContainer: document.getElementById('scene-container'),
sceneCode: sceneCode,
frameRateMode: 'full',
allowUserInteraction: true
});
// Initialize and start rendering
await core.initialize();
console.log('Scene is running!');
🔧 Integration API
Core Configuration
The VijiCoreConfig interface defines all available configuration options:
interface VijiCoreConfig {
// Required configuration
hostContainer: HTMLElement; // Container element for the scene
sceneCode: string; // Artist JavaScript code with render function
// Performance configuration
frameRateMode?: 'full' | 'half'; // 'full' = 60fps, 'half' = 30fps
autoOptimize?: boolean; // Enable automatic performance optimization
// Input streams
audioStream?: MediaStream; // Audio input for analysis
videoStream?: MediaStream; // Video input for analysis
// Audio analysis configuration
analysisConfig?: {
fftSize?: number; // FFT size for frequency analysis (default: 2048)
smoothing?: number; // Smoothing factor 0-1 (default: 0.8)
frequencyBands?: FrequencyBand[]; // Custom frequency bands
beatDetection?: boolean; // Enable beat detection
onsetDetection?: boolean; // Enable onset detection
};
// Parameter system
parameters?: ParameterGroup[]; // Initial parameter values
// Feature toggles
noInputs?: boolean; // Disable all input processing
allowUserInteraction?: boolean; // Enable mouse/keyboard/touch events
}
Instance Management
Creation and Initialization
// Create core instance
const core = new VijiCore({
hostContainer: document.getElementById('scene-container'),
sceneCode: sceneCode,
frameRateMode: 'full',
allowUserInteraction: true
});
// Initialize the core (required before use)
await core.initialize();
// Check if core is ready for operations
if (core.ready) {
console.log('Core is ready for use');
}
// Get current configuration
const config = core.configuration;
console.log('Current frame rate mode:', config.frameRateMode);
Performance Control
// Frame rate control
await core.setFrameRate('full'); // Set to 60fps mode
await core.setFrameRate('half'); // Set to 30fps mode
// Resolution control
await core.setResolution(0.75); // Set to 75% of container size
await core.setResolution(0.5); // Set to 50% for performance
await core.updateResolution(); // Auto-detect container size changes
// Get performance statistics
const stats = core.getStats();
console.log('Current FPS:', stats.frameRate.effectiveRefreshRate);
console.log('Canvas size:', stats.resolution);
console.log('Scale factor:', stats.scale);
console.log('Parameter count:', stats.parameterCount);
Debug and Development
// Enable debug logging
core.setDebugMode(true);
// Check debug mode status
const isDebugEnabled = core.getDebugMode();
// Debug mode provides detailed logging for:
// - Initialization process
// - Communication between components
// - Parameter system operations
// - Audio/video stream processing
// - Performance statistics
Parameter Management
The parameter system provides a powerful way to create interactive scenes with automatic UI generation.
Parameter Definition and Access
// Listen for parameter definitions from artist code
core.onParametersDefined((groups) => {
console.log('Parameters available:', groups);
// Each group contains:
// - groupName: string
// - category: 'audio' | 'video' | 'interaction' | 'general'
// - description: string
// - parameters: Record<string, ParameterDefinition>
// Generate UI based on parameter groups
generateParameterUI(groups);
});
// Set individual parameter values
await core.setParameter('color', '#ff0000');
await core.setParameter('size', 75);
await core.setParameter('enabled', true);
// Set multiple parameters efficiently
await core.setParameters({
'color': '#00ff00',
'size': 100,
'speed': 2.0,
'enabled': false
});
// Get current parameter values
const values = core.getParameterValues();
const color = core.getParameter('color');
// Listen for parameter changes
core.onParameterChange('size', (value) => {
console.log('Size parameter changed to:', value);
});
// Listen for parameter errors
core.onParameterError((error) => {
console.error('Parameter error:', error.message);
console.error('Error code:', error.code);
});
Capability-Aware Parameters
// Get all parameter groups (unfiltered, use for saving scene parameters)
const allGroups = core.getAllParameterGroups();
// Get parameter groups filtered by active capabilities (for UI)
const visibleGroups = core.getVisibleParameterGroups();
// Check current capabilities
const capabilities = core.getCapabilities();
console.log('Audio available:', capabilities.hasAudio);
console.log('Video available:', capabilities.hasVideo);
console.log('Interaction enabled:', capabilities.hasInteraction);
// Check if specific parameter category is active
const isAudioActive = core.isCategoryActive('audio');
const isVideoActive = core.isCategoryActive('video');
// Parameters are automatically categorized:
// - 'audio': Only shown when audio stream is connected
// - 'video': Only shown when video stream is connected
// - 'interaction': Only shown when user interaction is enabled
// - 'general': Always available
Audio and Video Integration
Audio Stream Management
// Set audio stream for analysis
const audioStream = await navigator.mediaDevices.getUserMedia({
audio: {
echoCancellation: false,
noiseSuppression: false,
autoGainControl: false
}
});
await core.setAudioStream(audioStream);
// Configure audio analysis
await core.setAudioAnalysisConfig({
fftSize: 2048, // Higher values = better frequency resolution
smoothing: 0.8 // 0 = no smoothing, 1 = maximum smoothing
});
// Get current audio stream
const currentStream = core.getAudioStream();
// Disconnect audio
await core.setAudioStream(null);
Video Stream Management
// Set video stream for analysis
const videoStream = await navigator.mediaDevices.getUserMedia({
video: {
width: { ideal: 640 },
height: { ideal: 480 },
frameRate: { ideal: 30 }
}
});
await core.setVideoStream(videoStream);
// Video analysis includes:
// - Real-time frame processing
// - Frame data access for custom analysis
// - Brightness and motion detection
// - Custom computer vision processing
// Disconnect video
await core.setVideoStream(null);
Interaction Management
// Enable or disable user interactions at runtime
await core.setInteractionEnabled(true); // Enable mouse, keyboard, and touch
await core.setInteractionEnabled(false); // Disable all interactions
// Get current interaction state
const isInteractionEnabled = core.getInteractionEnabled();
// Interaction state affects:
// - Mouse, keyboard, and touch event processing
// - Parameter visibility (interaction category parameters)
// - Scene behavior that depends on user input
// Note: Interaction state is separate from initialization config
// You can toggle interactions regardless of initial allowUserInteraction value
// The interaction system is always available for runtime control
Capability Change Monitoring
// Listen for capability changes
core.onCapabilitiesChange((capabilities) => {
console.log('Capabilities updated:', capabilities);
// Update UI based on new capabilities
if (capabilities.hasAudio) {
showAudioControls();
} else {
hideAudioControls();
}
if (capabilities.hasVideo) {
showVideoControls();
} else {
hideVideoControls();
}
if (capabilities.hasInteraction) {
showInteractionControls();
} else {
hideInteractionControls();
}
});
Event Handling and Lifecycle
Core Lifecycle Events
// Core is ready for operations
if (core.ready) {
// All systems initialized and running
console.log('Core is fully operational');
}
// Check if parameters are initialized
if (core.parametersReady) {
// Parameter system is ready
console.log('Parameters are available');
}
Cleanup and Resource Management
// Destroy instance and clean up all resources
await core.destroy();
// This automatically:
// - Stops all rendering loops
// - Disconnects audio/video streams
// - Cleans up WebWorker and IFrame
// - Releases all event listeners
// - Clears parameter system
// - Frees memory resources
🎨 Artist API
The artist API provides a comprehensive set of tools for creating interactive, audio-reactive, and video-responsive scenes.
Canvas and Rendering
function render(viji) {
// Get canvas contexts
const ctx = viji.useContext('2d'); // 2D rendering context
const gl = viji.useContext('webgl'); // WebGL rendering context
// Canvas properties
viji.canvas; // OffscreenCanvas object
viji.width; // Canvas width in pixels
viji.height; // Canvas height in pixels
viji.pixelRatio; // Device pixel ratio for crisp rendering
// Example: Draw a responsive circle
const centerX = viji.width / 2;
const centerY = viji.height / 2;
const radius = Math.min(viji.width, viji.height) * 0.1;
ctx.fillStyle = '#ff6b6b';
ctx.beginPath();
ctx.arc(centerX, centerY, radius, 0, Math.PI * 2);
ctx.fill();
}
Timing Information
The timing system provides FPS-independent timing data for smooth animations:
function render(viji) {
const ctx = viji.useContext('2d');
// Timing data (FPS independent)
viji.time; // Elapsed time in seconds since scene start
viji.deltaTime; // Time since last frame in seconds
viji.frameCount; // Total number of frames rendered
viji.fps; // Current frames per second
// Example: Smooth animation regardless of frame rate
const animationSpeed = 2.0; // rotations per second
const rotation = (viji.time * animationSpeed * Math.PI * 2) % (Math.PI * 2);
ctx.save();
ctx.translate(viji.width / 2, viji.height / 2);
ctx.rotate(rotation);
ctx.fillRect(-25, -25, 50, 50);
ctx.restore();
}
Parameter System
The parameter system allows artists to define interactive parameters that automatically generate UI controls.
Parameter Definition
// Define parameters (call once outside render loop)
const color = viji.color('#ff6b6b', {
label: 'Primary Color',
description: 'Main color for shapes',
group: 'appearance',
category: 'general'
});
const size = viji.slider(50, {
min: 10,
max: 150,
step: 5,
label: 'Shape Size',
description: 'Size of shapes in pixels',
group: 'appearance',
category: 'general'
});
const speed = viji.slider(1.0, {
min: 0.1,
max: 3.0,
step: 0.1,
label: 'Animation Speed',
description: 'Speed of animation in rotations per second',
group: 'animation',
category: 'general'
});
const useAudio = viji.toggle(false, {
label: 'Audio Reactive',
description: 'Make shapes react to audio input',
group: 'audio',
category: 'audio'
});
const shapeType = viji.select('circle', {
options: ['circle', 'square', 'triangle', 'star'],
label: 'Shape Type',
description: 'Type of shape to draw',
group: 'appearance',
category: 'general'
});
const title = viji.text('My Scene', {
label: 'Scene Title',
description: 'Title displayed in the scene',
group: 'text',
category: 'general',
maxLength: 50
});
const particleCount = viji.number(5, {
min: 1,
max: 20,
step: 1,
label: 'Particle Count',
description: 'Number of particles to render',
group: 'animation',
category: 'general'
});
Parameter Usage in Render Loop
function render(viji) {
const ctx = viji.useContext('2d');
// Fast parameter access (proxy-based)
ctx.fillStyle = color.value; // Get current color value
const radius = size.value / 2; // Get current size value
const animationSpeed = speed.value; // Get current speed value
// Clear canvas
ctx.fillStyle = '#2c3e50';
ctx.fillRect(0, 0, viji.width, viji.height);
// Draw title
ctx.fillStyle = 'white';
ctx.font = '20px Arial';
ctx.textAlign = 'center';
ctx.fillText(title.value, viji.width / 2, 30);
// Draw particles
for (let i = 0; i < particleCount.value; i++) {
const angle = (i / particleCount.value) * Math.PI * 2 + (viji.time * animationSpeed);
const x = viji.width / 2 + Math.cos(angle) * 100;
const y = viji.height / 2 + Math.sin(angle) * 100;
ctx.fillStyle = color.value;
ctx.beginPath();
switch (shapeType.value) {
case 'circle':
ctx.arc(x, y, radius, 0, Math.PI * 2);
break;
case 'square':
ctx.rect(x - radius, y - radius, radius * 2, radius * 2);
break;
case 'triangle':
ctx.moveTo(x, y - radius);
ctx.lineTo(x - radius, y + radius);
ctx.lineTo(x + radius, y + radius);
ctx.closePath();
break;
}
ctx.fill();
}
}
Audio Analysis
The audio system provides real-time analysis of audio input with comprehensive frequency and volume data.
Audio API Overview
function render(viji) {
const ctx = viji.useContext('2d');
const audio = viji.audio;
if (audio.isConnected) {
// Volume analysis
const volume = audio.volume.rms; // 0-1 RMS volume (true volume)
const peak = audio.volume.peak; // 0-1 peak volume (maximum amplitude)
// Frequency bands (0-1 values)
const bass = audio.bands.bass; // 60-250 Hz
const mid = audio.bands.mid; // 500-2000 Hz
const treble = audio.bands.treble; // 2000-20000 Hz
// Extended frequency bands
const subBass = audio.bands.subBass; // 20-60 Hz
const lowMid = audio.bands.lowMid; // 250-500 Hz
const highMid = audio.bands.highMid; // 2000-4000 Hz
const presence = audio.bands.presence; // 4000-6000 Hz
const brilliance = audio.bands.brilliance; // 6000-20000 Hz
// Beat detection
if (audio.beat?.isKick) {
// Kick drum detected
console.log('Kick detected!');
}
// Raw frequency data (0-255 values)
const frequencyData = audio.getFrequencyData();
// Example: Audio-reactive animation
const scale = 1 + (volume * 2); // Scale based on volume
const hue = (bass * 360); // Color based on bass
ctx.save();
ctx.translate(viji.width / 2, viji.height / 2);
ctx.scale(scale, scale);
ctx.fillStyle = `hsl(${hue}, 70%, 60%)`;
ctx.fillRect(-25, -25, 50, 50);
ctx.restore();
}
}
Audio-Reactive Scene Example
// Define audio-reactive parameters
const audioReactive = viji.toggle(true, {
label: 'Audio Reactive',
description: 'Make shapes react to audio',
group: 'audio',
category: 'audio'
});
const volumeSensitivity = viji.slider(1.0, {
min: 0.1,
max: 5.0,
step: 0.1,
label: 'Volume Sensitivity',
description: 'How sensitive shapes are to volume',
group: 'audio',
category: 'audio'
});
const bassReactivity = viji.slider(1.0, {
min: 0,
max: 3.0,
step: 0.1,
label: 'Bass Reactivity',
description: 'How much shapes react to bass',
group: 'audio',
category: 'audio'
});
function render(viji) {
const ctx = viji.useContext('2d');
const audio = viji.audio;
// Clear canvas
ctx.fillStyle = '#2c3e50';
ctx.fillRect(0, 0, viji.width, viji.height);
if (audioReactive.value && audio.isConnected) {
// Audio-reactive animation
const volume = audio.volume.rms * volumeSensitivity.value;
const bass = audio.bands.bass * bassReactivity.value;
// Scale based on volume
const scale = 1 + volume;
// Color based on bass
const hue = 200 + (bass * 160); // Blue to purple range
// Position based on frequency distribution
const x = viji.width * (audio.bands.mid + audio.bands.treble) / 2;
const y = viji.height * (1 - audio.bands.bass);
ctx.save();
ctx.translate(x, y);
ctx.scale(scale, scale);
ctx.fillStyle = `hsl(${hue}, 80%, 60%)`;
ctx.beginPath();
ctx.arc(0, 0, 30, 0, Math.PI * 2);
ctx.fill();
ctx.restore();
}
}
Video Analysis
The video system provides real-time video frame analysis with frame data access for custom processing.
Video API Overview
function render(viji) {
const ctx = viji.useContext('2d');
const video = viji.video;
if (video.isConnected) {
// Video properties
const frameWidth = video.frameWidth;
const frameHeight = video.frameHeight;
const frameRate = video.frameRate;
// Current video frame (OffscreenCanvas)
if (video.currentFrame) {
// Draw video frame as background
ctx.globalAlpha = 0.3;
ctx.drawImage(video.currentFrame, 0, 0, viji.width, viji.height);
ctx.globalAlpha = 1.0;
}
// Frame data for custom analysis
const frameData = video.getFrameData();
// Example: Custom video analysis
if (frameData) {
// Access raw pixel data for custom processing
const imageData = frameData.data;
const width = frameData.width;
const height = frameData.height;
// Example: Calculate average brightness
let totalBrightness = 0;
for (let i = 0; i < imageData.length; i += 4) {
const r = imageData[i];
const g = imageData[i + 1];
const b = imageData[i + 2];
totalBrightness += (r + g + b) / 3;
}
const averageBrightness = totalBrightness / (imageData.length / 4);
// Use brightness for effects
const brightness = averageBrightness / 255; // Normalize to 0-1
// Create brightness-reactive animation
ctx.fillStyle = `rgba(255, 255, 255, ${brightness * 0.5})`;
ctx.fillRect(0, 0, viji.width, viji.height);
}
}
}
Video-Reactive Scene Example
// Define video-reactive parameters
const videoReactive = viji.toggle(true, {
label: 'Video Reactive',
description: 'Make shapes react to video',
group: 'video',
category: 'video'
});
const motionSensitivity = viji.slider(1.0, {
min: 0.1,
max: 3.0,
step: 0.1,
label: 'Motion Sensitivity',
description: 'How sensitive shapes are to video changes',
group: 'video',
category: 'video'
});
function render(viji) {
const ctx = viji.useContext('2d');
const video = viji.video;
if (videoReactive.value && video.isConnected) {
// Video-reactive animation using frame data
const frameData = video.getFrameData();
if (frameData) {
// Simple motion detection (compare with previous frame)
// This is a basic example - you can implement more sophisticated analysis
const imageData = frameData.data;
let motionEnergy = 0;
// Calculate motion energy (simplified)
for (let i = 0; i < imageData.length; i += 4) {
const brightness = (imageData[i] + imageData[i + 1] + imageData[i + 2]) / 3;
motionEnergy += brightness;
}
const normalizedMotion = (motionEnergy / (imageData.length / 4)) / 255;
const scale = 1 + (normalizedMotion * motionSensitivity.value);
// Create motion-reactive shapes
ctx.save();
ctx.translate(viji.width / 2, viji.height / 2);
ctx.scale(scale, scale);
ctx.fillStyle = `hsl(${normalizedMotion * 360}, 70%, 60%)`;
ctx.beginPath();
ctx.arc(0, 0, 30, 0, Math.PI * 2);
ctx.fill();
ctx.restore();
}
}
}
User Interaction
The interaction system provides comprehensive support for mouse, keyboard, and touch input.
Mouse Interaction
function render(viji) {
const ctx = viji.useContext('2d');
const mouse = viji.mouse;
// Mouse position (canvas coordinates)
if (mouse.isInCanvas) {
const x = mouse.x; // Current X coordinate
const y = mouse.y; // Current Y coordinate
// Mouse movement
const deltaX = mouse.deltaX; // X movement since last frame
const deltaY = mouse.deltaY; // Y movement since last frame
const velocity = mouse.velocity; // Smoothed velocity { x, y }
// Mouse buttons
const isPressed = mouse.isPressed; // Any button currently pressed
const leftButton = mouse.leftButton; // Left button state
const rightButton = mouse.rightButton; // Right button state
const middleButton = mouse.middleButton; // Middle button state
// Frame-based events
const wasPressed = mouse.wasPressed; // Button was pressed this frame
const wasReleased = mouse.wasReleased; // Button was released this frame
const wasMoved = mouse.wasMoved; // Mouse moved this frame
// Scroll wheel
const wheelDelta = mouse.wheelDelta; // Combined wheel delta
const wheelX = mouse.wheelX; // Horizontal wheel delta
const wheelY = mouse.wheelY; // Vertical wheel delta
// Example: Mouse-reactive animation
ctx.fillStyle = leftButton ? 'red' : 'blue';
ctx.beginPath();
ctx.arc(x, y, 20 + Math.abs(velocity.x) + Math.abs(velocity.y), 0, Math.PI * 2);
ctx.fill();
}
}
Keyboard Interaction
function render(viji) {
const ctx = viji.useContext('2d');
const keyboard = viji.keyboard;
// Key state queries
if (keyboard.isPressed('w')) {
// W key is currently pressed
console.log('W key is held down');
}
if (keyboard.wasPressed('space')) {
// Space was pressed this frame
console.log('Space was pressed!');
}
if (keyboard.wasReleased('escape')) {
// Escape was released this frame
console.log('Escape was released!');
}
// Active key tracking
const activeKeys = keyboard.activeKeys; // Set of currently pressed keys
const pressedThisFrame = keyboard.pressedThisFrame; // Set of keys pressed this frame
const releasedThisFrame = keyboard.releasedThisFrame; // Set of keys released this frame
// Modifier keys
const shift = keyboard.shift; // Shift key is held
const ctrl = keyboard.ctrl; // Ctrl key is held
const alt = keyboard.alt; // Alt key is held
const meta = keyboard.meta; // Meta/Cmd key is held
// Recent activity
const lastKeyPressed = keyboard.lastKeyPressed; // Last key that was pressed
const lastKeyReleased = keyboard.lastKeyReleased; // Last key that was released
// Example: Keyboard-controlled movement
let moveX = 0;
let moveY = 0;
if (keyboard.isPressed('w') || keyboard.isPressed('W')) moveY -= 5;
if (keyboard.isPressed('s') || keyboard.isPressed('S')) moveY += 5;
if (keyboard.isPressed('a') || keyboard.isPressed('A')) moveX -= 5;
if (keyboard.isPressed('d') || keyboard.isPressed('D')) moveX += 5;
// Apply movement
ctx.save();
ctx.translate(moveX, moveY);
ctx.fillStyle = 'green';
ctx.fillRect(0, 0, 50, 50);
ctx.restore();
}
Touch Interaction
function render(viji) {
const ctx = viji.useContext('2d');
const touches = viji.touches;
// Touch points
for (const touch of touches.points) {
const x = touch.x; // Touch X coordinate
const y = touch.y; // Touch Y coordinate
const pressure = touch.pressure; // Pressure (0-1)
const radius = touch.radius; // Touch radius
const id = touch.id; // Unique touch ID
// Movement
const deltaX = touch.deltaX; // X movement since last frame
const deltaY = touch.deltaY; // Y movement since last frame
const velocity = touch.velocity; // Movement velocity { x, y }
// Lifecycle
const isNew = touch.isNew; // Touch started this frame
const isActive = touch.isActive; // Touch is currently active
const isEnding = touch.isEnding; // Touch ending this frame
// Draw touch point
ctx.fillStyle = isNew ? 'red' : isEnding ? 'yellow' : 'blue';
ctx.beginPath();
ctx.arc(x, y, radius * 2, 0, Math.PI * 2);
ctx.fill();
}
// Touch events
const started = touches.started; // Touches that started this frame
const moved = touches.moved; // Touches that moved this frame
const ended = touches.ended; // Touches that ended this frame
// Primary touch (first touch point)
const primary = touches.primary; // Primary touch point or null
// Touch gestures
const gestures = touches.gestures;
if (gestures.isPinching) {
const scale = gestures.pinchScale; // Current pinch scale
const delta = gestures.pinchDelta; // Scale change since last frame
// React to pinch gesture
ctx.save();
ctx.scale(scale, scale);
ctx.fillStyle = 'purple';
ctx.fillRect(0, 0, 100, 100);
ctx.restore();
}
if (gestures.isRotating) {
const angle = gestures.rotationAngle; // Current rotation angle
const delta = gestures.rotationDelta; // Rotation change since last frame
// React to rotation gesture
ctx.save();
ctx.rotate(angle);
ctx.fillStyle = 'orange';
ctx.fillRect(-25, -25, 50, 50);
ctx.restore();
}
if (gestures.isPanning) {
const panDelta = gestures.panDelta; // Pan movement { x, y }
// React to pan gesture
ctx.save();
ctx.translate(panDelta.x, panDelta.y);
ctx.fillStyle = 'cyan';
ctx.fillRect(0, 0, 50, 50);
ctx.restore();
}
if (gestures.isTapping) {
const tapCount = gestures.tapCount; // Number of taps
const tapPosition = gestures.tapPosition; // { x, y } tap position
// React to tap gesture
if (tapPosition) {
ctx.fillStyle = 'lime';
ctx.beginPath();
ctx.arc(tapPosition.x, tapPosition.y, 30, 0, Math.PI * 2);
ctx.fill();
}
}
}
🎨 P5.js Support
Viji Core supports P5.js as an optional rendering library. P5.js provides familiar creative coding APIs while maintaining all Viji features including audio reactivity, video processing, and parameter management.
Enabling P5.js Mode
Add a single comment at the top of your scene code:
// @renderer p5
function setup(viji, p5) {
p5.colorMode(p5.HSB);
}
function render(viji, p5) {
p5.background(220);
p5.fill(255, 0, 0);
p5.ellipse(viji.mouse.x, viji.mouse.y, 50, 50);
}
What Works
- ✅ All P5.js drawing functions (shapes, colors, transforms, typography)
- ✅ P5.js math utilities (noise(), random(), map(), lerp())
- ✅ P5.js vectors (p5.Vector class)
- ✅ WebGL mode (p5.WEBGL)
- ✅ Full Viji integration (audio, video, parameters, interaction)
What Doesn't Work
- ❌ p5.dom (use Viji parameters instead)
- ❌ p5.sound (use viji.audio instead)
- ❌ P5.js events (use viji.mouse / viji.keyboard / viji.touches instead)
- ❌ Direct image loading (use Viji image parameters instead)
Audio Reactive P5.js Example
// @renderer p5
const audioReactive = viji.toggle(true, {
label: 'Audio Reactive',
category: 'audio'
});
const bassReactivity = viji.slider(1.0, {
min: 0,
max: 3.0,
label: 'Bass Reactivity',
category: 'audio'
});
function render(viji, p5) {
p5.background(0);
if (audioReactive.value && viji.audio.isConnected) {
const bass = viji.audio.bands.bass;
const volume = viji.audio.volume.rms;
const hue = bass * 360 * bassReactivity.value;
const size = 100 + volume * 200;
p5.colorMode(p5.HSB);
p5.fill(hue, 80, 100);
p5.ellipse(p5.width / 2, p5.height / 2, size, size);
}
}
Image Parameters
// @renderer p5
const bgImage = viji.image(null, {
label: 'Background Image',
group: 'media',
accept: 'image/*'
});
function render(viji, p5) {
p5.background(220);
if (bgImage.value) {
p5.image(bgImage.value, 0, 0, p5.width, p5.height);
}
// Draw on top of image
p5.fill(255, 0, 0, 128);
p5.ellipse(viji.mouse.x, viji.mouse.y, 50, 50);
}
Getting Renderer Type
const core = new VijiCore({
hostContainer: container,
sceneCode: sceneCode
});
await core.initialize();
// Check which renderer is being used (from stats, like FPS/resolution)
const stats = core.getStats();
const rendererType = stats.rendererType; // 'native' | 'p5'
Loading Image Parameters
core.onParametersDefined((groups) => {
groups.forEach(group => {
Object.entries(group.parameters).forEach(([name, def]) => {
if (def.type === 'image') {
// Create file picker
const input = createFileInput(name, def);
input.addEventListener('change', async (e) => {
const file = e.target.files[0];
if (file) {
await core.setParameter(name, file); // Unified API handles images automatically
}
});
}
});
});
});
See docs/16-p5js-integration.md for comprehensive documentation including migration guides, troubleshooting, and advanced examples.
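The `createFileInput` helper in the snippet above is left to the integrator. As a purely hypothetical sketch (not part of the package), the attribute-derivation part of such a helper might look like this:

```javascript
// Hypothetical sketch -- not part of @viji-dev/core. Derives the attributes
// a file-picker <input> needs from an image parameter definition.
function fileInputAttributes(name, def) {
  return {
    type: 'file',
    name,
    // Honor the parameter's accept filter, falling back to any image type.
    accept: (def && def.accept) || 'image/*',
  };
}

// In a browser, an integrator could then build the element like so:
// const input = Object.assign(document.createElement('input'),
//                             fileInputAttributes(name, def));
```

Keeping the attribute logic separate from DOM creation makes it easy to unit-test outside a browser.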
🏗️ Architecture
Security Model
The core implements a comprehensive security model to ensure safe execution of artist code:
- IFrame Isolation: Complete separation from host environment with sandboxed execution
- WebWorker Sandboxing: Artist code runs with controlled API access only
- Blob URL Creation: Secure worker and IFrame creation from blob URLs
- Resource Protection: Memory leaks and errors cannot affect main application
- Controlled Communication: Optimized message passing with validation
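The blob-URL technique mentioned above can be sketched generically. This is illustrative of the general pattern only, not the core's actual internals; `workerSource` is a made-up payload:

```javascript
// Illustrative sketch of the blob-URL pattern (not @viji-dev/core internals):
// worker source is embedded as a string, wrapped in a Blob, and loaded from
// a blob: URL, so the sandboxed worker never needs a separately served file.
const workerSource = `
  self.onmessage = (e) => self.postMessage(e.data * 2);
`;

const blob = new Blob([workerSource], { type: 'text/javascript' });
const workerUrl = URL.createObjectURL(blob);

// In a browser: const worker = new Worker(workerUrl);
// When an instance is destroyed, the URL is revoked to avoid leaks:
// URL.revokeObjectURL(workerUrl);
```

The same pattern applies to constructing the sandboxed IFrame from a blob URL.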
Performance Features
The core provides extensive performance optimization capabilities:
- Configurable Frame Rates: Full (60fps) or half (30fps) modes for performance tuning
- Resolution Scaling: Fractional (0.1-1.0) or explicit canvas dimensions
- Adaptive Optimization: Automatic performance tuning based on hardware capabilities
- Efficient Communication: Optimized message passing between components
- Memory Management: Automatic resource cleanup and memory leak prevention
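As a concrete illustration of driving the scaling API, here is a hypothetical heuristic (not part of the package) that picks a fractional resolution from the reported CPU core count:

```javascript
// Hypothetical heuristic -- not part of @viji-dev/core. Maps CPU core count
// to a fractional resolution scale suitable for core.setResolution().
function chooseResolutionScale(cores) {
  if (cores >= 8) return 1.0;   // plenty of headroom: full resolution
  if (cores >= 4) return 0.75;  // mid-range hardware
  return 0.5;                   // low-end: trade resolution for frame rate
}

// Browser usage (navigator.hardwareConcurrency may be undefined):
// await core.setResolution(chooseResolutionScale(navigator.hardwareConcurrency || 4));
```

With autoOptimize enabled the core tunes itself, so a manual heuristic like this is only needed when you want explicit control.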
Multi-Instance Support
The core supports multiple concurrent instances for complex applications:
// Main scene with full features
const mainCore = new VijiCore({
hostContainer: document.getElementById('main-scene'),
resolution: { width: 1920, height: 1080 },
frameRateMode: 'full',
allowUserInteraction: true,
audioStream: sharedAudioStream,
videoStream: sharedVideoStream
});
// Preview instance with reduced features
const previewCore = new VijiCore({
hostContainer: document.getElementById('preview'),
resolution: 0.25, // 25% resolution for performance
frameRateMode: 'half',
noInputs: true,
allowUserInteraction: false,
audioStream: sharedAudioStream // Shared efficiently across instances
});
// Thumbnail instance for gallery view
const thumbnailCore = new VijiCore({
hostContainer: document.getElementById('thumbnail'),
resolution: 0.1, // 10% resolution
frameRateMode: 'half',
noInputs: true,
allowUserInteraction: false
});
// To change scenes, create a new instance and destroy the old one
const newCore = new VijiCore({
hostContainer: document.getElementById('scene-host'),
sceneCode: newSceneCode,
audioStream: sharedAudioStream,
videoStream: sharedVideoStream
});
// Automatic comprehensive cleanup when destroyed
await oldCore.destroy();
🔍 Error Handling
The core provides comprehensive error handling with detailed error information:
import { VijiCoreError } from '@viji-dev/core';
try {
const core = new VijiCore(config);
await core.initialize();
} catch (error) {
if (error instanceof VijiCoreError) {
console.error(`Core error [${error.code}]:`, error.message);
console.error('Error context:', error.context);
// Handle specific error types
switch (error.code) {
case 'INVALID_CONFIG':
console.error('Configuration is invalid:', error.context);
break;
case 'INITIALIZATION_ERROR':
console.error('Failed to initialize core:', error.context);
break;
case 'CORE_NOT_READY':
console.error('Core is not ready for operations');
break;
case 'INSTANCE_DESTROYED':
console.error('Core instance has been destroyed');
break;
case 'PARAMETERS_NOT_INITIALIZED':
console.error('Parameter system not yet initialized');
break;
case 'UNKNOWN_PARAMETER':
console.error('Parameter not found:', error.context);
break;
}
} else {
console.error('Unexpected error:', error);
}
}
Common Error Codes:
- INVALID_CONFIG - Configuration validation failed
- INITIALIZATION_ERROR - Failed to initialize core components
- CORE_NOT_READY - Operation attempted before ready
- INSTANCE_DESTROYED - Operation attempted after destroy
- PARAMETERS_NOT_INITIALIZED - Parameters not yet available
- UNKNOWN_PARAMETER - Parameter not found
- CONCURRENT_INITIALIZATION - Multiple initialization attempts
- MANAGER_NOT_READY - Internal component not available
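For UI code, the switch statement shown earlier can be collapsed into a lookup table. A small sketch (the messages simply restate the documented codes; `describeCoreError` is a hypothetical helper, not part of the package):

```javascript
// Hypothetical helper -- not part of @viji-dev/core. Maps the documented
// error codes to user-facing messages so UI code can avoid a long switch.
const CORE_ERROR_MESSAGES = {
  INVALID_CONFIG: 'Configuration validation failed',
  INITIALIZATION_ERROR: 'Failed to initialize core components',
  CORE_NOT_READY: 'Operation attempted before ready',
  INSTANCE_DESTROYED: 'Operation attempted after destroy',
  PARAMETERS_NOT_INITIALIZED: 'Parameters not yet available',
  UNKNOWN_PARAMETER: 'Parameter not found',
  CONCURRENT_INITIALIZATION: 'Multiple initialization attempts',
  MANAGER_NOT_READY: 'Internal component not available',
};

function describeCoreError(error) {
  return CORE_ERROR_MESSAGES[error.code] || `Unexpected error: ${error.message}`;
}
```

A UI can then surface `describeCoreError(error)` directly while still logging `error.context` for debugging.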
🧪 Development
# Install dependencies
npm install
# Build the package
npm run build
# Run tests
npm test
# Development build (watch mode)
npm run dev
# Type checking
npm run type-check
# Linting
npm run lint
📚 Examples
The package includes comprehensive examples in the /example directory:
- Basic Scene Creation: Simple animated shapes with parameters
- Audio-Reactive Scenes: Scenes that respond to audio input
- Video-Reactive Scenes: Scenes that respond to video analysis
- Interactive Scenes: Mouse, keyboard, and touch interaction
- Parameter System: Complete parameter definition and UI generation
- Multi-Instance: Multiple concurrent scene instances
🎯 Use Cases
Platform Integration
The core integrates seamlessly with the Viji platform, providing scene execution while the platform handles UI, user management, and social features.
SDK Development
The core serves as the execution foundation for the Viji SDK, ensuring identical behavior between development and platform environments.
Standalone Applications
Use the core directly in custom applications for creative scene rendering with full feature support.
📄 License
Copyright (c) 2025 Artem Verkhovskiy and Dmitry Manoilenko.
All rights reserved - see the LICENSE file for details.
Contributor License Agreement
By contributing, you agree to the CLA.
Please also confirm your agreement by filling out this short form.
Viji Core - Universal execution engine for creative scenes across all contexts.
