# Layershift
Embeddable video effects as Web Components. One script tag, one custom element — works in plain HTML, React, Vue, Svelte, Angular, WordPress, and anywhere else.
Layershift is a growing collection of visual effects that turn flat video into something interactive. Effects are configured via a filter-config.json file authored in the Layershift Editor.
## Quick Start
```html
<script src="https://unpkg.com/layershift"></script>

<layershift-effect
  src="video.mp4"
  map-src="video.lsb"
  config="filter-config.json"
></layershift-effect>
```

## Effects
The `<layershift-effect>` element supports multiple effect types, all driven by the same depth data:
- **Parallax** — per-pixel UV displacement via Parallax Occlusion Mapping
- **Rack Focus** — animated focus shift between depth layers
- **Glow** — depth-driven emissive glow
- **Color Grade** — depth-aware color grading
- **Fog / Atmosphere** — depth-based atmospheric haze
- **Tilt Shift** — miniature/diorama lens effect
- **Pop-Out** — foreground subjects break out of the frame
- **Freestyle** — custom effect composition with manual channel configuration
Effects are configured in the Layershift Editor and exported as a filter-config.json file.
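For orientation, an exported config pairs an effect type with its parameters. The snippet below is purely illustrative: the real schema is produced by the Layershift Editor, and the field names here are hypothetical (the values echo the component's attribute defaults).

```json
{
  "effect": "parallax",
  "parallax": { "x": 0.4, "y": 1.0, "max": 30, "layers": 5 }
}
```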
## Install

### Script Tag (IIFE)
```html
<script src="https://unpkg.com/layershift"></script>
```

### npm
```sh
npm install layershift
```

```js
import 'layershift';
// <layershift-effect> is now registered
```

### TypeScript
Add JSX type support for the custom elements:
```jsonc
// tsconfig.json
{ "compilerOptions": { "types": ["layershift/global"] } }
```

## Prepare Your Video
Use the Layershift Editor to:
- Import a video or image
- Run browser-based depth estimation (no CLI needed)
- Configure your effect with the visual editor
- Export a consumer package with all assets + config
The editor produces `video.lsb` and `filter-config.json` — everything `<layershift-effect>` needs.
## Development

```sh
npm install
npm run dev          # Landing page dev server (port 5173)
npm run dev:editor   # Editor dev server (port 5175)
npm run docs:dev     # Documentation dev server (VitePress)
```

### Build
```sh
# Build the landing page
npm run build

# Build the standalone Web Component (IIFE)
npm run build:component

# Build the npm package (ESM + IIFE + types)
npm run build:package

# Build the editor
npm run build:editor

# Build documentation
npm run build:docs
```

### Component Library
Browse all site UI components with interactive controls and documentation.
```sh
npm run storybook          # Dev server (port 6006)
npm run build:storybook    # Static build
```

Components follow atomic design (Atoms → Molecules → Organisms → Templates). Each component includes a story with controls and a colocated test.
## `<layershift-effect>` Reference

### Configuration
| Attribute | Type | Default | Description |
|-----------|------|---------|-------------|
| `src` | string | — | Video or image URL (required) |
| `map-src` | string | — | Depth map URL — `.lsb` file in LSFT format (required unless `depth-model` is set) |
| `config` | string | — | Filter config JSON URL (authored in the editor) |
| `map-width` | number | 512 | Width of depth map frames (optional — read from `.lsb` header) |
| `map-height` | number | 512 | Height of depth map frames (optional — read from `.lsb` header) |
| `map-fps` | number | 5 | Frame rate of precomputed depth data (optional — read from `.lsb` header) |
| `depth-model` | string | — | ONNX model URL for browser-side depth estimation |
| `source-type` | string | `video` | Source type: `video`, `image`, `camera` (experimental) |
| `quality` | string | `auto` | Quality tier: `auto`, `high`, `medium`, `low` |
| `gpu-backend` | string | `auto` | GPU backend: `auto`, `webgpu`, `webgl2` |
| `parallax-x` | number | 0.4 | Horizontal parallax intensity |
| `parallax-y` | number | 1.0 | Vertical parallax intensity |
| `parallax-max` | number | 30 | Max pixel offset for nearest layer |
| `layers` | number | 5 | Number of parallax layers |
| `overscan` | number | 0.05 | Extra padding ratio |
| `autoplay` | boolean | true | Auto-play on mount |
| `loop` | boolean | true | Loop playback |
| `muted` | boolean | true | Muted (required for autoplay) |
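Pulling several of these attributes together (the values below are chosen for illustration, not recommendations):

```html
<layershift-effect
  src="video.mp4"
  map-src="video.lsb"
  config="filter-config.json"
  quality="high"
  gpu-backend="webgl2"
  parallax-max="24"
  layers="6"
></layershift-effect>
```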
### Events
| Event | Detail | When |
|-------|--------|------|
| `layershift-effect:ready` | `{ videoWidth, videoHeight, duration, motionConfig }` | Initialization complete |
| `layershift-effect:play` | `{ currentTime }` | Video starts playing |
| `layershift-effect:pause` | `{ currentTime }` | Video pauses |
| `layershift-effect:loop` | `{ loopCount }` | Video loops back to start |
| `layershift-effect:frame` | `{ currentTime, frameNumber }` | New video frame presented |
| `layershift-effect:error` | `{ message }` | Initialization error |
```js
const el = document.querySelector('layershift-effect');
el.addEventListener('layershift-effect:ready', (e) => {
  console.log(`Video: ${e.detail.videoWidth}x${e.detail.videoHeight}`);
});
```

### Input
The component reads its position offset from the `input` property every frame. This decoupled design lets you use the built-in helpers or wire your own input source.
#### Quick Setup

Wire input on the `ready` event using the authored motion settings:

**ESM (npm)**
```js
import { connectPointer } from 'layershift';

const el = document.querySelector('layershift-effect');
el.addEventListener('layershift-effect:ready', (e) => {
  const { motionConfig } = e.detail;
  const cleanup = connectPointer(el, motionConfig);
  // Call cleanup() to stop input tracking
});
```

**IIFE (script tag)**
```js
const el = document.querySelector('layershift-effect');
el.addEventListener('layershift-effect:ready', (e) => {
  const { motionConfig } = e.detail;
  Layershift.connectPointer(el, motionConfig);
});
```

#### Manual Input
Skip the helpers entirely and set `el.input` directly:

```js
el.input = { x: 0.5, y: -0.3 }; // normalized -1 to 1
```

This works with any input source — MIDI controllers, WebSocket streams, scroll position, animation timelines, or your own custom logic.
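As a sketch of one such custom source, the helper below maps a page scroll offset into the normalized `-1..1` range that `el.input` expects. `scrollToInput` is a hypothetical function written for this example, not a Layershift export.

```javascript
// Map a scroll offset (0..maxScroll) to the normalized -1..1 input range.
// scrollToInput() is a hypothetical helper, not part of the library.
function scrollToInput(scrollY, maxScroll) {
  if (maxScroll <= 0) return 0;
  const t = Math.min(Math.max(scrollY / maxScroll, 0), 1); // clamp to 0..1
  return t * 2 - 1; // remap to -1..1
}

// Browser wiring might look like:
// window.addEventListener('scroll', () => {
//   const max = document.body.scrollHeight - window.innerHeight;
//   el.input = { x: 0, y: scrollToInput(window.scrollY, max) };
// });
```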
#### Helpers Reference
| Helper | Input Source | Best For |
|--------|--------------|----------|
| `connectPointer(el, opts?)` | Mouse + touch + gyro | Default — handles all devices |
| `connectMouse(el, opts?)` | Mouse only | Desktop-only experiences |
| `connectTouch(el, opts?)` | Touch drag | Touch-only experiences |
| `connectGyro(el, opts?)` | Device orientation | Mobile tilt control |
Each helper returns a cleanup function. Do not combine standalone helpers on the same element — use `connectPointer` instead.
#### `InputOptions`
| Option | Type | Default | Description |
|--------|------|---------|-------------|
| `sensitivityX` | number | 1 | Horizontal multiplier |
| `sensitivityY` | number | 1 | Vertical multiplier |
| `lerpFactor` | number | 0.08 | Smoothing factor (0–1, lower = smoother) |
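A `lerpFactor` of this kind typically drives per-frame exponential smoothing: each frame the current value moves a fixed fraction of the remaining distance toward the target. A minimal sketch of that math (illustrative, not the library's internal code; factor 0.5 is used for round numbers):

```javascript
// Move `current` a fraction of the remaining distance toward `target`.
// Lower factors close the gap more slowly, which reads as smoother motion.
function lerp(current, target, factor) {
  return current + (target - current) * factor;
}

let x = 0;
const target = 1;
for (let i = 0; i < 3; i++) {
  x = lerp(x, target, 0.5); // 0.5 for easy numbers; the default is 0.08
}
// After 3 steps with factor 0.5, x has covered 1 - 0.5^3 = 87.5% of the gap.
```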
```js
connectPointer(el, {
  sensitivityX: 0.4, // subtle horizontal movement
  sensitivityY: 1.0, // full vertical movement
  lerpFactor: 0.08,  // smooth interpolation
});
```

#### TypeScript
```ts
import { connectPointer, connectMouse, type InputOptions } from 'layershift';
import type { EffectInput } from 'layershift';

const el = document.querySelector('layershift-effect')!;
const cleanup = connectPointer(el as HTMLElement & { input: EffectInput }, {
  sensitivityX: 0.4,
});
```

## Performance
Each `<layershift-effect>` instance creates one WebGL renderer, one hidden `<video>` element, and two GPU textures (one draw call per frame). The bilateral filter runs as a GPU shader pass.
| Instances | Suitability |
|-----------|-------------|
| 1–3 | Smooth on all modern devices including mobile |
| 4–6 | Great on desktop; mobile may hit browser video decoder limits |
| 8–12 | Desktop only; consider pausing off-screen instances |
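The off-screen pausing suggested above can be sketched with an `IntersectionObserver`. This assumes the element exposes `play()`/`pause()` methods mirroring its underlying video (verify against the actual component API; the optional chaining makes the calls no-ops if they don't exist):

```javascript
// Pause instances while they are scrolled out of view. The handler is a
// plain function so the decision logic is separable from the DOM wiring.
function makeVisibilityHandler() {
  return (entries) => {
    for (const entry of entries) {
      if (entry.isIntersecting) entry.target.play?.();
      else entry.target.pause?.();
    }
  };
}

// Browser wiring:
// const observer = new IntersectionObserver(makeVisibilityHandler(), { threshold: 0.1 });
// document.querySelectorAll('layershift-effect').forEach((el) => observer.observe(el));
```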
## Depth Data
Effects are driven by per-frame depth maps stored in `.lsb` files (LSFT binary format):
- **Precomputed** — generated by the Layershift Editor or the precompute CLI. Depth data is u8+deflate compressed (~0.5–2.7 MB per file). Dimensions, FPS, and frame count are embedded in the file header.
- **Browser estimation** — on-device ML inference via the `depth-model` attribute. No preprocessing needed.
## Frame-Level Sync
`<layershift-effect>` uses `requestVideoFrameCallback` (RVFC) when available to sync depth updates to actual video frame presentation:
- Depth work only runs when a new frame is decoded (~24–30fps)
- Parallax input stays smooth at display refresh rate (60–120fps)
- Frame events fire at true video frame rate
- Browsers without RVFC fall back to `requestAnimationFrame` automatically
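The RVFC-or-rAF pattern described above can be sketched as a small feature check. This is illustrative, not the component's internal code; the injectable `raf` parameter exists only to make the sketch testable:

```javascript
// Schedule `cb` for the next video frame if RVFC is supported,
// otherwise fall back to the next display refresh via rAF.
function scheduleFrameCallback(video, cb, raf = globalThis.requestAnimationFrame) {
  if (typeof video.requestVideoFrameCallback === 'function') {
    // Fires only when a new video frame has actually been presented.
    video.requestVideoFrameCallback((now, metadata) => cb(now, metadata));
    return 'rvfc';
  }
  // Fallback: fires every display refresh instead of every video frame.
  raf((now) => cb(now, null));
  return 'raf';
}
```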
## Testing

```sh
# Unit tests (Vitest)
npm test

# Unit tests in watch mode
npm run test:watch

# Storybook component tests
npm run test:storybook

# E2E tests (Playwright, requires build first)
npm run build && npm run build:component && npm run test:e2e

# All tests
npm run build && npm run build:component && npm run test:all
```

## License
Business Source License 1.1 — see LICENSE for details.
Change date: 2029-01-01 → Apache License 2.0
