webxr-volume v0.1.0
A polyfill that adds bounded 3D regions to WebXR, similar to visionOS volumes but for the open web. Each volume is a composable 3D session that runs inside any WebView, so native AR runtimes like Lens Studio or Unity can host web-based spatial content with head tracking and input.
Install
npm install webxr-volume

Quick Start
Works with Three.js out of the box:
import { createVolumeSession, VolumeDevice } from 'webxr-volume';
import * as THREE from 'three';
const canvas = document.getElementById('canvas');
const device = new VolumeDevice({ width: 0.3, height: 0.3, depth: 0.3 });
const renderer = new THREE.WebGLRenderer({ canvas, antialias: true });
renderer.xr.enabled = true;
const session = await createVolumeSession(canvas, device);
await renderer.xr.setSession(session);
// Host injects head pose each frame
device.setViewerPose(
{ x: 0, y: 0.1, z: 0.5 },
{ x: 0, y: 0, z: 0, w: 1 }
);

API
VolumeDevice(bounds?)
Creates the volume. Bounds default to a 30 cm cube.
device.setViewerPose(position, orientation?) - inject head tracking from a native AR runtime
device.setBounds(width, height, depth) - resize the volume in meters
device.addInputSource(config) - register an input source (gaze, pointer, screen)
device.dispatchInput(sourceId, type) - fire select/selectstart/selectend events
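For example, a native host's gesture callbacks can drive a registered input source. A minimal sketch: only the method names come from the API above; the config fields (id, targetRayMode) and the pinch-phase callback are assumptions.

```javascript
// Sketch: drive select events from a hypothetical host pinch gesture.
// The addInputSource config shape is an assumption; check the package
// source for the actual fields.
function wireGazeSelect(device) {
  device.addInputSource({ id: 'gaze', targetRayMode: 'gaze' });
  // Returns a callback the native host invokes with 'start' / 'end'.
  return function onHostPinch(phase) {
    if (phase === 'start') {
      device.dispatchInput('gaze', 'selectstart');
    } else if (phase === 'end') {
      device.dispatchInput('gaze', 'selectend');
      device.dispatchInput('gaze', 'select');
    }
  };
}
```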
createVolumeSession(canvas, device, options?)
Returns an XRSession-compatible object. Three.js, Babylon, and raw WebGL all work. Touch and mouse input auto-map to XR input sources on the canvas.
Options: { features?: string[], depthNear?: number, depthFar?: number, bounds?: {width, height, depth} }
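For instance, tightening the clip planes and overriding the bounds might look like this; the specific values are illustrative, not defaults.

```javascript
// Illustrative option values; bounds are in meters, matching setBounds.
const options = {
  depthNear: 0.01, // near clip plane in meters
  depthFar: 10,    // far clip plane in meters
  bounds: { width: 0.5, height: 0.3, depth: 0.3 },
};
// Then: const session = await createVolumeSession(canvas, device, options);
```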
Relay
The relay routes WebSocket messages between a native host publishing head pose and a WebView subscribing to render the volume. Useful when the host and WebView can't communicate directly (for example, a Lens Studio script talking to a WebView on Spectacles). It's a Cloudflare Worker with Durable Objects, one instance per channel.
Deploy your own:
cd relay && npx wrangler deploy

Connect with ?channel=your-channel. Publishers send pose and bounds events, subscribers receive them. New subscribers get the last known pose replayed on connect.
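On the subscriber side, a small handler can route relay messages into the device. A sketch under assumptions: the wire format isn't specified above, so the { type, ... } message shape here is hypothetical; check the relay source for the real schema.

```javascript
// Hypothetical subscriber helper: route a relay message into a
// VolumeDevice. Message shape is an assumption, not the documented schema.
function handleRelayMessage(raw, device) {
  const msg = JSON.parse(raw);
  if (msg.type === 'pose') {
    device.setViewerPose(msg.position, msg.orientation);
  } else if (msg.type === 'bounds') {
    device.setBounds(msg.width, msg.height, msg.depth);
  }
  return msg.type;
}

// Wiring it up in the WebView:
//   const ws = new WebSocket('wss://<your-worker>/?channel=your-channel');
//   ws.onmessage = (event) => handleRelayMessage(event.data, device);
```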
How It Works
The polyfill creates a virtual XRSession backed by a VolumeDevice instead of real XR hardware. The device is the host's control surface: it accepts pose, bounds, and input, and the session reads from it each frame. Three.js and other WebXR renderers treat it like any other XR session.
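The split can be pictured with a toy model (not the actual implementation): the host writes into the device at any time, and the session reads whatever is current on each frame.

```javascript
// Toy model of the device/session split. The real VolumeDevice does much
// more (bounds, input sources); this only shows the per-frame read.
class ToyVolumeDevice {
  constructor() { this.pose = null; }
  setViewerPose(position, orientation = { x: 0, y: 0, z: 0, w: 1 }) {
    this.pose = { position, orientation }; // host writes, any time
  }
}
function sessionFrame(device) {
  // The real session turns this into an XRViewerPose with view and
  // projection matrices; here we just surface the latest value.
  return device.pose;
}
```

If the host stops publishing, the session keeps rendering the last written pose, which is also why the relay replays the last known pose to new subscribers.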
License
MIT
