# @omote/three

Three.js adapter for the Omote AI Character SDK.
## Install

```bash
npm install @omote/three @omote/core three
```

## Quick Start
```js
import * as THREE from 'three';
import { GLTFLoader } from 'three/addons/loaders/GLTFLoader.js';
import { OmoteAvatar } from '@omote/three';
import { createKokoroTTS } from '@omote/core';

// Assumes a standard three.js setup (scene, camera, renderer, clock).

// Load avatar
const loader = new GLTFLoader();
loader.load('/avatar.glb', async (gltf) => {
  scene.add(gltf.scene);

  // Create avatar with full composition (gaze, emotion, life layer)
  const avatar = new OmoteAvatar({ target: gltf.scene });

  // Wire conversational voice (speaker + listener via connectVoice)
  await avatar.connectVoice({
    mode: 'local',
    tts: createKokoroTTS({ defaultVoice: 'af_heart' }),
    onTranscript: async (text) => {
      const res = await fetch('/api/chat', { method: 'POST', body: text });
      return await res.text();
    },
  });

  // Render loop
  function animate() {
    avatar.update(clock.getDelta(), camera);
    renderer.render(scene, camera);
    requestAnimationFrame(animate);
  }
  animate();
});
```

## API
### OmoteAvatar

Full-featured avatar class with `CharacterController` (compositor + gaze + life layer).
| Method | Description |
|--------|-------------|
| `update(delta, camera, avatarRotationY?)` | Call once per frame in your render loop |
| `connectFrameSource(source)` | Wire any frame pipeline (`PlaybackPipeline`, `MicLipSync`, `VoiceOrchestrator`) |
| `disconnectFrameSource()` | Disconnect the current frame source |
| `setFrame(blendshapes)` | Direct blendshape input |
| `setEmotion(emotion)` | Set emotion (string preset or weights) |
| `setSpeaking(speaking)` | Drive mouth animation intensity |
| `setState(state)` | Set conversational state (`idle`, `listening`, `thinking`, `speaking`) |
| `setAudioEnergy(energy)` | Set audio energy level (0-1; drives emphasis) |
| `reset()` | Reset all state (smoothing, life layer, emotions) |
| `dispose()` | Clean up resources |

Accessors: `compositor`, `parts`, `hasMorphTargets`, `mappedBlendshapeCount`
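For illustration, a frame passed to `setFrame` can be built from ARKit blendshape names with weights clamped to [0, 1]. This is a sketch under assumptions: the names below are standard ARKit blendshapes, but the exact frame shape `@omote/three` expects is not documented here, and `clamp01`/`makeFrame` are hypothetical helpers, not part of the SDK.

```javascript
// Hypothetical helpers (not part of @omote/three): build a blendshape
// frame keyed by standard ARKit morph names, clamping weights to [0, 1].
const clamp01 = (w) => Math.min(1, Math.max(0, w));

function makeFrame(weights) {
  const frame = {};
  for (const [name, w] of Object.entries(weights)) frame[name] = clamp01(w);
  return frame;
}

const frame = makeFrame({
  jawOpen: 0.35,         // mouth opening
  mouthSmileLeft: 1.2,   // out of range: clamped to 1
  mouthSmileRight: -0.1, // out of range: clamped to 0
});
// avatar.setFrame(frame); // assuming a name-keyed frame is accepted
```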
### discoverScene(scene)

Traverse a Three.js scene to find bones and morph targets. Returns pre-computed index arrays for zero-lookup hot-path blendshape writing.
### writeBlendshapes(blendshapes, morphEntries)

Write 52 ARKit blendshape weights to morph targets using pre-computed indices.
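The pre-computed-index pattern behind `discoverScene`/`writeBlendshapes` can be sketched as follows. The entry shape and the `precomputeEntries`/`writeWeights` functions are hypothetical (the library's real signatures may differ), but `morphTargetDictionary` and `morphTargetInfluences` are the standard three.js mesh properties such a traversal reads; a plain object stands in for a mesh so the sketch runs without three.js.

```javascript
// Discovery (once, at load time): resolve each ARKit name to a direct
// (influences array, morph index) pair so the hot path does no lookups.
function precomputeEntries(meshes, names) {
  const entries = [];
  for (const mesh of meshes) {
    names.forEach((name, nameIndex) => {
      const morphIndex = mesh.morphTargetDictionary[name];
      if (morphIndex !== undefined) {
        entries.push({ influences: mesh.morphTargetInfluences, nameIndex, morphIndex });
      }
    });
  }
  return entries;
}

// Hot path (every frame): plain array writes only, no name lookups.
function writeWeights(weights, entries) {
  for (const e of entries) e.influences[e.morphIndex] = weights[e.nameIndex];
}

// Mock object standing in for a three.js mesh with morph targets:
const mesh = { morphTargetDictionary: { jawOpen: 2 }, morphTargetInfluences: [0, 0, 0] };
const entries = precomputeEntries([mesh], ['jawOpen']);
writeWeights([0.5], entries);
// mesh.morphTargetInfluences is now [0, 0, 0.5]
```

The point of the split is that discovery pays the string-lookup cost once, while the per-frame write is pure indexed array access.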
## Voice Integration

`connectVoice()` combines speaker, listener, and interruption handling:
```js
import { OmoteAvatar } from '@omote/three';
import { createKokoroTTS } from '@omote/core';

const avatar = new OmoteAvatar({ target: gltf.scene });

await avatar.connectVoice({
  mode: 'local',
  tts: createKokoroTTS({ defaultVoice: 'af_heart' }),
  interruptionEnabled: true,
  onTranscript: async (text) => {
    const res = await fetch('/api/chat', { method: 'POST', body: text });
    return await res.text();
  },
});

// Or use the individual APIs:
await avatar.connectSpeaker(ttsBackend, { lam: createA2E() });
await avatar.speak('Hello!');

await avatar.connectListener({ models: { senseVoice: {...}, vad: {...} } });
await avatar.startListening();
```

### BlendshapeController (low-level)
Direct morph target weight control for custom integrations.
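As a sketch of what per-frame weight control typically involves, here is frame-rate-independent exponential smoothing toward target weights. This is a common technique for this kind of controller, but the `WeightSmoother` class is hypothetical and whether `BlendshapeController` smooths this way is an assumption.

```javascript
// Hypothetical smoother (not the real BlendshapeController API):
// eases current weights toward targets, closing half the remaining
// gap every `halfLife` seconds regardless of frame rate.
class WeightSmoother {
  constructor(size, halfLife = 0.05) {
    this.current = new Float32Array(size);
    this.halfLife = halfLife;
  }
  update(target, delta) {
    // Fraction of the remaining gap to close this frame.
    const k = 1 - Math.pow(0.5, delta / this.halfLife);
    for (let i = 0; i < this.current.length; i++) {
      this.current[i] += (target[i] - this.current[i]) * k;
    }
    return this.current;
  }
}

// After one update spanning exactly one half-life, each weight is
// halfway from its start (0) to its target.
const smoother = new WeightSmoother(2, 0.05);
const out = smoother.update([1.0, 0.4], 0.05);
// out is approximately [0.5, 0.2]
```

Tying the step size to elapsed time rather than a fixed per-frame factor keeps the animation feel consistent across 30 fps and 144 fps displays.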
## License

MIT
