# Miraiminds Voice - Web SDK

`@mirai-minds/voice` v0.0.5
This package lets you start Vapi calls directly in your web app.
## Installation

You can install the package via npm:

```bash
npm install @mirai-minds/voice
```

## Usage
First, import the `Vapi` or `Custom` class from the package:

```javascript
import { Vapi } from '@mirai-minds/voice';
```

or

```javascript
import { Custom } from '@mirai-minds/voice';
```

Then, create a new instance of the `Vapi` or `Custom` class, passing your Public Key as a parameter to the constructor:

```javascript
const assistant = new Vapi('your-public-key');
```

or

```javascript
const assistant = new Custom();
```

You can start a new call by calling the `start` method and passing an assistant object or `assistantId`:
```javascript
assistant.start({
  model: {
    provider: "openai",
    model: "gpt-3.5-turbo",
    messages: [
      {
        role: "system",
        content: "You are an assistant.",
      },
    ],
  },
  voice: {
    provider: "11labs",
    voiceId: "burt",
  },
  // ...
});
```

or, with a saved assistant ID:

```javascript
assistant.start('your-assistant-id');
```

The `start` method will initiate a new call.
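Since `start` accepts either form, a small helper can dispatch between an inline configuration and a saved assistant ID. This is a hypothetical sketch, not part of the SDK; the stand-in client makes it runnable without a live call:

```javascript
// Hypothetical helper (not part of the SDK): start a call from either an
// inline assistant configuration object or a saved assistant ID string,
// optionally passing assistant overrides when an ID is used.
function startCall(client, assistantOrId, overrides) {
  if (typeof assistantOrId === 'string') {
    // Saved assistant: the ID string goes first, overrides second.
    return client.start(assistantOrId, overrides);
  }
  // Inline configuration object.
  return client.start(assistantOrId);
}

// Stand-in client so the sketch runs without a live session.
const fakeClient = {
  start: (arg) => (typeof arg === 'string' ? `id:${arg}` : 'inline-config'),
};

console.log(startCall(fakeClient, 'your-assistant-id')); // 'id:your-assistant-id'
console.log(startCall(fakeClient, { model: { provider: 'openai' } })); // 'inline-config'
```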
You can override existing assistant parameters or set template variables with the `assistantOverrides` parameter.

Assume the first message is `Hey, {{name}} how are you?` and you want to set the value of `name` to `John`:
```javascript
const assistantOverrides = {
  recordingEnabled: false,
  variableValues: {
    name: 'John',
  },
};

assistant.start('your-assistant-id', assistantOverrides);
```

You can send text messages to the assistant, aside from the audio input, using the `send` method with an appropriate `role` and `content`.
```javascript
assistant.send({
  type: 'add-message',
  message: {
    role: 'system',
    content: 'The user has pressed the button, say peanuts',
  },
});
```

Possible values for the `role` are `system`, `user`, `assistant`, `tool`, or `function`.
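As a sketch, a small guard (a hypothetical helper, not part of the SDK) can enforce these role values before calling `send`:

```javascript
// Roles the send() payload accepts, per the list above.
const ALLOWED_ROLES = ['system', 'user', 'assistant', 'tool', 'function'];

// Hypothetical guard (not part of the SDK): build and validate an
// 'add-message' payload before handing it to assistant.send().
function buildAddMessage(role, content) {
  if (!ALLOWED_ROLES.includes(role)) {
    throw new Error(`Unsupported role: ${role}`);
  }
  return { type: 'add-message', message: { role, content } };
}

const payload = buildAddMessage('system', 'The user has pressed the button, say peanuts');
console.log(payload.message.role); // 'system'
```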
You can stop the session by calling the `stop` method:

```javascript
assistant.stop();
```

This will stop the recording and close the connection.
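In a browser app you may want to end the session when the user leaves the page. This wiring is a sketch with stand-ins (the real call needs a live session, and `target` would be `window` in a browser):

```javascript
// Stand-in for the SDK client: stop() ends the recording and connection.
const client = { stopped: false, stop() { this.stopped = true; } };

// Hypothetical cleanup wiring (not part of the SDK): end the session
// when the user navigates away.
function registerCleanup(target, c) {
  target.addEventListener('beforeunload', () => c.stop());
}

// Tiny event-target mock so this runs outside a browser.
const listeners = {};
const fakeWindow = { addEventListener: (ev, fn) => { listeners[ev] = fn; } };

registerCleanup(fakeWindow, client);
listeners['beforeunload'](); // simulate the page closing
console.log(client.stopped); // true
```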
The `setMuted(muted: boolean)` method can be used to mute and unmute the user's microphone.

```javascript
assistant.isMuted(); // false
assistant.setMuted(true);
assistant.isMuted(); // true
```

The `say(message: string, endCallAfterSpoken?: boolean)` method can be used to invoke speech and gracefully terminate the call if needed:
```javascript
assistant.say("Our time's up, goodbye!", true);
```

## Events
You can listen to the following events:

```javascript
assistant.on('speech-start', () => {
  console.log('Speech has started');
});

assistant.on('speech-end', () => {
  console.log('Speech has ended');
});

assistant.on('call-start', () => {
  console.log('Call has started');
});

assistant.on('call-end', () => {
  console.log('Call has stopped');
});

assistant.on('volume-level', (volume) => {
  console.log(`Assistant volume level: ${volume}`);
});

// Function calls and transcripts will be sent via messages
assistant.on('message', (message) => {
  console.log(message);
});

assistant.on('error', (e) => {
  console.error(e);
});
```

These events allow you to react to changes in the state of the call or speech.
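For example, since transcripts arrive through the generic `message` event, you can collect them by filtering on the payload. The mock emitter below stands in for the SDK client so the sketch is runnable, and the `transcript` message shape is an assumption for illustration:

```javascript
// Minimal emitter mock: the real client exposes the same on(event, handler) API.
class MockAssistant {
  constructor() { this.handlers = {}; }
  on(event, handler) { (this.handlers[event] ??= []).push(handler); }
  emit(event, payload) { (this.handlers[event] ?? []).forEach((h) => h(payload)); }
}

const mock = new MockAssistant();
const transcripts = [];

// Filter 'message' events by type; the 'transcript' shape here is assumed.
mock.on('message', (message) => {
  if (message.type === 'transcript') transcripts.push(message.transcript);
});

mock.emit('message', { type: 'transcript', transcript: 'Hello there' });
mock.emit('message', { type: 'function-call' }); // ignored by the filter
console.log(transcripts); // [ 'Hello there' ]
```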
