create-vapi-agent
v1.0.1
Scaffold a Next.js app with a fully functional AI voice agent — Vapi, shadcn/ui, Rive AI persona, mic selector, and all the wiring. Just define your tools and go.
create-vapi-agent
Scaffold a production-ready AI voice agent in seconds. Powered by Vapi, Next.js, and shadcn/ui.
```bash
npx create-vapi-agent my-agent
# or
pnpm dlx create-vapi-agent my-agent
```

What you get
A fully-wired Next.js application with:
- Floating voice agent widget — beautiful expandable FAB with live transcript
- Animated AI persona — Rive-based avatar with 6 visual variants (command, glint, halo, mana, obsidian, opal)
- Microphone selector — device picker with permission handling
- Server-side tool routing — API endpoint that dispatches tool calls to your handlers
- Client-side tool routing — in-browser tool handlers for navigation, UI actions, etc.
- White-label ready — customizable agent name, system prompt, model, voice, and persona
- Dark mode support — theme-aware components out of the box
- TypeScript — fully typed throughout
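The two tool-routing layers above amount to a dispatch loop: look up each tool call's handler by name, run it, and return the results. A minimal standalone sketch, assuming a Vapi-style payload (the field names `toolCallList` and `toolCallId` are assumptions; check the generated `route.ts` for the real shapes):

```ts
// Minimal sketch of tool-call dispatch. Payload shape is an assumption.
type ToolHandler = (args: Record<string, unknown>) => string | Promise<string>;

const toolHandlers = new Map<string, ToolHandler>();
// Hypothetical example handler:
toolHandlers.set("lookupOrder", (args) => `Order ${args.orderNumber}: shipped`);

interface ToolCall {
  id: string;
  name: string;
  arguments: Record<string, unknown>;
}

export async function dispatchToolCalls(toolCallList: ToolCall[]) {
  const results = [];
  for (const call of toolCallList) {
    const handler = toolHandlers.get(call.name);
    const result = handler
      ? await handler(call.arguments)
      : `Unknown tool: ${call.name}`;
    // Vapi correlates each result with the tool call that produced it.
    results.push({ toolCallId: call.id, result });
  }
  return { results };
}
```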
Quick Start
1. Create your project
```bash
npx create-vapi-agent my-agent
```

The CLI will interactively prompt you to configure:
- Agent name
- System prompt
- Model provider & model (OpenAI, Anthropic, Google, etc.)
- Voice provider & voice ID (ElevenLabs, PlayHT, Deepgram, etc.)
- Persona visual variant
Or skip prompts with defaults:
```bash
npx create-vapi-agent my-agent --yes
```

2. Add your Vapi key
```bash
cd my-agent
# Edit .env.local and add your Vapi public key
```

Get your key at dashboard.vapi.ai → Settings → API Keys.
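The resulting .env.local will look roughly like this (the exact variable names are assumptions; follow the comments in the generated file — NEXT_PUBLIC_API_BASE_URL is the value used to build the server tool endpoint URL):

```bash
# Vapi public key from dashboard.vapi.ai → Settings → API Keys
NEXT_PUBLIC_VAPI_PUBLIC_KEY=your-public-key

# Base URL your server tool endpoint is reachable at
NEXT_PUBLIC_API_BASE_URL=http://localhost:3000
```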
3. Add your LLM & Voice provider keys
Vapi handles all LLM and voice API calls on your behalf. You need to add the provider keys in the Vapi dashboard (not in .env.local):
- Go to dashboard.vapi.ai → Settings → Provider Keys
- Add your LLM provider key:
- Google (Gemini) → Google AI / Vertex API key
- OpenAI (GPT) → OpenAI API key
- Anthropic (Claude) → Anthropic API key
- Add your Voice provider key:
- ElevenLabs → ElevenLabs API key
- Deepgram → Deepgram API key
- PlayHT → PlayHT API key
Without provider keys in Vapi, the agent cannot make LLM or voice calls.
4. Start developing
```bash
npm run dev
```

Open http://localhost:3000 and click the microphone button.
Project Structure
```
my-agent/
├── src/
│   ├── app/
│   │   ├── layout.tsx              # Root layout with AgentProvider
│   │   ├── page.tsx                # Demo landing page
│   │   ├── globals.css             # Tailwind + CSS variables
│   │   └── api/
│   │       └── agent/
│   │           └── tools/
│   │               └── route.ts    # Server-side tool endpoint
│   ├── lib/
│   │   ├── agent-config.ts         # ⭐ Agent configuration (prompt, model, voice, tools)
│   │   ├── agent-tools.ts          # ⭐ Server-side tool handlers
│   │   ├── agent-client-tools.ts   # ⭐ Client-side tool handlers
│   │   └── utils.ts                # Utility functions
│   ├── context/
│   │   └── AgentContext.tsx        # React context (Vapi connection, state)
│   └── components/
│       ├── AgentWidget.tsx         # Floating voice agent widget
│       ├── ai-elements/
│       │   ├── persona.tsx         # Animated Rive AI persona
│       │   └── mic-selector.tsx    # Microphone device selector
│       └── ui/                     # shadcn/ui components
├── .env.local                      # Your API keys (not committed)
└── components.json                 # shadcn/ui config
```

Key Files to Customize
lib/agent-config.ts
This is the brain of your agent. Configure:
- SYSTEM_PROMPT — What your agent knows and how it behaves
- serverTools — Tool definitions that run on your server
- CLIENT_TOOL_NAMES — Tools that run in the browser
- buildVapiAssistantConfig() — Full Vapi assistant configuration
```ts
// Example: Add a server tool
const serverTools = [
  {
    type: "function" as const,
    function: {
      name: "lookupOrder",
      description: "Look up an order by order number",
      parameters: {
        type: "object",
        properties: {
          orderNumber: { type: "string", description: "The order number" },
        },
        required: ["orderNumber"],
      },
    },
    server: {
      url: `${process.env.NEXT_PUBLIC_API_BASE_URL}/api/agent/tools`,
    },
  },
];
```

lib/agent-tools.ts
Register server-side tool handlers:
```ts
import { toolHandlers } from "./agent-tools";

toolHandlers.set("lookupOrder", async (args, context) => {
  const order = await db.orders.findUnique({
    where: { number: args.orderNumber },
  });
  return order ? `Order ${order.number}: ${order.status}` : "Order not found";
});
```

lib/agent-client-tools.ts
Register client-side tool handlers (these run in the browser):
```ts
import { clientToolHandlers } from "./agent-client-tools";

clientToolHandlers.set("openModal", async (args, helpers) => {
  document.dispatchEvent(new CustomEvent("open-modal", { detail: args }));
  return "Modal opened";
});
```

Advanced Usage
Using a Vapi Dashboard Assistant
Instead of inline configuration, you can create an assistant in the Vapi dashboard and reference it by ID:
```bash
NEXT_PUBLIC_VAPI_ASSISTANT_ID=your-assistant-id
```

When this is set, the inline config from agent-config.ts is still applied, as assistant overrides.
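The Vapi Web SDK's start() accepts either an inline assistant config or a dashboard assistant ID plus an overrides object, so the selection logic reduces to choosing the argument list. A sketch of that choice; the helper and type names below are hypothetical, not part of the template:

```ts
// Hypothetical helper mirroring what AgentContext.tsx might do: pick the
// arguments for vapi.start() depending on whether a dashboard assistant
// ID is configured.
type AssistantConfig = Record<string, unknown>;

export function buildStartArgs(
  assistantId: string | undefined,
  inlineConfig: AssistantConfig,
): [string | AssistantConfig, AssistantConfig?] {
  // With an ID: start(id, overrides). Without one: start(inlineConfig).
  return assistantId ? [assistantId, inlineConfig] : [inlineConfig];
}
```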
Custom Persona Variants
The persona component supports 6 Rive-based visual variants:
- command — Bold geometric animation
- glint — Sparkling particle effect
- halo — Soft glowing aura
- mana — Flowing energy visualization
- obsidian — Dark crystalline form
- opal — Iridescent shifting colors
Change the variant in agent-config.ts or pass it directly to the <AIPersona /> component.
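If you thread that choice through your own code, the six variants form a natural union type. A small sketch; the type and constant names are ours, not the template's:

```ts
// Hypothetical union of the six persona variants listed above.
type PersonaVariant =
  | "command"
  | "glint"
  | "halo"
  | "mana"
  | "obsidian"
  | "opal";

// e.g. a config value you might pass to <AIPersona /> (field name assumed):
export const PERSONA_VARIANT: PersonaVariant = "halo";
```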
Adding Pages with Voice Navigation
The built-in navigateTo client tool lets your agent navigate users:
```ts
// In your system prompt, mention:
// "You can navigate users to different pages using the navigateTo tool"
// The tool is already registered — just add pages to your app!
```

Tech Stack
- Next.js — React framework with App Router
- Vapi — Voice AI platform
- Rive — Real-time animations
- shadcn/ui — UI component library
- Tailwind CSS v4 — Utility-first CSS
- Radix UI — Accessible component primitives
- cmdk — Command menu
License
MIT
