# @starmorph/chat-widget

v0.2.0

A portable AI chat widget for Next.js apps. Drops a floating "Chat" button into any page that opens a chat window powered by the Vercel AI SDK. Supports any AI provider — OpenAI, Anthropic, Google Gemini, and more.
## Install

```sh
pnpm add @starmorph/chat-widget ai @ai-sdk/react
```

Then install the AI provider you want to use:
```sh
# Pick one (or multiple)
pnpm add @ai-sdk/anthropic   # Claude
pnpm add @ai-sdk/openai      # GPT
pnpm add @ai-sdk/google      # Gemini
```

## Setup
### 1. Add the widget to your layout
```tsx
// app/layout.tsx
import { ChatWidget } from "@starmorph/chat-widget";

export default function Layout({ children }) {
  return (
    <html>
      <body>
        {children}
        <ChatWidget
          title="Support Assistant"
          placeholder="Ask a question..."
          buttonLabel="Chat"
        />
      </body>
    </html>
  );
}
```

### 2. Create the API route
```ts
// app/api/chat/route.ts
import { createChatHandler } from "@starmorph/chat-widget/server";
import { anthropic } from "@ai-sdk/anthropic";

export const maxDuration = 30;

export const POST = createChatHandler({
  model: anthropic("claude-haiku-4-5-20251001"),
  appName: "My App",
  systemPrompt: "You are a helpful assistant for My App. Be concise.",
  onLog: (log) => {
    console.log("[chat]", JSON.stringify(log));
  },
});
```

### 3. Add your API key
```sh
# .env.local
ANTHROPIC_API_KEY=sk-ant-...
# or OPENAI_API_KEY=sk-...
# or GOOGLE_GENERATIVE_AI_API_KEY=AIza...
```

The Vercel AI SDK reads these automatically — no extra config needed.
### 4. Import styles (skip if you use shadcn/ui)
The widget uses shadcn/ui semantic color classes (`bg-background`, `bg-primary`, etc.). If your app already has shadcn/ui configured, skip this step — your CSS variables are already defined.
Otherwise, import the bundled fallback styles:
```ts
// app/layout.tsx (or globals.css)
import "@starmorph/chat-widget/styles";
```

This adds light + dark theme CSS variables with zero specificity (`:where()`), so any existing variables in your app automatically take precedence.
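To make the `:where()` mechanism concrete, the bundled stylesheet is conceptually along these lines (an illustrative sketch, not the actual file; the variable names follow shadcn/ui conventions and the real file ships more tokens):

```css
/* Sketch only: :where() keeps selector specificity at zero, so any
   --background / --primary etc. you define on :root yourself win. */
:where(:root) {
  --background: 0 0% 100%;
  --foreground: 222.2 84% 4.9%;
  --primary: 222.2 47.4% 11.2%;
  /* ...remaining light-theme tokens... */
}

:where(.dark) {
  --background: 222.2 84% 4.9%;
  --foreground: 210 40% 98%;
  /* ...dark-theme overrides... */
}
```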
### 5. Configure Tailwind
The widget uses Tailwind classes. Add the package to your Tailwind content paths so the classes get compiled.
Tailwind 4 (CSS `@source` directive):
```css
/* app/globals.css */
@import "tailwindcss";
@source "../node_modules/@starmorph/chat-widget/dist";
```

Tailwind 3 (config file):
```ts
// tailwind.config.ts
content: [
  "./app/**/*.{ts,tsx}",
  "./node_modules/@starmorph/chat-widget/dist/**/*.{js,mjs}",
],
```

## How it works
### Architecture
The package exports two things from two entry points:
| Import | What | Runs on |
|--------|------|---------|
| `@starmorph/chat-widget` | `<ChatWidget>` component | Client (`"use client"`) |
| `@starmorph/chat-widget/server` | `createChatHandler()` | Server (API route) |
The client component uses `useChat()` from `@ai-sdk/react` to stream messages to/from your `/api/chat` endpoint. The server handler uses `streamText()` from the `ai` package to call the model and stream the response back.
```
  Browser                              Server
┌─────────────┐  POST /api/chat  ┌───────────────────┐      ┌──────────┐
│ ChatWidget  │ ───────────────> │ createChatHandler │ ───> │ AI Model │
│ (useChat)   │ <─── stream ──── │   (streamText)    │ <─── │          │
└─────────────┘                  └───────────────────┘      └──────────┘
```

### Context via system prompt
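Conceptually, the server half is a thin wrapper around `streamText()`. Here is a rough sketch of what a handler like `createChatHandler` does internally, based on the AI SDK v5 shape (illustrative only, not the package's actual source; helper names differ across SDK versions):

```ts
import {
  streamText,
  convertToModelMessages,
  type LanguageModel,
  type UIMessage,
} from "ai";

// Sketch of a factory like createChatHandler: it closes over the config
// and returns a Next.js route handler.
export function createChatHandlerSketch(config: {
  model: LanguageModel;
  systemPrompt?: string;
}) {
  return async (req: Request): Promise<Response> => {
    // useChat() on the client POSTs the conversation as UI messages.
    const { messages }: { messages: UIMessage[] } = await req.json();

    // Prepend the system prompt and forward the conversation to the model.
    const result = streamText({
      model: config.model,
      system: config.systemPrompt ?? "You are a helpful assistant.",
      messages: convertToModelMessages(messages),
    });

    // Stream the reply back in the format useChat() expects.
    return result.toUIMessageStreamResponse();
  };
}
```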
The AI assistant knows about your app through the `systemPrompt` config. This is sent as the system message on every request, giving the model context about what it's helping with. For example:
```ts
createChatHandler({
  model: anthropic("claude-haiku-4-5-20251001"),
  systemPrompt:
    "You are a helpful assistant for Mermaid Editor, a free online diagram designer. " +
    "Help users create flowcharts, sequence diagrams, and more. " +
    "Provide Mermaid syntax examples when helpful.",
});
```

The model receives this system prompt before every user message, so it always has the right context for your app.
### Model selection
You pass a model instance directly — the widget doesn't import any provider packages itself. This means:
- No unused providers in your bundle
- No Turbopack/webpack issues with unresolved imports
- You control exactly which model and version is used
- Switching providers is a one-line change
```ts
// Anthropic
import { anthropic } from "@ai-sdk/anthropic";
createChatHandler({ model: anthropic("claude-haiku-4-5-20251001") });

// OpenAI
import { openai } from "@ai-sdk/openai";
createChatHandler({ model: openai("gpt-4o-mini") });

// Google
import { google } from "@ai-sdk/google";
createChatHandler({ model: google("gemini-2.0-flash") });
```

### Conversation logging
Every completed exchange is logged via the `onLog` callback. Each log entry contains:
```ts
interface ChatLog {
  timestamp: string;          // ISO 8601
  appName: string;            // Identifies which app (e.g. "Mermaid Editor")
  userMessage: string;        // What the user sent
  assistantResponse: string;  // What the model replied
  model: string;              // Model ID from the response (e.g. "claude-haiku-4-5-20251001")
}
```

Currently: logs are sent to `console.log` on the server. You'll see them in your terminal / Vercel function logs.
Future: replace the callback with a database insert to persist conversations:

```ts
onLog: async (log) => {
  await supabase.from("chat_logs").insert(log);
},
```

The `appName` field lets you filter logs by which app they came from when multiple apps share the same database.
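As a plain in-memory illustration of that filtering (the data and the `logsForApp` helper are hypothetical), selecting one app's logs is just a match on that field:

```typescript
interface ChatLog {
  timestamp: string;
  appName: string;
  userMessage: string;
  assistantResponse: string;
  model: string;
}

// Hypothetical helper: keep only the logs emitted by one app.
function logsForApp(logs: ChatLog[], appName: string): ChatLog[] {
  return logs.filter((log) => log.appName === appName);
}

const logs: ChatLog[] = [
  {
    timestamp: "2025-01-01T00:00:00Z",
    appName: "Mermaid Editor",
    userMessage: "How do I draw a flowchart?",
    assistantResponse: "Start with `graph TD` ...",
    model: "claude-haiku-4-5-20251001",
  },
  {
    timestamp: "2025-01-01T00:01:00Z",
    appName: "My App",
    userMessage: "Hi",
    assistantResponse: "Hello!",
    model: "gpt-4o-mini",
  },
];

console.log(logsForApp(logs, "Mermaid Editor").length); // 1
```

With a SQL-backed store the equivalent is a `WHERE` clause on `appName` (e.g. `.eq("appName", ...)` in Supabase).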
## ChatWidget props
| Prop | Type | Default | Description |
|------|------|---------|-------------|
| `apiEndpoint` | `string` | `"/api/chat"` | Chat API endpoint |
| `position` | `"bottom-right" \| "bottom-left"` | `"bottom-right"` | Button position |
| `buttonLabel` | `string` | `"Chat"` | Text on the floating button |
| `title` | `string` | `"Chat"` | Chat window header title |
| `placeholder` | `string` | `"Type a message..."` | Input placeholder |
| `defaultOpen` | `boolean` | `false` | Start with window open |
| `className` | `string` | — | Override button styles |
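For example, a widget pinned to the bottom-left that opens immediately and talks to a custom endpoint (all values illustrative; the custom `apiEndpoint` would need a matching route file):

```tsx
<ChatWidget
  apiEndpoint="/api/support-chat"
  position="bottom-left"
  title="Docs Helper"
  buttonLabel="Ask"
  placeholder="Ask about the docs..."
  defaultOpen
/>
```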
## createChatHandler config
| Option | Type | Default | Description |
|--------|------|---------|-------------|
| `model` | `LanguageModel` | required | AI model instance |
| `systemPrompt` | `string` | `"You are a helpful assistant."` | System prompt for context |
| `maxOutputTokens` | `number` | — | Max response tokens |
| `appName` | `string` | `"Unknown"` | App identifier for logs |
| `onLog` | `(log: ChatLog) => void` | — | Callback for conversation logging |
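Putting the options together: for instance, capping response length on a cheaper model (illustrative values):

```ts
export const POST = createChatHandler({
  model: openai("gpt-4o-mini"),
  systemPrompt: "You are a support assistant for Example App. Be brief.",
  maxOutputTokens: 512, // hard cap on each reply's length
  appName: "Example App",
  onLog: (log) => console.log("[chat]", JSON.stringify(log)),
});
```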
## How it's built
- `tsup` bundles two entry points: client (`"use client"` banner) and server (no banner)
- ESM + CJS output with full TypeScript declarations
- Ships only `dist/` — no source code, no env files, no dev dependencies
- Peer dependencies: `react`, `react-dom`, `ai`, `@ai-sdk/react`
- Direct dependencies: `clsx`, `tailwind-merge`, `lucide-react`, `react-markdown`
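A tsup setup producing this layout might look roughly like the following (a sketch, not the package's actual config; using an array of per-entry configs so only the client bundle gets the `"use client"` banner):

```ts
// tsup.config.ts (illustrative sketch)
import { defineConfig } from "tsup";

export default defineConfig([
  {
    entry: { index: "src/index.ts" }, // client components
    format: ["esm", "cjs"],
    dts: true,
    banner: { js: '"use client";' }, // mark output as a client module
  },
  {
    entry: { "create-chat-handler": "src/create-chat-handler.ts" }, // server handler
    format: ["esm", "cjs"],
    dts: true, // no banner: this entry runs on the server
  },
]);
```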
```
dist/
├── index.mjs                  # Client components (ESM, "use client")
├── index.cjs                  # Client components (CJS)
├── index.d.ts                 # Client type declarations
├── create-chat-handler.mjs    # Server handler (ESM)
├── create-chat-handler.cjs    # Server handler (CJS)
└── create-chat-handler.d.ts   # Server type declarations
```

## UI features
- Floating button with chat icon (bottom-right by default)
- Button stays fixed in place when chat window opens (no layout shift)
- Swaps to X icon when open, Escape key closes
- Markdown rendering in assistant messages (headings, bold, lists, code blocks, links, blockquotes)
- Auto-scroll to latest message
- Loading spinner while waiting for response
- Error state display
- Enter to send, Shift+Enter for newline
- Auto-growing textarea input
## License
MIT
