headwai-chat-bubble v1.0.0
A Svelte-based chat bubble component with AI integration
Chat Bubble Svelte
Chat Bubble Svelte is a Svelte wrapper around the deep-chat component, providing an easy-to-use AI chat interface for your web applications.
Features
- 🚀 Easy integration
- 🔧 Configurable via environment variables
- 💬 Support for various AI providers (LocalChat, OpenAI and more)
- 📱 Responsive design
- 🎨 Customizable UI components
- 🔄 Real-time messaging with WebSocket support
Installation
Prerequisites
- Node.js (version 16 or higher)
- npm or yarn
Setup
1. Clone this repository:
   ```bash
   git clone <your-repo-url>
   cd chat-bubble
   ```
2. Install dependencies:
   ```bash
   npm install
   ```
3. Create a `.env` file in the root directory:
   ```bash
   cp .env.example .env
   ```
4. Configure your environment variables (see the Configuration section).
5. Start the development server:
   ```bash
   npm run dev
   ```
Configuration
The deep-chat component can be configured using environment variables. Create a .env file in your project root with the following variables:
Required Environment Variables
```bash
# API Configuration
VITE_CHAT_BUBBLE_API_URL=https://your-api-endpoint.com/api/chat/completions
VITE_CHAT_BUBBLE_API_KEY=your-api-key-here
VITE_CHAT_BUBBLE_MODEL_NAME=gpt-3.5-turbo

# Connection Type (websocket, openAI, etc.)
VITE_CHAT_BUBBLE_CONNECTION_TYPE=websocket
```
Optional Environment Variables
```bash
# Additional Headers
VITE_CHAT_BUBBLE_CONTENT_TYPE=application/json

# Request Configuration
VITE_CHAT_BUBBLE_MAX_MESSAGES=0

# UI Configuration
VITE_CHAT_BUBBLE_PLACEHOLDER_TEXT=Welcome to the chat!
VITE_CHAT_BUBBLE_DEMO_MODE=false
```
Usage
Basic Integration
To integrate the deep-chat component into your Svelte application:
1. Install the package (if using as a standalone package):
   ```bash
   npm install deep-chat
   ```
2. Import and use in your Svelte component:
```svelte
<script>
  import { DeepChat } from "deep-chat";

  // Configuration using environment variables
  const connect = {
    type: import.meta.env.VITE_CHAT_BUBBLE_CONNECTION_TYPE || "websocket",
    url: import.meta.env.VITE_CHAT_BUBBLE_API_URL,
    headers: {
      Authorization: `Bearer ${import.meta.env.VITE_CHAT_BUBBLE_API_KEY}`,
      "Content-Type": import.meta.env.VITE_CHAT_BUBBLE_CONTENT_TYPE || "application/json"
    },
    additionalBodyProps: {
      model: import.meta.env.VITE_CHAT_BUBBLE_MODEL_NAME
    }
  };

  const requestBodyLimits = {
    maxMessages: parseInt(import.meta.env.VITE_CHAT_BUBBLE_MAX_MESSAGES) || 0
  };

  const textInputConfig = {
    placeholder: {
      text: import.meta.env.VITE_CHAT_BUBBLE_PLACEHOLDER_TEXT || "Type your message..."
    }
  };

  // Message transformation for OpenAI compatibility
  const requestInterceptor = (details) => {
    if (details.body && details.body.messages) {
      details.body.messages = details.body.messages.map(message => ({
        role: message.role === "ai" ? "assistant" : message.role,
        content: message.text || message.content
      }));
    }
    return details;
  };

  const responseInterceptor = (response) => {
    if (response.choices && response.choices[0] && response.choices[0].message) {
      const message = response.choices[0].message;
      return {
        text: message.content,
        role: message.role === "assistant" ? "ai" : message.role
      };
    }
    if (response.text || response.html || response.files) {
      return response;
    }
    if (typeof response === 'string') {
      return { text: response };
    }
    return response;
  };
</script>

<deep-chat
  demo={import.meta.env.VITE_CHAT_BUBBLE_DEMO_MODE === 'true'}
  textInput={textInputConfig}
  connect={connect}
  requestBodyLimits={requestBodyLimits}
  requestInterceptor={requestInterceptor}
  responseInterceptor={responseInterceptor}
/>
```
Including Chat Bubble in Customer Websites
Option 1: Build and Deploy (Complete Application)
This option bundles everything into a ready-to-use application including all interceptors and logic.
What gets bundled:
- ✅ Request/Response interceptors (OpenAI compatibility)
- ✅ Environment variable configuration
- ✅ All chat functionality
- ✅ UI components and styling
Build and deployment steps:
1. Configure environment variables for production:
   ```bash
   # Production .env file
   VITE_CHAT_BUBBLE_API_URL=https://api.customer-domain.com/chat
   VITE_CHAT_BUBBLE_API_KEY=prod-api-key
   VITE_CHAT_BUBBLE_MODEL_NAME=gpt-4
   VITE_CHAT_BUBBLE_CONNECTION_TYPE=websocket
   VITE_CHAT_BUBBLE_DEMO_MODE=false
   ```
2. Build the widget:
   ```bash
   npm run build:widget
   ```
3. Deploy the `dist-widget` folder to your web server or CDN.
Result: Customers get a complete chat application - no additional coding required!
Option 2: Embed Chat Bubble as Script (Plug-and-Play)
After building the widget, you can include the chat component in any website as a complete widget.
Build steps:
1. Build the widget:
   ```bash
   npm run build:widget
   ```
2. Publish to NPM (for jsDelivr access):
   ```bash
   npm publish
   ```
Customer integration:
```html
<!DOCTYPE html>
<html>
  <head>
    <title>Customer Website</title>
    <!-- Configure via global variables -->
    <script>
      // Runtime configuration override
      window.DEEP_CHAT_CONFIG = {
        apiUrl: 'https://customer-api.com/chat',
        apiKey: 'customer-api-key',
        modelName: 'gpt-4',
        connectionType: 'websocket'
      };
    </script>
  </head>
  <body>
    <!-- Your existing content -->

    <!-- Chat Bubble Integration - Auto-initializes! -->
    <div id="chat-bubble-container"></div>

    <!-- Include from jsDelivr CDN -->
    <link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/@headwai/chat-bubble@latest/dist-widget/chat-bubble.css">
    <script src="https://cdn.jsdelivr.net/npm/@headwai/chat-bubble@latest/dist-widget/chat-bubble.js"></script>
  </body>
</html>
```
Alternative: Multiple chat bubbles with data attributes:
```html
<!-- Chat bubble with specific configuration -->
<div data-chat-bubble
     data-chat-bubble-api-url="https://api1.com/chat"
     data-chat-bubble-api-key="key1"></div>

<!-- Another chat bubble with different config -->
<div data-chat-bubble
     data-chat-bubble-api-url="https://api2.com/chat"
     data-chat-bubble-api-key="key2"></div>

<script src="https://cdn.jsdelivr.net/npm/@headwai/chat-bubble@latest/dist-widget/chat-bubble.js"></script>
```
What customers get:
- ✅ Complete chat interface
- ✅ All interceptors included
- ✅ OpenAI API compatibility built-in
- ✅ Just needs API configuration
- ✅ Auto-initialization on page load
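The data-attribute and `window.DEEP_CHAT_CONFIG` paths above can both feed the same initializer. The sketch below is only an illustration of one plausible precedence order (per-element data attributes over the global config object over build-time defaults); it is not the package's actual source:

```javascript
// Hypothetical config resolver for an auto-initializing widget.
// Assumed precedence: data attribute > window.DEEP_CHAT_CONFIG > default.
// Note: the browser maps data-chat-bubble-api-url → el.dataset.chatBubbleApiUrl.
function resolveConfig(el, globalConfig = {}, defaults = {}) {
  return {
    apiUrl: el.dataset.chatBubbleApiUrl || globalConfig.apiUrl || defaults.apiUrl,
    apiKey: el.dataset.chatBubbleApiKey || globalConfig.apiKey || defaults.apiKey,
  };
}

// A per-element data attribute wins; missing keys fall through to the global config.
const el = { dataset: { chatBubbleApiUrl: "https://api1.com/chat" } };
console.log(resolveConfig(el, { apiUrl: "https://fallback.example/chat", apiKey: "key1" }));
```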
Option 3: NPM Package Integration (Developer Integration)
For developers who need full control and customization.
Build and publish steps:
1. Build the library:
   ```bash
   npm run build:lib
   ```
2. Publish to NPM:
   ```bash
   npm publish
   ```
Customer installation:
```bash
# Install in customer's project
npm install @headwai/chat-bubble
```
For Svelte projects:
```svelte
<!-- Customer's Svelte component -->
<script>
  import { ChatBubble } from '@headwai/chat-bubble';

  // Customers MUST implement these interceptors themselves
  const requestInterceptor = (details) => {
    if (details.body && details.body.messages) {
      details.body.messages = details.body.messages.map(message => ({
        role: message.role === "ai" ? "assistant" : message.role,
        content: message.text || message.content
      }));
    }
    return details;
  };

  const responseInterceptor = (response) => {
    if (response.choices && response.choices[0] && response.choices[0].message) {
      const message = response.choices[0].message;
      return {
        text: message.content,
        role: message.role === "assistant" ? "ai" : message.role
      };
    }
    return response;
  };

  const connect = {
    type: "websocket",
    url: process.env.API_URL,
    headers: {
      Authorization: `Bearer ${process.env.API_KEY}`,
      "Content-Type": "application/json"
    },
    additionalBodyProps: {
      model: process.env.MODEL_NAME
    }
  };
</script>

<ChatBubble
  connect={connect}
  requestInterceptor={requestInterceptor}
  responseInterceptor={responseInterceptor}
/>
```
For vanilla JavaScript projects:
```html
<script type="module">
  import { createChatBubble } from 'https://cdn.jsdelivr.net/npm/@headwai/chat-bubble@latest/dist/index.js';

  // Initialize manually with custom configuration
  const chatBubble = createChatBubble(
    document.getElementById('chat-container'),
    {
      apiUrl: 'https://api.example.com/chat',
      apiKey: 'your-api-key',
      // ... other props
    }
  );
</script>
```
What customers need to implement:
- ❌ Request/Response interceptors (manual setup required)
- ❌ Environment variable handling
- ❌ API configuration logic
- ✅ Deep chat component only
Deployment Options Comparison
| Feature | Built App (Option 1) | Script Embed (Option 2) | NPM Package (Option 3) |
|---------|----------------------|--------------------------|-------------------------|
| Build Command | npm run build:widget | npm run build:widget + publish | npm run build:lib + publish |
| Interceptors Included | ✅ Yes, automatically | ✅ Yes, automatically | ❌ No, manual setup required |
| Environment Variables | ✅ Built-in support | ✅ Runtime configuration | ❌ Customer implements |
| OpenAI Compatibility | ✅ Ready to use | ✅ Ready to use | ❌ Customer implements |
| Setup Complexity | 🟢 Minimal (just config) | 🟢 Minimal (script tag) | 🟡 Moderate (coding required) |
| Customization | 🟡 Limited to env vars | 🟡 Limited to config object | 🟢 Full control |
| File Size | 🟡 Larger (complete app) | 🟡 Larger (complete app) | 🟢 Smaller (just component) |
| Use Case | Self-hosted deployment | CDN/jsDelivr embedding | Developers, custom integration |
| Auto-initialization | ✅ Yes | ✅ Yes | ❌ Manual setup |
Recommended Approach
For most customers: Use Option 2 (Script Embed via jsDelivr)
- ✅ No coding required
- ✅ All interceptors included
- ✅ Easy CDN access via jsDelivr
- ✅ Just configure via JavaScript object
- ✅ Ready-to-deploy solution
- ✅ Auto-initialization
For self-hosted deployments: Use Option 1 (Built Application)
- ✅ Complete control over hosting
- ✅ No external CDN dependencies
- ✅ All interceptors included
- ✅ Environment variable configuration
For developers who need customization: Use Option 3 (NPM Package)
- ✅ Full control over interceptors
- ✅ Custom API integrations
- ✅ Smaller bundle size
- ❌ More setup required
Publishing to NPM and jsDelivr Access
Publishing Steps
1. Build both library and widget:
   ```bash
   npm run build
   ```
2. Login to NPM (if not already logged in):
   ```bash
   npm login
   ```
3. Publish the package:
   ```bash
   npm publish
   ```
jsDelivr CDN Access
Once published to NPM, your package is automatically available on jsDelivr:
For Widget/Script Embed (Options 1 & 2):
```html
<!-- Latest version -->
<link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/@headwai/chat-bubble@latest/dist-widget/chat-bubble.css">
<script src="https://cdn.jsdelivr.net/npm/@headwai/chat-bubble@latest/dist-widget/chat-bubble.js"></script>

<!-- Specific version -->
<link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/@headwai/chat-bubble@1.0.0/dist-widget/chat-bubble.css">
<script src="https://cdn.jsdelivr.net/npm/@headwai/chat-bubble@1.0.0/dist-widget/chat-bubble.js"></script>
```
For Library/Module Import (Option 3):
```html
<!-- ES Module -->
<script type="module">
  import { ChatBubble, createChatBubble } from 'https://cdn.jsdelivr.net/npm/@headwai/chat-bubble@latest/dist/index.js';
</script>

<!-- UMD (Universal Module Definition) -->
<script src="https://cdn.jsdelivr.net/npm/@headwai/chat-bubble@latest/dist/index.umd.js"></script>
```
File Structure After Build
```
dist/                    # NPM package (Option 3)
├── index.js             # ES module
├── index.umd.js         # UMD module
└── style.css            # Component styles

dist-widget/             # Standalone widget (Options 1 & 2)
├── widget.html          # Demo page
├── chat-bubble.js       # Complete widget with auto-init
├── chat-bubble.css      # Widget styles
└── assets/              # Additional assets
```
Environment Variables Reference
| Variable | Description | Default | Required |
|----------|-------------|---------|----------|
| VITE_CHAT_BUBBLE_API_URL | The API endpoint URL | - | ✅ |
| VITE_CHAT_BUBBLE_API_KEY | API authentication key | - | ✅ |
| VITE_CHAT_BUBBLE_MODEL_NAME | AI model name to use | gpt-3.5-turbo | ✅ |
| VITE_CHAT_BUBBLE_CONNECTION_TYPE | Connection type (websocket, openAI, etc.) | websocket | ✅ |
| VITE_CHAT_BUBBLE_CONTENT_TYPE | HTTP Content-Type header | application/json | ❌ |
| VITE_CHAT_BUBBLE_MAX_MESSAGES | Maximum messages in conversation history | 0 (unlimited) | ❌ |
| VITE_CHAT_BUBBLE_PLACEHOLDER_TEXT | Chat input placeholder text | Type your message... | ❌ |
| VITE_CHAT_BUBBLE_DEMO_MODE | Enable demo mode | false | ❌ |
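As a concrete note on `VITE_CHAT_BUBBLE_MAX_MESSAGES`: the `parseInt(...) || 0` pattern used in the Svelte snippet above falls back to 0 (unlimited) whenever the variable is unset or non-numeric. Extracted as a standalone helper for illustration:

```javascript
// Same fallback logic as the requestBodyLimits setup shown earlier:
// parseInt(undefined) is NaN, and NaN || 0 yields 0 (unlimited).
const parseMaxMessages = (raw) => parseInt(raw) || 0;

console.log(parseMaxMessages(undefined)); // 0 (unlimited)
console.log(parseMaxMessages("25"));      // 25
```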
API Compatibility
This component is designed to work with OpenAI-compatible APIs. The request and response interceptors handle the message format transformation between deep-chat and OpenAI formats.
Supported Message Flow
- User Input → Deep Chat format
- Request Interceptor → Transforms to OpenAI format
- API Call → Your configured endpoint
- Response Interceptor → Transforms back to Deep Chat format
- Display → Rendered in chat interface
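The two transformation steps can be demonstrated in isolation. This is the same logic as the interceptors shown earlier, extracted into standalone helpers for clarity:

```javascript
// Step 2: Deep Chat history → OpenAI request messages
// (role "ai" becomes "assistant", "text" becomes "content").
const toOpenAI = (messages) =>
  messages.map((m) => ({
    role: m.role === "ai" ? "assistant" : m.role,
    content: m.text || m.content,
  }));

// Step 4: OpenAI response → Deep Chat message
// (first choice's content becomes "text", "assistant" becomes "ai").
const fromOpenAI = (response) => {
  const message = response.choices[0].message;
  return {
    text: message.content,
    role: message.role === "assistant" ? "ai" : message.role,
  };
};

console.log(toOpenAI([{ role: "user", text: "Hi" }, { role: "ai", text: "Hello!" }]));
console.log(fromOpenAI({ choices: [{ message: { role: "assistant", content: "Hello!" } }] }));
```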
Development
Scripts
- `npm run dev` - Start development server
- `npm run build` - Build both library and widget for production
- `npm run build:lib` - Build NPM package (Option 3)
- `npm run build:widget` - Build standalone widget (Options 1 & 2)
- `npm run build:app` - Build demo application
- `npm run preview` - Preview production build
Project Structure
```
chat-bubble/
├── src/
│   ├── App.svelte            # Main chat component
│   ├── main.js               # Development entry point
│   ├── lib.js                # NPM package entry point
│   └── widget.js             # Widget/standalone entry point
├── dist/                     # NPM package build output
├── dist-widget/              # Widget build output
├── package.json              # Dependencies and scripts
├── vite.config.js            # Default Vite configuration
├── vite.lib.config.js        # Library build configuration
├── vite.widget.config.js     # Widget build configuration
├── vite.app.config.js        # App build configuration
├── svelte.config.js          # Svelte configuration
├── index.html                # Development HTML
├── widget.html               # Widget HTML template
└── README.md                 # This file
```
Troubleshooting
Common Issues
- CORS Errors: Ensure your API endpoint allows requests from your domain
- Authentication Failures: Verify your API key is correct and has proper permissions
- WebSocket Connection Issues: Check if your API supports WebSocket connections
- Environment Variables Not Loading: Ensure variables are prefixed with `VITE_CHAT_BUBBLE_`
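For CORS errors specifically, the fix belongs on the API server rather than in the widget. A minimal sketch (the origin value and this helper are illustrative, not part of this package) of the response headers an OpenAI-compatible endpoint needs to send so browsers on the customer's domain can call it:

```javascript
// Headers the chat API must return for cross-origin browser requests.
// "https://customer-site.example" is a placeholder for the embedding site.
const corsHeaders = (origin) => ({
  "Access-Control-Allow-Origin": origin,
  "Access-Control-Allow-Methods": "POST, OPTIONS",
  "Access-Control-Allow-Headers": "Authorization, Content-Type",
});

console.log(corsHeaders("https://customer-site.example"));
```

The `Authorization` entry in `Access-Control-Allow-Headers` matters because the widget sends a `Bearer` token, which triggers a CORS preflight.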
Debug Mode
Enable debug mode by setting:
```bash
VITE_CHAT_BUBBLE_DEMO_MODE=true
```
This will enable demo features that can help test the component without a real API connection.
Support
For issues and questions:
- Create an issue in this repository
- Check the deep-chat documentation
- Review the API compatibility guide
