# BAWANA Chatbot Components
Embeddable AI chatbot widgets for the BAWANA learning ecosystem. Ship the same conversational experience across vanilla JavaScript, React, and Vue with a single, framework-agnostic core.
## ✨ Features

- Shared core (`ChatbotCore`) with adapter wrappers for Vanilla, React, and Vue
- Ready-to-ship UI with widget or full-page layouts
- OpenAI integration (Responses API) with configurable model, base URL, and sampling settings
- One-line theming: tweak the primary color and the widget, launcher, and animations all stay in sync
- Inline Markdown styling (bold/italic/code) with automatic sanitization
- Lightweight topic classifier keeps prompts on scope (company/course vs. cooking/travel)
- TypeScript definitions for all public APIs
- Packaged CSS bundle (`style.css`) for consistent styling
## 🧠 Architecture (Agents)

- ChatView: layout-agnostic core that renders messages, composer, and suggestions; emits events; never manages wrappers or ARIA.
- Layout shells: `PageLayout`, `WidgetLayout`, and `DropdownLayout` own their DOM, toggles, ARIA, and open/close state while mounting `ChatView`.
- Dropdown interactions: isolated outside-click, ESC, and ARIA syncing for the dropdown shell.
- DOM builders: per-layout structure builders with shared header/message/composer partials.

Playground to sanity-check all shells: `examples/layouts/index.html` (uses the source modules so you can tweak layouts rapidly).
## 🧪 Local Testing / Examples

- Layout playground: `npm run dev -- --host --port 4173`, then open `http://localhost:4173/examples/layouts/` to try page/widget/dropdown shells.
- Vanilla demo: `http://localhost:4173/examples/vanilla/`
- React demo: `http://localhost:4173/examples/react/`
- Vue demo: `http://localhost:4173/examples/vue/`

If you prefer using the built bundle, swap imports in an example from `../../src/...` to `../../dist/index.mjs` and `../../dist/style.css`.
## 📦 Installation

```bash
npm install @rizal_ncc/bawanachat
# or
yarn add @rizal_ncc/bawanachat
```

You must provide your own OpenAI API key at runtime.
## 🚀 Usage Cheat Sheet

Every adapter shares the same options. Import the CSS bundle once per app (`@rizal_ncc/bawanachat/style.css`) so the widget, dropdown, and launcher stay styled.

### Vanilla JavaScript

#### Floating widget launcher (default)

```html
<link rel="stylesheet" href="node_modules/@rizal_ncc/bawanachat/dist/style.css" />
<script type="module">
  import { initChatbot } from "@rizal_ncc/bawanachat";

  const chatbot = initChatbot({
    apiKey: "sk-...",
    layout: "widget",
    headerTitle: "BAWANA Assistant",
    headerDescription: "Tanyakan apa saja tentang solusi digital Netpolitan.",
    primaryColor: "#0ea5e9",
  });

  chatbot.open();
</script>
```

#### Inline dropdown panel
<div id="course-chat"></div>
<script type="module">
import { initChatbot } from "@rizal_ncc/bawanachat";
initChatbot({
apiKey: import.meta.env.VITE_OPENAI_API_KEY,
layout: "dropdown",
target: "#course-chat",
headerTitle: "Course Mentor",
headerDescription: "Online • siap bantu",
});
</script>React Adapter
```bash
npm install @rizal_ncc/bawanachat react react-dom
```

#### Floating widget (launcher + modal)
```jsx
import { BawanaChatbot } from "@rizal_ncc/bawanachat/react";
import "@rizal_ncc/bawanachat/style.css";

export function App() {
  return (
    <BawanaChatbot
      apiKey={import.meta.env.VITE_OPENAI_API_KEY}
      headerTitle="BAWANA Assistant"
      headerDescription="Tanyakan apa saja tentang solusi digital Netpolitan."
      suggestedMessages={[
        "Apa saja komponen BAWANA 3-in-1?",
        "Bagaimana LXP BAWANA memanfaatkan AI?",
      ]}
      primaryColor="#2563eb"
      primaryForeground="#f8fafc"
    />
  );
}
```

#### Inline widget/dropdown
```jsx
export function CoursePage() {
  return (
    <section>
      {/* ...course content... */}
      <BawanaChatbot
        floating={false}
        layout="dropdown"
        className="mt-6"
        apiKey={import.meta.env.VITE_OPENAI_API_KEY}
        headerTitle="Course Mentor"
        headerDescription="Online • siap bantu"
      />
    </section>
  );
}
```

Setting `floating={false}` tells the adapter to render inside the JSX tree instead of creating a global launcher. You can still choose between `layout="widget"`, `"dropdown"`, or `"page"` to control how the inline version looks.
### Vue 3 Adapter

```bash
npm install @rizal_ncc/bawanachat vue
```

```vue
<script setup>
import { BawanaChatbot } from "@rizal_ncc/bawanachat/vue";
import "@rizal_ncc/bawanachat/style.css";

const apiKey = import.meta.env.VITE_OPENAI_API_KEY;
</script>

<template>
  <BawanaChatbot
    :api-key="apiKey"
    layout="dropdown"
    header-title="Course Mentor"
    header-description="Online • siap bantu"
    primary-color="#9333ea"
    primary-foreground="#faf5ff"
  />
</template>
```

## ⚙️ Configuration
| Option | Type | Default | Description |
| --- | --- | --- | --- |
| apiKey | string | – | OpenAI API key (required) |
| model | string | gpt-4o-mini | Model to use for responses |
| baseUrl | string | https://api.openai.com/v1 | Override OpenAI endpoint |
| layout | "widget" \| "page" \| "dropdown" | widget | widget creates a floating bubble, page keeps the chat always visible inline, dropdown renders a collapsible course-friendly panel |
| headerTitle | string | – | Title displayed in chat header |
| headerDescription | string | – | Subtitle in header |
| placeholder | string | – | Input placeholder text |
| suggestedMessages | string[] | [] | Prompt suggestions displayed before first message |
| userInitials | string | "YOU" | Avatar initials for user bubbles |
| assistantInitials | string | "AI" | Avatar initials for assistant bubbles |
| contextMessage | boolean \| string | false | Enable bundled BAWANA company context (true) or supply a custom string |
| primaryColor | string | #1168bb | Primary accent used for the header, launcher, and motion effects |
| primaryForeground | string | auto | Foreground color on top of the primary accent (auto-calculated for contrast) |
| temperature | number | 0.6 | Sampling temperature |
| maxOutputTokens | number | 512 | Maximum tokens in the model response |
| generateResponse | (request) => Promise<{ content: string }> | Built-in callOpenAI wrapper | Override the network transport (e.g. route through your API or mock in tests) |
| contextProfile | string \| false | "company-intro" | Select predefined context/policy bundle (set to false to disable) |
| systemPrompt | string | – | Short system instruction prepended to conversation |
| systemMessages | { content: string }[] | [] | Additional system messages before user history |
| onMessage | (event) => void | – | Fired after assistant replies |
| onError | (error) => void | – | Fired when OpenAI call fails |
| onOpen | () => void | – | Triggered when the widget opens |
| onClose | () => void | – | Triggered when the widget closes |
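The lifecycle callbacks at the bottom of the table can be wired together like so; a minimal sketch (the exact payload shapes of `onMessage`/`onError` are not specified above, so they are only logged here):

```js
import { initChatbot } from "@rizal_ncc/bawanachat";

const chatbot = initChatbot({
  apiKey: "sk-...",
  onOpen: () => console.log("widget opened"),
  onClose: () => console.log("widget closed"),
  onMessage: (event) => {
    // Fired after the assistant replies – forward to analytics, persist history, etc.
    console.log("assistant replied", event);
  },
  onError: (error) => {
    // Fired when the OpenAI call fails.
    console.error("chat request failed", error);
  },
});
```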
React only: pass `floating={false}` to render the chat inline instead of the default floating widget.

Tip: If you need to customise the copy, `COMPANY_CONTEXT` is still exported so you can compose your own `systemMessages`.

`generateResponse` receives an object that includes `messages`, the active `AbortSignal`, and a copy of the resolved config, so you can talk to any LLM provider without touching the UI.

Need an inline course assistant? Pass `layout="dropdown"` (plus `floating={false}` in React) and the widget renders as a collapsible panel (closed by default). The outer title/description become the trigger label, while the inner chat header stays hidden so the panel doesn't feel double-stacked.
## 🧩 Context Profiles & Policies

Context profiles bundle prompts, company context, history limits, and light policy checks so each chatbot instance can focus on a specific surface (e.g., company introduction vs. course pages).

- `company-intro` (default): answers general questions about Netpolitan Group/BAWANA.
- `course-page`: keeps the conversation centred on one course/module and gently redirects off-topic queries.
Each profile can enforce topic allow-lists and custom error messages. For example, the default profile refuses unrelated requests ("please provide a noodle cooking tutorial") so the assistant stays focused on BAWANA, while short greetings or exploratory prompts are still allowed. Prompts run through a lightweight classifier that tags them as company, course, cooking, travel, etc., so the chatbot can decline out-of-scope topics before hitting OpenAI.
```js
import { classifyPrompt } from "@rizal_ncc/bawanachat";

console.log(classifyPrompt("Bagaimana cara membuat bandeng goreng?"));
// ["cooking"]
```

```js
import { initChatbot, listContextProfiles } from "@rizal_ncc/bawanachat";

console.log(listContextProfiles());

initChatbot({
  apiKey: "sk-...",
  contextProfile: "course-page",
  systemMessages: [
    { content: "Course slug: digital-leadership" },
  ],
});
```

Set `contextProfile: false` if you want to bypass the built-in bundles and manage prompts/policies yourself.
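If you go that route, the exported company copy can still be reused; a minimal sketch, assuming `COMPANY_CONTEXT` is exported as a plain string (as the tip in the configuration section implies):

```js
import { initChatbot, COMPANY_CONTEXT } from "@rizal_ncc/bawanachat";

initChatbot({
  apiKey: "sk-...",
  contextProfile: false, // skip the built-in prompt/policy bundles
  systemMessages: [
    { content: COMPANY_CONTEXT }, // reuse the bundled company copy (assumed to be a string)
    { content: "Only answer questions about enrolment and pricing." },
  ],
});
```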
## 🎨 Theming

Give the chatbot your brand colours by passing `primaryColor` (and optionally `primaryForeground`). The widget header, launcher button, card shadow, and toggle animations all re-use the same CSS variables, so a single change keeps everything consistent.
```js
initChatbot({
  apiKey: "...",
  primaryColor: "#1d4ed8",
  primaryForeground: "#f8fafc", // optional – picked automatically if omitted
});
```

If you need access to the resolved palette (e.g. to sync surrounding UI), call `chatbot.getConfig()` – it returns the current config including the computed `primaryColor`, `primaryForeground`, and `primaryRgb` values.
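For example, to mirror the computed accent on the surrounding page (the CSS custom property names below are only illustrative):

```js
import { initChatbot } from "@rizal_ncc/bawanachat";

const chatbot = initChatbot({ apiKey: "sk-...", primaryColor: "#1d4ed8" });

// Read the resolved palette back out of the instance.
const { primaryColor, primaryForeground } = chatbot.getConfig();
document.documentElement.style.setProperty("--page-accent", primaryColor);
document.documentElement.style.setProperty("--page-accent-foreground", primaryForeground);
```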
## 🧠 API Overview

### initChatbot(config: ChatbotConfig)
Creates a chatbot instance and mounts it into the provided container (or the body by default).
```js
const chatbot = initChatbot({ apiKey: "sk-..." });

chatbot.open();
chatbot.sendMessage("Halo, BAWANA!");
```

### new ChatbotCore(config)
Fine-grained control over lifecycle, events, and state.
```js
import { ChatbotCore } from "@rizal_ncc/bawanachat";

const chatbot = new ChatbotCore({ apiKey: "sk-..." });
chatbot.on("message", ({ response }) => console.log(response.content));
chatbot.init("chatbot-root");
```

### callOpenAI(options)
Direct access to OpenAI Responses API helper used internally.
```js
import { callOpenAI } from "@rizal_ncc/bawanachat";

await callOpenAI({
  apiKey: process.env.OPENAI_API_KEY,
  messages: [
    { role: "system", content: "You are BAWANA AI." },
    { role: "user", content: "Apa saja fitur LXP?" },
  ],
});
```

### Custom transports via generateResponse
Route traffic through your own backend, Azure OpenAI, or a mocked responder by passing `generateResponse` to any adapter or directly to `ChatbotCore`.
```js
const chatbot = new ChatbotCore({
  apiKey: "placeholder", // optional if your backend handles auth
  generateResponse: async ({ messages, signal }) => {
    const res = await fetch("/api/chat", {
      method: "POST",
      signal,
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ messages }),
    });
    const data = await res.json();
    return { content: data.reply, usage: data.usage };
  },
});
```

Because the transport is concentrated behind one function, local tests can inject a fake implementation that returns deterministic responses without hitting OpenAI.
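For instance, a Vitest spec could stub the transport and assert against the documented `message` event; a minimal sketch, assuming a DOM test environment (e.g. jsdom) is configured:

```js
import { expect, it, vi } from "vitest";
import { ChatbotCore } from "@rizal_ncc/bawanachat";

it("uses the injected transport instead of OpenAI", async () => {
  const fakeTransport = vi.fn(async () => ({ content: "stubbed reply" }));

  document.body.innerHTML = '<div id="chatbot-root"></div>';
  const chatbot = new ChatbotCore({
    apiKey: "test-key", // never sent anywhere – the fake transport answers everything
    generateResponse: fakeTransport,
  });
  chatbot.init("chatbot-root");

  // The "message" event carries the assistant response (see the API overview above).
  const reply = new Promise((resolve) =>
    chatbot.on("message", ({ response }) => resolve(response.content))
  );
  chatbot.sendMessage("Hello");

  expect(await reply).toBe("stubbed reply");
  expect(fakeTransport).toHaveBeenCalledOnce();
});
```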
### listContextProfiles() / getContextProfile(id)
Inspect and reuse the shipped policy bundles.
```js
import {
  listContextProfiles,
  getContextProfile,
} from "@rizal_ncc/bawanachat";

const profiles = listContextProfiles();
const courseProfile = getContextProfile("course-page");
```

### createDialogueEngine(options)
Build a profile-aware generator that you can pass into `ChatbotCore`'s `generateResponse` hook.

```js
import { createDialogueEngine, ChatbotCore } from "@rizal_ncc/bawanachat";

const generateResponse = createDialogueEngine({ profileId: "course-page" });
const chatbot = new ChatbotCore({
  apiKey: "sk-...",
  generateResponse,
});
```

## 🛠 Development
```bash
npm install
npm run dev      # Vite dev server
npm run build    # Build library + types
npm test         # Vitest smoke tests for ChatbotCore
npm run preview  # Preview bundle locally
```

Set the following environment variables for local demos:

```bash
cp .env.example .env

VITE_OPENAI_API_KEY="sk-..."
VITE_OPENAI_MODEL="gpt-4o-mini"
# Optional: VITE_OPENAI_BASE_URL="https://api.openai.com/v1"
```

## 📚 Examples
Example projects live in the `examples/` directory:

- `examples/react` – React + Vite demo with runtime profile picker.
- `examples/vue` – Vue 3 demo that toggles context profiles via `<select>` binding.
- `examples/vanilla` – Browser-ready HTML example mounting the widget inline.
Each framework example is a self-contained Vite project so you can run it without extra wiring:
```bash
npm run build   # from repo root – produces dist/* for the adapters
cd examples/react
npm install
npm run dev
```

```bash
npm run build   # from repo root – produces dist/* for the adapters
cd examples/vue
npm install
npm run dev
```

For the vanilla sample, open `examples/vanilla/index.html` in a browser (or serve it through `npx vite preview examples/vanilla`).
The React and Vue demos import compiled assets from `../../dist/*`, so always run `npm run build` in the repo root before starting those dev servers. They also read `.env` from the repository root (`envDir`), so set `VITE_OPENAI_API_KEY` there once and both examples will pick it up.
Use `npm link` or `npm pack` to validate integration locally:

```bash
npm run build
npm pack
npm link
```

## 🚀 Publishing Checklist
- [ ] Update `package.json` version
- [ ] `npm run build`
- [ ] `npm pack --dry-run` to inspect package contents
- [ ] Update `CHANGELOG.md`
- [ ] `npm publish --access public`
- [ ] `git tag vX.Y.Z && git push --tags`
## 🔐 Security & Quality

- Keep API keys out of version control
- Use GitHub Secrets for CI/CD tokens
- Run `npm audit fix` regularly
- Add automated lint/test workflows before production release
## 📄 License
Released under the MIT License.
