eloquent-chat-widget v1.0.13 — A reusable embeddable chat widget built with React and TypeScript
# 🧠 Eloquent Chat Widget
A fully embeddable and customizable React chat widget powered by Ollama and Llama 3, designed for seamless integration into any web application. Built with TypeScript, styled according to the Eloquent AI design language (purple theme + logo), and published on NPM.


## 🚀 Demo and Installation

📦 Install via NPM:

```bash
npm install eloquent-chat-widget
```

or

```bash
yarn add eloquent-chat-widget
```

💡 Also available at:

- NPM: https://www.npmjs.com/package/eloquent-chat-widget
- GitHub: https://github.com/MatheusCPimentel/eloquent-chat
## ⚙️ Features
- ✅ Position selector: bottom-right, bottom-left, top-right, or top-left.
- ✅ Light and dark theme.
- ✅ Custom title, subtitle, placeholder, and logo.
- ✅ Maintenance mode — disables input and shows a maintenance state.
- ✅ Error mode — if an error occurs, the UI allows the user to retry.
- ✅ Local persistence — messages are stored locally using `localStorage`.
- ✅ Works with Ollama + Llama 3 locally.
- ✅ TypeScript support out of the box.
- ✅ 100% standalone styling — doesn't conflict with the host website's styles.
- ✅ No need for Tailwind in the host project — styles are embedded.
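The local-persistence feature boils down to serializing the message list into `localStorage` and tolerating corrupted or missing data on load. A minimal sketch of that idea — the helper names and storage key below are illustrative assumptions, not the widget's actual internals:

```typescript
// Illustrative persistence helpers; names and key are assumptions.
interface StoredMessage {
  role: "user" | "assistant";
  content: string;
}

const STORAGE_KEY = "eloquent-chat-history"; // hypothetical key

function serializeMessages(messages: StoredMessage[]): string {
  return JSON.stringify(messages);
}

function parseMessages(raw: string | null): StoredMessage[] {
  if (!raw) return []; // nothing stored yet
  try {
    const parsed = JSON.parse(raw);
    return Array.isArray(parsed) ? parsed : [];
  } catch {
    return []; // corrupted storage: start with a fresh history
  }
}

// In the browser, the widget could then do:
// localStorage.setItem(STORAGE_KEY, serializeMessages(messages));
// const history = parseMessages(localStorage.getItem(STORAGE_KEY));
```

Guarding the parse step matters because `localStorage` contents can be edited or truncated by the user at any time.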
## 🎨 Customization

```tsx
<ChatWidget
  position="bottom-right" // bottom-right | bottom-left | top-right | top-left
  theme="dark" // dark | light
  title="Eloquent Chat"
  placeholder="Ask me anything..."
  maintenanceMode={false} // true disables input (maintenance screen)
  logo="/path/to/your/logo.png" // (Optional) default is the Eloquent AI logo
  onError={(error) => console.error(error)} // (Optional) error callback
/>
```

## 💻 Ollama + Llama 3 Setup
This widget uses Ollama running Llama 3 locally.
👉 Install Ollama: https://ollama.com/download

👉 Run the Llama 3 model locally:

```bash
ollama pull llama3
ollama serve
```

➡️ The widget will connect to http://localhost:11434 by default.
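For reference, Ollama exposes a local REST API, and a chat request to it has a small, predictable shape (model name, message list, streaming flag). The sketch below builds such a request against Ollama's documented `/api/chat` endpoint; the helper name is illustrative, and the widget's actual internals may differ:

```typescript
// Illustrative request builder for Ollama's /api/chat endpoint.
// The function name is an assumption; the endpoint and field names
// (model, messages, stream) follow Ollama's REST API.
interface ChatMessage {
  role: "user" | "assistant" | "system";
  content: string;
}

const OLLAMA_URL = "http://localhost:11434"; // widget default

function buildChatRequest(messages: ChatMessage[], model = "llama3") {
  return {
    url: `${OLLAMA_URL}/api/chat`,
    body: { model, messages, stream: false },
  };
}

// Sending it (requires `ollama serve` to be running):
// const { url, body } = buildChatRequest([{ role: "user", content: "Hi!" }]);
// const res = await fetch(url, { method: "POST", body: JSON.stringify(body) });
```

With `stream: false` the server replies with a single JSON object; streaming responses arrive as newline-delimited JSON chunks instead.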
## 🔥 Architectural Decisions
- Initially, everything was developed in a single component, but as complexity grew, it was refactored into smaller, reusable components for better readability and maintainability.
- The business logic (message handling, API interaction, state management) grew large, leading to the creation of a dedicated custom hook (`useChat`) to encapsulate it cleanly.
- The first attempt used the OpenAI API, but it requires a paid API key. After research, Ollama + Llama 3 was selected to run models locally for free.
- Addressed UX issues like auto-scroll to the bottom on new messages and the ability to clear user messages without losing AI context.
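Auto-scroll logic like the one mentioned above is typically gated on whether the user is already near the bottom of the message list, so that scrolling up to re-read older messages isn't interrupted. A minimal sketch of that check — the helper name and threshold are illustrative assumptions, not repo code:

```typescript
// Illustrative helper: decide whether the widget should auto-scroll when a
// new message arrives. `threshold` is how many pixels from the bottom still
// count as "at the bottom".
function isNearBottom(
  scrollTop: number,
  clientHeight: number,
  scrollHeight: number,
  threshold = 40
): boolean {
  return scrollHeight - scrollTop - clientHeight <= threshold;
}

// In a React effect, the message container could then do something like:
// if (isNearBottom(el.scrollTop, el.clientHeight, el.scrollHeight)) {
//   el.scrollTo({ top: el.scrollHeight, behavior: "smooth" });
// }
```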
## 🚧 Challenges Faced
- NPM packaging with Tailwind: By default, Tailwind needs to be configured in the consuming project. To avoid this, I embedded all the generated CSS directly into the component — meaning no Tailwind installation is required for users.
- Build separation: managing the CSS and JS build processes separately and keeping the dist folder clean without deleting generated assets — handled via `tsup.config.ts`.
- Component UX: handling scroll behavior, retry mechanisms on error, and a clear but functional maintenance mode.
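For orientation, a `tsup.config.ts` along these lines could express that setup. This is a hedged sketch of common tsup options, not the repository's actual config, which may differ:

```typescript
// Sketch of a possible tsup.config.ts; the repo's real config may differ.
import { defineConfig } from "tsup";

export default defineConfig({
  entry: ["src/index.ts"],
  format: ["esm", "cjs"], // ship both module formats
  dts: true,              // emit TypeScript declarations
  clean: false,           // don't wipe dist/; the CSS build writes there separately
  external: ["react", "react-dom"], // provided by the host app, not bundled
});
```

Marking `react` and `react-dom` as external is what keeps the bundle small and avoids a duplicate React instance in the consuming app.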
## ✨ Interaction with Host Website Styles
To comply with the requirement "Consider how it will interact with the host website’s styles and scripts", the widget:
- Ships with precompiled CSS scoped to the widget itself.
- Doesn't require Tailwind or any external styles in the host application.
- Fully encapsulated — styles won't leak in or out.
## 🚀 How to Build, Package, and Publish
✅ Clone the repo:

```bash
git clone https://github.com/MatheusCPimentel/eloquent-chat.git
cd eloquent-chat
```

✅ Install dependencies:

```bash
npm install
```

✅ Build the package:

```bash
npm run build
```

✅ Pack the package locally for testing:

```bash
npm pack
```

➡️ This generates a .tgz file which can be installed into any project:

```bash
npm install ./eloquent-chat-widget-1.0.0.tgz
```

## 🧑‍💻 Install & Use in Any HTML Page (React-based)
- Initialize a React app (Next.js, Vite, CRA — anything).
- Install the widget:

```bash
npm install eloquent-chat-widget
```

- Import and use:

```tsx
import { ChatWidget } from "eloquent-chat-widget";
import "eloquent-chat-widget/dist/chat-widget.css"; // Import styles

export default function App() {
  return <ChatWidget />;
}
```

## 💡 Summary of Approach
- ✅ Focused on creating a fully reusable, framework-agnostic widget.
- ✅ Refactored for clarity, maintainability, and clean separation of concerns (components + hooks + services).
- ✅ Overcame NPM packaging challenges like Tailwind embedding, asset handling, and clean builds.
- ✅ Chose Ollama + Llama 3 as a free, local alternative to the OpenAI API.
- ✅ Delivered a widget that’s simple to install, beautiful (Eloquent AI purple theme), and robust.
## 🧠 System Architecture
This diagram represents how the chat widget interacts with the local Ollama API (Llama 3). It covers the main states of the application, including loading, error handling, maintenance mode, and the user interaction flow.

