eloquent-chat-widget v1.0.13

A reusable embeddable chat widget built with React and TypeScript

🧠 Eloquent Chat Widget

A fully embeddable and customizable React chat widget powered by Ollama and Llama 3, designed for seamless integration into any web application. Built with TypeScript, styled according to the Eloquent AI design language (purple theme + logo), and published on NPM.

[Screenshots: chat widget in light and dark mode]


🚀 Demo and Installation

📦 Install via NPM:

npm install eloquent-chat-widget

or

yarn add eloquent-chat-widget



⚙️ Features

  • Position selector: bottom-right, bottom-left, top-right, or top-left.
  • Light and dark themes.
  • Custom title, subtitle, placeholder, and logo.
  • Maintenance mode — disables input and shows a maintenance state.
  • Error mode — if an error occurs, the UI allows the user to retry.
  • Local persistence — messages are stored locally using localStorage.
  • Works with Ollama + Llama 3 locally.
  • TypeScript support out of the box.
  • 100% standalone styling — doesn't conflict with the host website's styles.
  • No need for Tailwind in the host project — styles are embedded.

🎨 Customization

<ChatWidget
  position="bottom-right" // bottom-right | bottom-left | top-right | top-left
  theme="dark" // dark | light
  title="Eloquent Chat"
  placeholder="Ask me anything..."
  maintenanceMode={false} // true disables input (maintenance screen)
  logo="/path/to/your/logo.png" // (Optional) default is Eloquent AI logo
  onError={(error) => console.error(error)} // (Optional) error callback
/>

💻 Ollama + Llama3 Setup

This widget uses Ollama running Llama 3 locally.

👉 Install Ollama:

https://ollama.com/download

👉 Run the Llama3 model locally:

ollama pull llama3
ollama serve

➡️ The widget will connect to http://localhost:11434 by default.
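
To confirm the widget will be able to reach Ollama before embedding it, you can query the local API directly. The snippet below is only an illustrative check (it is not part of the package); it lists the models Ollama has installed:

// check-ollama.ts — illustrative only, not shipped with the widget.
// Confirms the local Ollama server is reachable and shows which models are installed.
async function checkOllama(): Promise<void> {
  const res = await fetch("http://localhost:11434/api/tags");
  if (!res.ok) {
    throw new Error(`Ollama is not reachable: ${res.status}`);
  }
  const { models } = await res.json();
  console.log(models.map((m: { name: string }) => m.name)); // e.g. ["llama3:latest"]
}

checkOllama();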


🔥 Architectural Decisions

  • Initially, everything was developed in a single component, but as complexity grew, it was refactored into smaller, reusable components for better readability and maintainability.
  • The business logic (message handling, API interaction, state management) grew large, so it was extracted into a dedicated custom hook (useChat) that encapsulates it cleanly; a sketch of what such a hook can look like follows this list.
  • The first attempt used the OpenAI API, but it requires a paid API key. After some research, Ollama + Llama 3 was chosen so models could run locally for free.
  • Addressed UX issues like auto-scroll to the bottom on new messages and the ability to clear user messages without losing AI context.
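
For illustration, here is a minimal sketch of what a hook like useChat can look like, wiring local persistence to the Ollama chat endpoint. The names, message shape, and storage key below are assumptions made for the example, not the package's actual internals:

import { useCallback, useEffect, useState } from "react";

// Hypothetical message shape for the example; the published hook may differ.
interface ChatMessage {
  role: "user" | "assistant";
  content: string;
}

const STORAGE_KEY = "eloquent-chat-messages"; // assumed key, for illustration only

export function useChat(endpoint = "http://localhost:11434/api/chat") {
  // Restore any previous conversation from localStorage on first render.
  const [messages, setMessages] = useState<ChatMessage[]>(() => {
    const saved = localStorage.getItem(STORAGE_KEY);
    return saved ? JSON.parse(saved) : [];
  });
  const [isLoading, setIsLoading] = useState(false);
  const [error, setError] = useState<Error | null>(null);

  // Persist the conversation so it survives page reloads.
  useEffect(() => {
    localStorage.setItem(STORAGE_KEY, JSON.stringify(messages));
  }, [messages]);

  const sendMessage = useCallback(
    async (content: string) => {
      const next: ChatMessage[] = [...messages, { role: "user", content }];
      setMessages(next);
      setIsLoading(true);
      setError(null);
      try {
        // Non-streaming call to Ollama's chat endpoint.
        const res = await fetch(endpoint, {
          method: "POST",
          headers: { "Content-Type": "application/json" },
          body: JSON.stringify({ model: "llama3", messages: next, stream: false }),
        });
        if (!res.ok) throw new Error(`Ollama returned ${res.status}`);
        const data = await res.json();
        setMessages([...next, { role: "assistant", content: data.message.content }]);
      } catch (err) {
        setError(err as Error); // the widget can surface this and offer a retry
      } finally {
        setIsLoading(false);
      }
    },
    [messages, endpoint]
  );

  // Clear the stored conversation (a simplification of the widget's own behavior).
  const clearMessages = () => setMessages([]);

  return { messages, sendMessage, clearMessages, isLoading, error };
}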

🚧 Challenges Faced

  • NPM packaging with Tailwind: By default, Tailwind needs to be configured in the consuming project. To avoid this, I embedded all the generated CSS directly into the component — meaning no Tailwind installation is required for users.
  • Build separation: Managing the CSS and JS build processes separately and ensuring the dist folder stayed clean without deleting generated assets — handled via tsup.config.ts (a rough sketch follows this list).
  • Component UX: Handling scroll behavior, retry mechanisms on error, and a clear, functional maintenance mode.
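
A rough sketch of what that tsup.config.ts can look like (the exact options here are assumptions, not the package's actual config):

// tsup.config.ts — a sketch under assumptions; the real config may differ.
import { defineConfig } from "tsup";

export default defineConfig({
  entry: ["src/index.ts"],
  format: ["cjs", "esm"],
  dts: true,
  // Let an npm script own the dist folder instead of tsup's clean step,
  // so the separately generated chat-widget.css is not wiped on JS rebuilds.
  clean: false,
  external: ["react", "react-dom"],
});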

✨ Interaction with Host Website Styles

To comply with the requirement "Consider how it will interact with the host website’s styles and scripts", the widget:

  • Ships with precompiled CSS scoped to the widget itself.
  • Doesn't require Tailwind or any external styles in the host application.
  • Fully encapsulated — styles won't leak in or out.

🚀 How to Build, Package, and Publish

✅ Clone the repo:

git clone https://github.com/MatheusCPimentel/eloquent-chat.git
cd eloquent-chat

✅ Install dependencies:

npm install

✅ Build the package:

npm run build

✅ Pack it locally to test before publishing to NPM:

npm pack

➡️ This generates a .tgz file which can be installed into any project:

npm install ./eloquent-chat-widget-1.0.0.tgz

🧑‍💻 Install & Use in Any HTML Page (React-based):

  1. Initialize a React app (Next.js, Vite, CRA — anything).
  2. Install the widget:
npm install eloquent-chat-widget
  3. Import and use:
import { ChatWidget } from "eloquent-chat-widget";
import "eloquent-chat-widget/dist/chat-widget.css"; // Import styles

export default function App() {
  return <ChatWidget />;
}
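
If the host app uses the Next.js App Router, note that the widget relies on browser-only APIs (component state, localStorage), so it should be mounted from a client component. A minimal sketch (the file path is just an example):

"use client"; // needed because the widget uses browser-only APIs

import { ChatWidget } from "eloquent-chat-widget";
import "eloquent-chat-widget/dist/chat-widget.css";

// app/chat-mount.tsx (example path): mounts the widget on the client.
export default function ChatMount() {
  return <ChatWidget theme="light" position="bottom-right" />;
}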

💡 Summary of Approach

  • ✅ Focused on creating a fully reusable widget that drops into any React-based app.
  • ✅ Refactored for clarity, maintainability, and clean separation of concerns (components + hooks + services).
  • ✅ Overcame NPM packaging challenges like Tailwind embedding, asset handling, and clean builds.
  • ✅ Chose Ollama + Llama3 as a free, local alternative to OpenAI.
  • ✅ Delivered a widget that’s simple to install, beautiful (Eloquent AI purple theme), and robust.

🧠 System Architecture

This diagram represents how the chat widget interacts with the local Ollama API (Llama 3). It covers the main states of the application, including loading, error handling, maintenance mode, and the user interaction flow.

[System architecture diagram]
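
As a rough way to think about those states in code (the type and variant names below are illustrative, not the package's actual types):

// Illustrative state model only; the widget's real internals may differ.
type WidgetState =
  | { kind: "idle" } // waiting for user input
  | { kind: "loading" } // a message was sent to Ollama and a reply is pending
  | { kind: "error"; error: Error } // the request failed; the UI offers a retry
  | { kind: "maintenance" }; // maintenanceMode={true}: input is disabled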