
libre-webui

v0.3.2


Privacy-first AI chat interface. Self-hosted, open source, extensible.


Libre WebUI

Privacy-First AI Chat Interface

Website · Documentation · 𝕏 · Sponsor · Get Started


Why Libre WebUI?

A simple, self-hosted interface for AI chat. Run it locally with Ollama, or connect to OpenAI, Anthropic, and 9+ other providers, all from one UI.

  • Your data stays yours — Zero telemetry, fully self-hosted
  • Extensible plugin system — Ollama, OpenAI, Anthropic, and any OpenAI-compatible API
  • Simple & focused — Keyboard shortcuts, dark mode, responsive design

Features

Core Experience

  • Real-time streaming chat
  • Dark/light themes
  • VS Code-style keyboard shortcuts
  • Mobile-responsive design
  • Native Desktop App — macOS (Windows & Linux coming soon)

AI Providers

  • Local: Ollama (full integration)
  • Cloud: OpenAI, Anthropic, Google, Groq, Mistral, OpenRouter, and more
  • Plugin System — Add any OpenAI-compatible API via JSON config
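As a rough illustration of the JSON-config approach, a provider plugin definition might look like the sketch below. Every field name here is hypothetical; the actual schema is defined by the project's plugin documentation:

```json
{
  "id": "my-provider",
  "name": "My Provider",
  "baseUrl": "https://api.example.com/v1",
  "apiKeyEnv": "MY_PROVIDER_API_KEY",
  "models": ["model-a", "model-b"]
}
```

The idea is that any OpenAI-compatible endpoint can be described declaratively (base URL, credential source, model list) without code changes.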

Advanced Capabilities

  • Document Chat (RAG) — Upload PDFs, chat with your docs
  • Custom Personas — AI personalities with memory
  • Interactive Artifacts — Live HTML, SVG, code preview
  • Text-to-Speech — Multiple voices and providers
  • SSO Authentication — GitHub, Hugging Face OAuth

Security

  • AES-256-GCM encryption
  • Role-based access control
  • Enterprise compliance ready

Quick Start

Requirements: Ollama (for local AI) or API keys for cloud providers

One Command Install

npx libre-webui

That's it. Opens at http://localhost:8080

Docker

| Setup | Command |
| --- | --- |
| Bundled Ollama (CPU) | `docker-compose up -d` |
| Bundled Ollama (NVIDIA GPU) | `docker-compose -f docker-compose.gpu.yml up -d` |
| External Ollama (already running on host) | `docker-compose -f docker-compose.external-ollama.yml up -d` |

Access at http://localhost:8080

Development Builds

Warning: Development builds are automatically generated from the dev branch and may contain experimental features, breaking changes, or bugs. Use at your own risk and do not use in production environments.

| Setup | Command |
| --- | --- |
| Dev + Bundled Ollama (CPU) | `docker-compose -f docker-compose.dev.yml up -d` |
| Dev + Bundled Ollama (NVIDIA GPU) | `docker-compose -f docker-compose.dev.gpu.yml up -d` |
| Dev + External Ollama | `docker-compose -f docker-compose.dev.external-ollama.yml up -d` |

Development builds use separate data volumes (libre_webui_dev_data) to prevent conflicts with stable installations.

To pull the latest dev image manually:

docker pull librewebui/libre-webui:dev

Kubernetes (Helm)

helm install libre-webui oci://ghcr.io/libre-webui/charts/libre-webui
# With external Ollama
helm install libre-webui oci://ghcr.io/libre-webui/charts/libre-webui \
  --set ollama.bundled.enabled=false \
  --set ollama.external.enabled=true \
  --set ollama.external.url=http://my-ollama:11434

# With NVIDIA GPU support
helm install libre-webui oci://ghcr.io/libre-webui/charts/libre-webui \
  --set ollama.bundled.gpu.enabled=true

# With Ingress
helm install libre-webui oci://ghcr.io/libre-webui/charts/libre-webui \
  --set ingress.enabled=true \
  --set ingress.hosts[0].host=chat.example.com

See helm/libre-webui/values.yaml for all configuration options.
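Instead of repeating `--set` flags, the same options can live in a values file. This sketch only mirrors the keys used in the commands above (external Ollama plus Ingress); check `helm/libre-webui/values.yaml` for the authoritative defaults:

```yaml
# my-values.yaml — mirrors the --set flags shown above
ollama:
  bundled:
    enabled: false
  external:
    enabled: true
    url: http://my-ollama:11434
ingress:
  enabled: true
  hosts:
    - host: chat.example.com
```

Then install with `helm install libre-webui oci://ghcr.io/libre-webui/charts/libre-webui -f my-values.yaml`.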

Development Setup

# 1. Clone the repo
git clone https://github.com/libre-webui/libre-webui
cd libre-webui

# 2. Configure environment
cp backend/.env.example backend/.env

# 3. Install and run
npm install && npm run dev

Configuration

Edit backend/.env to add your API keys:

# Local AI (Ollama)
OLLAMA_BASE_URL=http://localhost:11434

# Cloud AI Providers (add the ones you need)
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=sk-ant-...
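Before starting the app, you can sanity-check that Ollama is reachable at the configured URL. This assumes the default `OLLAMA_BASE_URL` from above; `/api/tags` is Ollama's standard model-listing endpoint:

```shell
# Verify the local Ollama server responds (override OLLAMA_BASE_URL if yours differs)
OLLAMA_BASE_URL="${OLLAMA_BASE_URL:-http://localhost:11434}"
if curl -fsS "$OLLAMA_BASE_URL/api/tags" >/dev/null 2>&1; then
  echo "Ollama reachable at $OLLAMA_BASE_URL"
else
  echo "Ollama not reachable at $OLLAMA_BASE_URL"
fi
```

If the check fails, start Ollama first, then re-run `npm run dev`.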

Desktop App (In Development)

Note: The desktop app is currently in active development. The macOS build is pending Apple notarization, which may cause security warnings or installation issues on some systems. We're working to resolve this. Feedback and bug reports are welcome!

Download the native desktop app from GitHub Releases:

| Platform | Status |
| --- | --- |
| macOS (Apple Silicon) | Beta (`.dmg` or `.zip`) |
| Windows | Coming soon |
| Linux | Coming soon |



Enterprise Services

Need a custom deployment? Kroonen AI provides professional services for Libre WebUI deployments.

| Service | Use Case |
| --- | --- |
| On-premise & cloud deployment | HIPAA, SOC 2, air-gapped environments |
| SSO integration | Okta, Azure AD, SAML, LDAP |
| Custom development | Integrations, white-labeling, plugins |
| SLA-backed support | Priority response, dedicated channel |

Contact: [email protected] | Learn more →


Support Development

Libre WebUI is built and maintained independently. Your support keeps it free and open source.

Sponsor

Become a Sponsor — Help fund active development


Community


Apache 2.0 License • Copyright © 2025–present Libre WebUI™

Built & maintained by Kroonen AI · Enterprise Support