libre-webui
v0.3.2
Privacy-first AI chat interface. Self-hosted, open source, extensible.
Libre WebUI
Privacy-First AI Chat Interface
Website • Documentation • 𝕏 • Sponsor • Get Started
Why Libre WebUI?
A simple, self-hosted interface for AI chat. Run it locally with Ollama, connect to OpenAI, Anthropic, or 9+ providers—all from one UI.
- Your data stays yours — Zero telemetry, fully self-hosted
- Extensible plugin system — Ollama, OpenAI, Anthropic, and any OpenAI-compatible API
- Simple & focused — Keyboard shortcuts, dark mode, responsive design
Features
Core Experience
- Real-time streaming chat
- Dark/light themes
- VS Code-style keyboard shortcuts
- Mobile-responsive design
- Native Desktop App — macOS (Windows & Linux coming soon)
AI Providers
- Local: Ollama (full integration)
- Cloud: OpenAI, Anthropic, Google, Groq, Mistral, OpenRouter, and more
- Plugin System — Add any OpenAI-compatible API via JSON config
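As an illustration, a plugin definition for an OpenAI-compatible endpoint might look like the sketch below. The field names here are hypothetical, not the project's actual schema; check the Libre WebUI documentation for the real plugin format.

```json
{
  "id": "my-provider",
  "name": "My OpenAI-Compatible Provider",
  "baseUrl": "https://api.example.com/v1",
  "apiKeyEnv": "MY_PROVIDER_API_KEY",
  "models": ["model-small", "model-large"]
}
```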
Advanced Capabilities
- Document Chat (RAG) — Upload PDFs, chat with your docs
- Custom Personas — AI personalities with memory
- Interactive Artifacts — Live HTML, SVG, code preview
- Text-to-Speech — Multiple voices and providers
- SSO Authentication — GitHub, Hugging Face OAuth
Security
- AES-256-GCM encryption
- Role-based access control
- Enterprise compliance ready
Quick Start
Requirements: Ollama (for local AI) or API keys for cloud providers
One Command Install
```bash
npx libre-webui
```

That's it. Opens at http://localhost:8080
Docker
| Setup | Command |
| --- | --- |
| Bundled Ollama (CPU) | `docker-compose up -d` |
| Bundled Ollama (NVIDIA GPU) | `docker-compose -f docker-compose.gpu.yml up -d` |
| External Ollama (already running on host) | `docker-compose -f docker-compose.external-ollama.yml up -d` |
Access at http://localhost:8080
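If port 8080 is already taken on your host, one option is a Compose override file to remap it. This is a sketch, not part of the official setup: the service name `libre-webui` is an assumption, so verify it against the service name in the repository's docker-compose.yml.

```yaml
# docker-compose.override.yml (picked up automatically by docker-compose)
services:
  libre-webui:          # assumed service name; check docker-compose.yml
    ports:
      - "9090:8080"     # expose container port 8080 on host port 9090
```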
Warning: Development builds are automatically generated from the `dev` branch and may contain experimental features, breaking changes, or bugs. Use at your own risk and do not use in production environments.
| Setup | Command |
| --- | --- |
| Dev + Bundled Ollama (CPU) | `docker-compose -f docker-compose.dev.yml up -d` |
| Dev + Bundled Ollama (NVIDIA GPU) | `docker-compose -f docker-compose.dev.gpu.yml up -d` |
| Dev + External Ollama | `docker-compose -f docker-compose.dev.external-ollama.yml up -d` |
Development builds use separate data volumes (`libre_webui_dev_data`) to prevent conflicts with stable installations.
To pull the latest dev image manually:
```bash
docker pull librewebui/libre-webui:dev
```

Kubernetes (Helm)
```bash
helm install libre-webui oci://ghcr.io/libre-webui/charts/libre-webui

# With external Ollama
helm install libre-webui oci://ghcr.io/libre-webui/charts/libre-webui \
  --set ollama.bundled.enabled=false \
  --set ollama.external.enabled=true \
  --set ollama.external.url=http://my-ollama:11434

# With NVIDIA GPU support
helm install libre-webui oci://ghcr.io/libre-webui/charts/libre-webui \
  --set ollama.bundled.gpu.enabled=true

# With Ingress
helm install libre-webui oci://ghcr.io/libre-webui/charts/libre-webui \
  --set ingress.enabled=true \
  --set ingress.hosts[0].host=chat.example.com
```

See helm/libre-webui/values.yaml for all configuration options.
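For repeatable deployments, the `--set` flags above can be collected into a values file instead. The keys below mirror those flags exactly; anything beyond them is an assumption, so consult helm/libre-webui/values.yaml for the full set of options.

```yaml
# my-values.yaml
# Install with:
#   helm install libre-webui oci://ghcr.io/libre-webui/charts/libre-webui -f my-values.yaml
ollama:
  bundled:
    enabled: false
  external:
    enabled: true
    url: http://my-ollama:11434
ingress:
  enabled: true
  hosts:
    - host: chat.example.com
```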
Development Setup
```bash
# 1. Clone the repo
git clone https://github.com/libre-webui/libre-webui
cd libre-webui

# 2. Configure environment
cp backend/.env.example backend/.env

# 3. Install and run
npm install && npm run dev
```

Configuration
Edit backend/.env to add your API keys:
```env
# Local AI (Ollama)
OLLAMA_BASE_URL=http://localhost:11434

# Cloud AI Providers (add the ones you need)
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=sk-ant-...
```

Desktop App (In Development)
Note: The desktop app is currently in active development. The macOS build is pending Apple notarization, which may cause security warnings or installation issues on some systems. We're working to resolve this. Feedback and bug reports are welcome!
Download the native desktop app from GitHub Releases:
| Platform | Status |
| --------------------- | ----------------------- |
| macOS (Apple Silicon) | Beta (.dmg or .zip) |
| Windows | Coming soon |
| Linux | Coming soon |
Enterprise Services
Need a custom deployment? Kroonen AI provides professional services for Libre WebUI deployments.
| Service | Use Case |
| --- | --- |
| On-premise & cloud deployment | HIPAA, SOC 2, air-gapped environments |
| SSO integration | Okta, Azure AD, SAML, LDAP |
| Custom development | Integrations, white-labeling, plugins |
| SLA-backed support | Priority response, dedicated channel |
Contact: [email protected] | Learn more →
Support Development
Libre WebUI is built and maintained independently. Your support keeps it free and open source.
Become a Sponsor — Help fund active development
Community
- Ethical Charter — Our commitment to privacy, freedom & transparency
- Contributing — Help improve Libre WebUI
- 𝕏 @librewebui — Follow for updates
- Mastodon — Fediverse updates
- GitHub Issues — Bug reports & feature requests
- Documentation — Guides & API reference
Apache 2.0 License • Copyright © 2025–present Libre WebUI™
Built & maintained by Kroonen AI • Enterprise Support
