@mooncompany/uplink-chat
v0.37.2
Local-first AI chat with voice — talk to any LLM from your own machine
Uplink
The chat interface for OpenClaw. Self-hosted, encrypted, with voice — access your AI from any device.
Uplink is a lightweight Node.js chat client built for OpenClaw. It gives you a private, encrypted interface to talk to any LLM that OpenClaw supports — OpenAI, Anthropic, Ollama, local models, or anything OpenAI-compatible. Add real-time voice chat. Access from your phone, tablet, or any browser on your network.
No accounts. No cloud. No data collection. Your conversations stay on your hardware.
How It Works
Uplink connects to your OpenClaw gateway, which handles AI provider routing, model selection, and agent orchestration. You bring your own API keys, OpenClaw manages the connections, and Uplink gives you the UI.
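As a sketch of that flow, assuming the gateway exposes an OpenAI-compatible chat-completions route (the URL, path, and token below are placeholders, not confirmed endpoints):

```shell
# Hypothetical request straight to the gateway — Uplink sends requests like this for you.
# GATEWAY_URL and GATEWAY_TOKEN come from your OpenClaw setup; the /v1 path is assumed.
curl -s "$GATEWAY_URL/v1/chat/completions" \
  -H "Authorization: Bearer $GATEWAY_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"model": "your-model", "messages": [{"role": "user", "content": "Hello"}]}'
```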
Browser → Uplink (localhost) → OpenClaw Gateway → AI Providers
Features
- Any LLM — Whatever OpenClaw supports: OpenAI, Anthropic, Ollama, LM Studio, xAI, DeepSeek, and more
- Voice Chat — Real-time text-to-speech and speech-to-text with multiple engine options
- Encrypted Storage — Optional password protection for your chat history
- PWA — Install as an app on desktop or mobile
- Satellites — Connect multiple AI providers and switch between them on the fly
- Agent Management — Create, configure, and route between OpenClaw agents directly from the UI
- Agent-to-Agent Communication — Agents can message each other through satellite connections
- Dashboard — At-a-glance view of conversations, agents, and provider status
- Split View — Multiple conversations or panels side by side
- File Uploads — Images, audio, video, PDFs, DOCX, Excel — attach anything to messages
- 5 Themes — Midnight (default), Daylight, Ember, Forest, Noir
- Keyboard Shortcuts — Full keyboard navigation
- Mobile Ready — Responsive UI, works great on phones via local network
- Structured Logging — LOG_LEVEL support (debug, info, warn, error, silent)
Prerequisites
- OpenClaw installed and running
- Chat completions endpoint enabled in your OpenClaw config:
openclaw config set gateway.http.endpoints.chatCompletions.enabled true
- Node.js 18+
- A browser (Chrome/Edge recommended for voice)
Getting Started
- Install and start Uplink
- Enter your OpenClaw gateway URL and token during onboarding
- Start chatting
All settings — providers, voice, themes, encryption — are configured through the web UI.
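A minimal terminal sketch of those steps, assuming a global npm install exposes the `uplink-chat` binary (package name taken from this page; 3456 is the default port):

```shell
npm install -g @mooncompany/uplink-chat   # install the CLI globally
uplink-chat                               # start the server
# then open http://localhost:3456 in a browser and finish onboarding
```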
Voice Engines
Text-to-Speech
| Engine | Cost | Setup |
|--------|------|-------|
| Edge TTS | Free | Works out of the box |
| OpenAI TTS | Paid | API key via OpenClaw |
| Coqui XTTS | Free | Local GPU server |
| ElevenLabs | Paid | API key |
Speech-to-Text
| Engine | Cost | Setup |
|--------|------|-------|
| Browser STT | Free | Built into your browser |
| Faster-Whisper | Free | Local server |
| Groq Whisper | Free tier | API key via OpenClaw |
| OpenAI Whisper | Paid | API key via OpenClaw |
CLI
uplink-chat # Start server
uplink-chat start # Start server
uplink-chat start -d # Start in background
uplink-chat stop # Stop background server
uplink-chat status # Check if running
uplink-chat --port 8080 # Custom port
Remote Access
Uplink runs on localhost:3456 by default. To access from other devices:
- Same network — Use your computer's local IP (e.g. http://192.168.1.100:3456)
- Anywhere — Tailscale for a private encrypted tunnel (recommended)
- Advanced — Cloudflare Tunnel or your own reverse proxy
Microphone access requires HTTPS or localhost. Tailscale provides this automatically.
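For the same-network option above, you need your machine's LAN address for the URL. One quick way to print it (a sketch; tools and interface names vary by OS):

```shell
# Print this machine's LAN IPv4 address (Linux).
# On macOS, try: ipconfig getifaddr en0
hostname -I | awk '{print $1}'
```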
Troubleshooting
| Problem | Fix |
|---------|-----|
| No AI response | Enable chatCompletions endpoint in OpenClaw config |
| 401 errors | Check gateway token matches in both Uplink and OpenClaw |
| Mic not working on mobile | Access via HTTPS (Tailscale) or localhost |
| Voice not working | Check voice settings — Edge TTS works with no setup |
| WebSocket 1006 via Tailscale + WSL | Install Tailscale inside WSL, or run Uplink on Windows directly |
License
Proprietary — Moon Company LLC
