opengauge v0.1.6
OpenGauge
A local-first, token-efficient LLM chat interface.
If you just want to use it, you only need one command:
npx opengauge

What this repo does
OpenGauge runs a local web chat app with:
- Token optimization (compression + deduplication + checkpoints)
- Context retrieval from conversation history (RAG-style)
- Multiple providers: Anthropic, OpenAI, Gemini, Ollama
- Local storage in SQLite at ~/.opengauge/opengauge.db
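To give a feel for the deduplication idea in the list above, here is a minimal sketch of dropping exact-duplicate messages before re-sending conversation context to a provider. The names and approach are illustrative assumptions, not OpenGauge's actual internals:

```typescript
import { createHash } from "node:crypto";

interface Message {
  role: string;
  content: string;
}

// Illustrative sketch: skip messages whose (role, content) pair has
// already been seen, so repeated context costs no extra tokens.
// This is NOT OpenGauge's real implementation, just the basic idea.
function dedupeMessages(history: Message[]): Message[] {
  const seen = new Set<string>();
  const out: Message[] = [];
  for (const m of history) {
    const key = createHash("sha256")
      .update(`${m.role}\n${m.content}`)
      .digest("hex");
    if (!seen.has(key)) {
      seen.add(key);
      out.push(m);
    }
  }
  return out;
}
```

In practice a real implementation would also compress near-duplicates and snapshot long histories into checkpoints, but exact-match dedup is the simplest building block.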
Use via npx (recommended)
npx opengauge

You do not need to install OpenGauge globally first.
npx will download the package (if not already cached) and run it.
On first run, it may take a little longer while it fetches the package.
This starts a local server and opens the app in your browser.
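If you launch OpenGauge from a script rather than by hand, you may want to probe for a free port before passing it via --port. A sketch using Node's net module; the helper name and retry strategy are illustrative assumptions, not part of OpenGauge:

```typescript
import net from "node:net";

// Illustrative helper (not part of OpenGauge): return the first port
// at or above `start` that we can actually bind to.
function findFreePort(start: number): Promise<number> {
  return new Promise((resolve, reject) => {
    const srv = net.createServer();
    srv.once("error", (err: NodeJS.ErrnoException) => {
      // Port taken: try the next one; any other error is fatal.
      if (err.code === "EADDRINUSE") resolve(findFreePort(start + 1));
      else reject(err);
    });
    srv.listen(start, () => {
      const port = (srv.address() as net.AddressInfo).port;
      srv.close(() => resolve(port));
    });
  });
}
```

Note there is an inherent race: another process could grab the port between the probe and the real launch, so treat this as a convenience, not a guarantee.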
Default URL:
http://localhost:3000

If port 3000 is busy:

npx opengauge --port 3001

First-time setup
When the app opens, configure your provider in the UI wizard, or create:
~/.opengauge/config.yml
Example:
providers:
  anthropic:
    api_key: YOUR_API_KEY
    default_model: claude-opus-4-6
defaults:
  provider: anthropic

Developer setup (from source)
git clone https://github.com/applytorque/opengauge.git
cd opengauge
npm install
npm run build
npm start

Useful commands
npm run build # Compile TypeScript + copy UI assets
npm start # Run CLI entry locally
npm pack --dry-run # Preview npm package contents

License
MIT
