
boten-gemma

v0.2.1

The AI assistant that asks before it acts. Self-hosted, multi-channel, confirmation-first.


Quick Start

npm install -g boten-gemma
gemma

That's it. On first run, an interactive setup wizard walks you through authentication, channels, and workspace configuration. Then open localhost:18789.

Or try it without installing:

npx boten-gemma

Why Gemma?

AI agents are powerful, but most give your LLM unrestricted access to your machine. One hallucinated rm -rf and you're in trouble. And when something does go wrong, you're staring at a chat log trying to figure out what happened.

Gemma is built around two ideas:

  1. The Confirmation Gate — Every file write, shell command, and API call goes through a gate. You see exactly what the AI wants to do, and you decide whether it happens. This isn't a policy check — the gate is the only code path to tool execution, enforced by module architecture.

  2. Extensible Panels — Build custom React or HTML panels that live inside Gemma's UI. Dashboards, data visualizers, control surfaces — anything you need to work alongside your AI visually. Gemma can even build them for you on the fly.

You (Telegram / Web / CLI)
  │
  ▼
┌──────────────────────────────────────┐
│  Gateway                             │
│                                      │
│  LLM ──► Agent Loop                  │
│             │                        │
│    ┌────────▼──────────┐             │
│    │ Confirmation Gate │             │  ← approve / deny here
│    └────────┬──────────┘             │
│             │                        │
│    ┌────────▼──────────┐             │
│    │   Tool Executor   │             │  ← only reachable through the gate
│    └───────────────────┘             │
│                                      │
│  ┌────────────────────────────────┐  │
│  │  Panels                        │  │
│  │  Custom React/HTML UIs with    │  │
│  │  backend services & SDK        │  │
│  └────────────────────────────────┘  │
└──────────────────────────────────────┘

Features

| | Feature | What it does |
|---|---------|-------------|
| 🔒 | Confirmation Gate | Every tool call needs your approval. Configurable auto-approve rules. Cowboy mode for speed (but exec always asks). Full audit log. |
| 🧩 | Panels | Build custom React/HTML UIs inside Gemma. Backend services with RPC, scoped storage, custom tools. Gemma can scaffold them for you. |
| 📱 | Multi-Channel | Same assistant on Telegram (inline keyboard approvals), Web UI (diff previews), and CLI. Shared sessions. |
| 🤖 | Pluggable LLMs | Gemini (default), OpenAI-compatible (Ollama, LM Studio, vLLM), or Google ADC. |
| 🔌 | MCP Support | Any Model Context Protocol server. Hot-reload at runtime. |
| 📝 | Skills | Teach new behaviors with markdown files. No code required. |
| ⏰ | Cron & Heartbeat | Schedule recurring tasks with a safe-tool whitelist. Periodic check-ins. |
| 🧠 | Memory | Persistent workspace, semantic search via embeddings, session transcripts. |
| 🛠️ | 16 Built-in Tools | File I/O, shell, web search/fetch, cron, panels, config, and more — all gated. |
| 🎨 | First-Run Bootstrap | Gemma interviews you on first launch — learns your name, preferences, and work style. |

The Confirmation Gate

The gate is not a feature flag or a wrapper. It's a structural guarantee: the tool executor module is imported by exactly one file (gate.ts) and never re-exported. There is no other code path to execute a tool.
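The pattern can be pictured with a small sketch. This is illustrative only: apart from the gate.ts filename, the module layout, types, and function names below are assumptions, not Gemma's actual code.

```typescript
// Illustrative sketch of the single-importer pattern described above.

type ToolCall = { tool: string; args: string[] };

// Stand-in for the tool executor module: the only place a tool actually
// runs. In the pattern described, it is imported by gate.ts alone and
// never re-exported, so no other module can reach it.
function executeTool(call: ToolCall): string {
  return `ran ${call.tool}`;
}

// Stand-in for gate.ts: the sole importer of executeTool. Every caller
// must pass the approval check to get anything executed.
export function runGated(call: ToolCall, approved: boolean): string {
  if (!approved) {
    throw new Error(`denied: ${call.tool}`);
  }
  return executeTool(call);
}
```

Because the executor is file-private and never re-exported, approval via the gate is the only path to execution — a property of the module graph rather than a runtime policy check.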

Auto-approve rules let you skip confirmation for safe operations:

{
  "confirm": {
    "autoApprove": { "internalReads": true },
    "rules": [
      { "action": "allow", "tool": "read" },
      { "action": "allow", "tool": "exec", "args": "git *" },
      { "action": "deny", "tool": "write", "args": "/etc/*" }
    ]
  }
}

Rules use glob patterns with deny > ask > allow priority. Read-only tools (glob, grep, ls) are auto-approved by default.
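The precedence can be modeled in a few lines. This is a minimal sketch, not Gemma's actual matcher — the "ask" rule action, the matching details, and the tiny glob implementation are assumptions; only the rule shape follows the gemma.json example above.

```typescript
// Minimal model of the deny > ask > allow precedence described above.
type Action = "allow" | "deny" | "ask";
type Rule = { action: Action; tool: string; args?: string };

// Tiny glob: '*' matches any run of characters (enough for "git *", "/etc/*").
function globMatch(pattern: string, value: string): boolean {
  const escaped = pattern
    .split("*")
    .map((part) => part.replace(/[.*+?^${}()|[\]\\]/g, "\\$&"))
    .join(".*");
  return new RegExp(`^${escaped}$`).test(value);
}

function decide(rules: Rule[], tool: string, args: string): Action {
  const hits = rules.filter(
    (r) => r.tool === tool && (r.args === undefined || globMatch(r.args, args))
  );
  if (hits.some((r) => r.action === "deny")) return "deny"; // deny always wins
  if (hits.some((r) => r.action === "ask")) return "ask";   // ask beats allow
  if (hits.some((r) => r.action === "allow")) return "allow";
  return "ask"; // unmatched calls fall back to asking the user
}
```

With the example rules above, `git status` through exec resolves to allow, a write to /etc/hosts resolves to deny, and anything unmatched falls through to ask.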

Cowboy mode auto-approves everything except shell commands — exec always requires confirmation, even in cowboy mode. Every decision is logged to ~/.gemma/logs/audit.jsonl.

Panels

Panels are custom UIs that run inside Gemma's web interface as sandboxed iframes. Build dashboards, data views, or control surfaces — in plain HTML or React + TypeScript + shadcn/ui.

Ask Gemma to build one:

"Create a panel that shows my project's git log with a graph"

Or build your own with the Panel SDK:

const panel = new GemmaPanel();

// Scoped storage — read/write data private to your panel
const data = await panel.readData('settings.json');
await panel.writeData('cache.json', results);

// Send messages through the chat pipeline
panel.sendChat('Summarize the latest git commits');

// Call your panel's backend service
const stats = await panel.callBackend('getStats', { range: '7d' });

// Listen for real-time events from your backend
panel.onEvent('update', (data) => render(data));

Panel backends can register their own tools that Gemma can call — still gated through the confirmation system. Panels get automatic theme sync, hot-reload in dev mode, and workspace memory access.

Channels

Web UI

React + shadcn/ui with streaming, diff previews for approvals, and panel support.

gemma start
# → localhost:18789

Telegram

Approve or deny tool calls with inline keyboard buttons, right from your phone.

"telegram": {
  "enabled": true,
  "token": "bot-token",
  "allowFrom": [your-id]
}

CLI

Color-coded output, text-based confirmations.

gemma start --cli

Use Any LLM

// Gemini (default — free via OAuth, or use API key)
{ "model": { "provider": "gemini", "model": "gemini-2.5-flash" } }

// Ollama (local)
{ "model": { "provider": "openai-compat",
    "openaiCompat": { "baseUrl": "http://localhost:11434/v1", "model": "llama3.2" } } }

// Any OpenAI-compatible server (LM Studio, vLLM, etc.)
{ "model": { "provider": "openai-compat",
    "openaiCompat": { "baseUrl": "http://your-server/v1", "model": "your-model" } } }

Configuration

gemma.json with JSON5 comments, ${ENV_VAR} expansion, and Zod validation:

{
  "model": { "provider": "gemini", "auth": "api-key" },
  "channels": {
    "web": { "enabled": true },
    "telegram": { "enabled": true, "token": "${TELEGRAM_BOT_TOKEN}" }
  },
  "confirm": { "autoApprove": { "internalReads": true } },
  "mcp": {
    "servers": {
      "filesystem": { "command": "npx", "args": ["-y", "@modelcontextprotocol/server-filesystem", "~"] }
    }
  }
}

See gemma.example.json for all options.

Deployment

# Install and run as a background service
npm install -g boten-gemma
gemma setup
gemma daemon install   # launchd (macOS) or systemd (Linux)
gemma daemon start

All state lives in ~/.gemma/. Docker is also available in docker/.

Development

git clone https://github.com/challehallberg/gemma.git
cd gemma && npm install
npm run dev        # Gateway with auto-reload
npm run dev:ui     # Vite dev server on :5173 with HMR
npm run build      # Full production build

Contributing

Contributions welcome! See the open issues for things to work on.

License

MIT