
@andypai/orb

v0.1.1

Voice-driven code explorer for your terminal

Why Orb

Orb is a Bun-native terminal app for exploring real codebases with Anthropic or OpenAI models. It keeps the interface focused, shows tool activity as it happens, remembers project conversations, and can read answers aloud through tts-gateway or macOS say.

Features

  • Natural language queries - Ask questions about your code in plain English
  • Live tool activity - See file reads, shell commands, and exploration steps as they happen
  • Voice input friendly - Paste transcriptions from MacWhisper for hands-free interaction
  • Streaming TTS (serve mode) - Hear answers while they are still being generated
  • Provider selection - Choose Anthropic (Claude) or OpenAI via CLI flags
  • Model switching (Claude) - Cycle Anthropic models during a conversation with Shift+Tab
  • Session persistence - Automatically resume the last session per project
  • Focused terminal UI - Ink-based interface with conversation history, tool activity, and the Orb intro

Installation

Global install

# With Bun
bun install -g @andypai/orb

# With npm (Bun is still required at runtime)
npm install -g @andypai/orb

Local / one-off use

# Run without installing globally
bunx @andypai/orb

# Add to a Bun project
bun add @andypai/orb

# npm also works, but Bun is still required at runtime
npm install @andypai/orb

60-Second Quick Start

1. Set up an LLM provider

  • Anthropic: sign in with Claude Code / Max, or set ANTHROPIC_API_KEY
  • OpenAI: set OPENAI_API_KEY

If you do not pass --provider or --model, Orb auto-selects a provider in this order:

  1. Claude Agent SDK (Claude Code / Max or API key)
  2. OPENAI_API_KEY
  3. ANTHROPIC_API_KEY
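The fallback order above can be sketched as a simple shell check. This is a simplified illustration, not Orb's actual code: CLAUDE_CODE_SESSION is a hypothetical stand-in for the Claude Agent SDK's internal session detection, while the two API-key variable names are the real ones from this README.

```shell
# Simplified sketch of the documented provider fallback order.
# CLAUDE_CODE_SESSION is a hypothetical stand-in; the API-key
# variable names are the documented ones.
pick_provider() {
  if [ -n "${CLAUDE_CODE_SESSION:-}" ]; then
    echo "claude-agent-sdk"   # 1. Claude Code / Max session
  elif [ -n "${OPENAI_API_KEY:-}" ]; then
    echo "openai"             # 2. OPENAI_API_KEY
  elif [ -n "${ANTHROPIC_API_KEY:-}" ]; then
    echo "anthropic"          # 3. ANTHROPIC_API_KEY
  else
    echo "none"
  fi
}
```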

2. Pick your speech path

Fastest path with no speech

orb --no-tts

Fastest path on macOS (batch speech)

orb --tts-mode=generate

Generate mode uses macOS built-ins (say and afplay) and does not require tts-gateway.

Recommended path for streaming speech

uv tool install 'tts-gateway[kokoro]'
~/.local/share/uv/tools/tts-gateway/bin/python -m spacy download en_core_web_sm
tts serve --provider kokoro --port 8000

That spaCy step matters: after a plain uv tool install, Kokoro's first request will crash unless en_core_web_sm is installed into the tts-gateway tool environment.

Orb expects tts-gateway at http://localhost:8000 by default and automatically targets POST /tts.

3. Run Orb

# Explore the current directory
orb

# Guided setup for persistent defaults
orb setup

# Explore a specific project
orb /path/to/project

Usage

# Anthropic with options
orb --model=sonnet --voice=marius
orb --provider=anthropic --model=opus

# OpenAI provider
orb --provider=openai --model=gpt-5.4
orb --model=openai:gpt-5.4

# Fresh conversation
orb --new

# Skip the intro animation
orb --skip-intro

Options

| Option | Description | Default |
| ------ | ----------- | ------- |
| --provider=&lt;provider&gt; | LLM provider: anthropic or claude, openai or gpt (alias: --llm-provider) | auto |
| --model=&lt;model&gt; | Model ID or alias (haiku, sonnet, opus) or provider:model | haiku (anthropic), gpt-5.4 (openai) |
| --voice=&lt;voice&gt; | TTS voice: alba, marius, jean | alba |
| --tts-mode=&lt;mode&gt; | serve for tts-gateway, generate for local macOS say | serve |
| --tts-server-url=&lt;url&gt; | Serve-mode gateway URL | http://localhost:8000 |
| --tts-speed=&lt;rate&gt; | TTS speed multiplier | 1.5 |
| --new | Start fresh (ignore saved session) | - |
| --skip-intro | Skip the welcome animation | - |
| --no-tts | Disable text-to-speech | - |
| --no-streaming-tts | Disable streaming (batch mode) | - |
| --help | Show help message | - |

Controls

  • Type your question and press Enter to submit
  • Paste MacWhisper transcription with Cmd+V
  • Press Esc or Ctrl+S to stop speech
  • Press Shift+Tab to cycle Claude models (Anthropic only)
  • Press Ctrl+O to toggle live tool-call details
  • Press Ctrl+C to exit

TTS Setup

Orb supports two TTS paths:

  • Serve mode (default): send speech requests to a local tts-gateway server
  • Generate mode: use macOS built-in say for local fallback speech

Serve mode

Serve mode gives Orb the best experience for low-latency streaming speech.

Install and start tts-gateway

uv tool install 'tts-gateway[kokoro]'
~/.local/share/uv/tools/tts-gateway/bin/python -m spacy download en_core_web_sm
tts serve --provider kokoro --port 8000

Verify the server

curl http://localhost:8000/health
curl -X POST http://localhost:8000/tts -F 'text=hello from orb' -o /tmp/orb-check.wav

Then run Orb with defaults:

orb

If you use a different host or port:

orb --tts-server-url=http://localhost:9000

You can also save that value permanently with orb setup or tts.server_url in ~/.orb/config.toml.
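In config form, that same override is a single key under the [tts] table in ~/.orb/config.toml (the key name matches the tts.server_url setting mentioned above):

```toml
[tts]
server_url = "http://localhost:9000"
```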

Voice notes

Orb exposes three portable voice presets: alba, marius, and jean.

Some tts-gateway providers use different internal voice names. Orb already retries once without an explicit voice if the gateway rejects a preset, so a working server default will still speak.

Generate mode

On macOS, generate mode works out of the box with the built-in say command:

orb --tts-mode=generate

If you want advanced voices, non-macOS support, or streaming playback while the model is still responding, use serve mode with tts-gateway instead.

Disable TTS

orb --no-tts

Provider Setup

Orb supports two LLM providers: Anthropic (Claude) and OpenAI.

Anthropic (default)

Anthropic uses the Claude Agent SDK. Orb can reuse a local Claude Code / Max-authenticated session when available, or fall back to ANTHROPIC_API_KEY / CLAUDE_API_KEY.

Quick start

# Uses Anthropic by default when available
orb

# Explicitly specify Anthropic
orb --provider=anthropic

# Use model aliases
orb --model=haiku
orb --model=sonnet
orb --model=opus

# Or use a full model ID
orb --model=claude-haiku-4-5-20251001

Available models

  • claude-haiku-4-5-20251001 (default, alias: haiku)
  • claude-sonnet-4-6 (alias: sonnet)
  • claude-opus-4-6 (alias: opus)

If you are not already signed in through Claude Code / Max, set ANTHROPIC_API_KEY or CLAUDE_API_KEY before starting Orb.

For setup details, see the Claude Agent SDK quickstart and the Claude models overview.

OpenAI

OpenAI support uses the official OpenAI Responses API and requires OPENAI_API_KEY.

Quick start

export OPENAI_API_KEY=sk-...

orb --provider=openai
orb --provider=openai --model=gpt-5.4
orb --model=openai:gpt-5.4

Common models

  • gpt-5.4 (default for OpenAI)
  • gpt-5
  • gpt-4o
  • gpt-5.4-mini

Note: OpenAI runs in a sandboxed environment via bash-tool. File edits happen in a sandbox overlay and are not applied directly to your working tree. Orb will describe changes it made so you can apply them yourself.

Global Config

Persistent defaults live in ~/.orb/config.toml. CLI flags override config values for one-off runs.
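The precedence is flag over config value over built-in default. As a rough sketch (effective_model is a hypothetical helper, not part of Orb; haiku is the documented Anthropic default):

```shell
# Hypothetical helper illustrating the documented precedence:
# CLI flag > config.toml value > built-in default (haiku for Anthropic).
effective_model() {
  local flag_value="$1" config_value="$2" builtin_default="haiku"
  if [ -n "$flag_value" ]; then
    echo "$flag_value"
  elif [ -n "$config_value" ]; then
    echo "$config_value"
  else
    echo "$builtin_default"
  fi
}
```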

The easiest way to create the file is:

orb setup

A typical config looks like:

provider = "anthropic"
model = "claude-haiku-4-5-20251001"
skip_intro = false

[tts]
enabled = true
streaming = true
mode = "serve"
server_url = "http://localhost:8000"
voice = "alba"
speed = 1.5
buffer_sentences = 1
clause_boundaries = false
min_chunk_length = 15
max_wait_ms = 150
grace_window_ms = 50

Config-only advanced tuning keys live under [tts]:

  • buffer_sentences
  • clause_boundaries
  • min_chunk_length
  • max_wait_ms
  • grace_window_ms

Sessions are stored under ~/.orb/sessions/ (one per project).
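Orb's on-disk naming scheme is internal, but the idea of one session file per project can be illustrated with a hypothetical path-to-filename mapping (session_file_for is not part of Orb):

```shell
# Hypothetical illustration of one session file per project directory.
# Orb's real naming scheme is internal; this only shows the idea of
# turning a project path into a stable filename under ~/.orb/sessions/.
session_file_for() {
  local project_path="$1"
  # /path/to/project -> path-to-project
  local slug
  slug=$(printf '%s' "$project_path" | tr '/' '-' | sed 's/^-//')
  printf '%s/.orb/sessions/%s.json\n' "$HOME" "$slug"
}
```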

Customizing Prompts

Orb’s built-in instructions live in the root-level prompts/ directory:

  • prompts/base.md for shared behavior
  • prompts/anthropic.md for Anthropic-specific system instructions
  • prompts/openai.md for OpenAI-specific tool and sandbox instructions
  • prompts/voice.md for voice-mode guidance added when TTS is enabled

Prompt files are read fresh for each run, so edits apply to the next question without rebuilding the app.

Requirements

  • Runtime: Bun >= 1.1
  • LLM provider: Anthropic or OpenAI authentication
  • TTS (optional): tts-gateway for serve mode, or macOS say and afplay for generate mode

Development

git clone https://github.com/andypai/orb.git
cd orb
bun install

# Run in development
bun run dev

# Run with OpenAI
bun run dev --provider=openai --model=gpt-5.4

# Checks
bun run check
bun run test

License

MIT