
open-brevya

v0.1.10

Published

Open Brevya — coding-agent CLI (fork of OpenBrevya) for OpenAI, Gemini, Ollama, Codex, and 200+ models

Readme

Open Brevya

Open Brevya is an open-source coding-agent CLI for cloud and local model providers. It is a fork of OpenBrevya with the terminal command openbrevya (instead of OpenBrevya).

Use OpenAI-compatible APIs, Gemini, GitHub Models, Codex, Ollama, Atomic Chat, and other supported backends while keeping one terminal-first workflow: prompts, tools, agents, MCP, slash commands, and streaming output.

Repository: github.com/pablorodriguesnunes/open-brevya

Ollama in three steps (Brevya does the rest)

For the default local Ollama workflow you do not need API keys, export lines, or provider wizard setup:

  1. Install Ollama and leave it running.
  2. Pull the bundled default model: ollama pull qwen3.5:latest (use another tag if you prefer — see pinned defaults or BREVYA_SKIP_PINNED_DEFAULTS=1 for full manual control).
  3. Install and run Open Brevya:
    npm install -g open-brevya
    openbrevya

The CLI wires itself to your local Ollama (http://localhost:11434/v1), picks the default model, and starts in dark mode. Everything else is optional — cloud providers, saved profiles, and advanced setup are below if you need them.
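Before launching, you can confirm the daemon is actually listening on the endpoint the CLI targets. This is a minimal sketch (assuming the default port 11434 and that curl is available), not part of Open Brevya itself:

```shell
# Probe the default Ollama endpoint the CLI wires itself to.
# Prints "ollama: up" if the daemon answers, "ollama: down" otherwise.
if curl -sf http://localhost:11434/api/tags >/dev/null 2>&1; then
  echo "ollama: up"
else
  echo "ollama: down"
fi
```

If it prints "ollama: down", start Ollama and retry before running openbrevya.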


Ollama in three steps | Quick Start | Release 0.1.10 | Setup Guides | Providers | Source Build | VS Code Extension | Community

Release 0.1.10

Patch release with the same user-facing and documentation scope as 0.1.9; 0.1.10 ships only minor corrections and polish.

  • Select + Enter: fixed provider wizard and other <Select> lists where Enter could be swallowed without confirming the highlighted option (focus / keybinding interaction).
  • Pinned Ollama defaults (no shell exports required): at startup the CLI applies CLAUDE_CODE_USE_OPENAI=1, OPENAI_BASE_URL=http://localhost:11434/v1, and OPENAI_MODEL=qwen3.5:latest from src/utils/brevyaPinnedDefaults.ts. Set BREVYA_SKIP_PINNED_DEFAULTS=1 to use upstream-style env-only configuration. Optional mirror for other tools: env/openbrevya-ollama.sh.
  • Dark theme default: the TUI starts in dark mode every session (see ThemeProvider).
  • Ollama-only quickstart (Portuguese): a step-by-step guide — install Ollama, pull qwen3.5:latest, set optional env vars, and run openbrevya — see README-Ollama.md.

Why Open Brevya

  • Same feature set as upstream OpenBrevya, with a distinct CLI name: openbrevya
  • Use one CLI across cloud APIs and local model backends
  • Save provider profiles inside the app with /provider
  • Run with OpenAI-compatible services, Gemini, GitHub Models, Codex, Ollama, Atomic Chat, and other supported providers
  • Keep coding-agent workflows in one place: bash, file tools, grep, glob, agents, tasks, MCP, and web tools
  • Use the bundled VS Code extension for launch integration and theme support (default launch command: openbrevya)

Quick Start

Install from npm (recommended)

npm install -g open-brevya

Then start the CLI:

openbrevya

The published tarball includes a pre-built dist/ (via the prepack script). You only need Node.js 20+ on your machine.
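If you are unsure which Node.js you have, a quick check like the following works (a sketch, assuming node is on your PATH):

```shell
# Extract the major version from `node --version` (e.g. v20.11.1 -> 20).
major=$(node --version 2>/dev/null | sed 's/^v//' | cut -d. -f1)
if [ "${major:-0}" -ge 20 ]; then
  echo "node ok"
else
  echo "node 20+ required"
fi
```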

Maintainers: publish with npm login then npm publish from the repo root. prepack runs npm install (fills node_modules if needed) and npm run build (uses npx bun to bundle — you only need Node.js and npm, not a global Bun install). The resulting tarball includes dist/ for end users.

Upstream OpenBrevya (optional)

If you prefer the original project and the openbrevya command name:

npm install -g @pablorodriguesnunes/open-brevya
openbrevya

Install from source (development)

git clone https://github.com/pablorodriguesnunes/open-brevya.git
cd open-brevya
npm install    # or: bun install
bun run build
npx openbrevya

dist/ is not committed to git; building requires Bun because bun run build drives the bundler.

If the CLI reports ripgrep not found, install ripgrep system-wide and confirm rg --version works in the same terminal before starting Open Brevya.

Inside Open Brevya:

  • run /provider for guided provider setup and saved profiles
  • run /onboard-github for GitHub Models onboarding

Fastest OpenAI setup

macOS / Linux:

export CLAUDE_CODE_USE_OPENAI=1
export OPENAI_API_KEY=sk-your-key-here
export OPENAI_MODEL=gpt-4o

openbrevya

Windows PowerShell:

$env:CLAUDE_CODE_USE_OPENAI="1"
$env:OPENAI_API_KEY="sk-your-key-here"
$env:OPENAI_MODEL="gpt-4o"

openbrevya

Fastest local Ollama setup

You usually don’t need this section — see Ollama in three steps above. The CLI already applies the same settings as src/utils/brevyaPinnedDefaults.ts.

Optional: mirror those variables in your shell for other tools, or when using BREVYA_SKIP_PINNED_DEFAULTS=1 — use env/openbrevya-ollama.sh or set CLAUDE_CODE_USE_OPENAI, OPENAI_BASE_URL, and OPENAI_MODEL manually (same values as the script).
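For reference, a manual mirror of the pinned defaults might look like the following (values taken from the release notes above; env/openbrevya-ollama.sh is the maintained version and may differ):

```shell
# Same values the CLI pins at startup (src/utils/brevyaPinnedDefaults.ts).
export CLAUDE_CODE_USE_OPENAI=1
export OPENAI_BASE_URL=http://localhost:11434/v1
export OPENAI_MODEL=qwen3.5:latest

# Opt out of the pinned defaults and manage env vars yourself:
# export BREVYA_SKIP_PINNED_DEFAULTS=1
```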

Setup Guides

Ollama-only path (Portuguese): README-Ollama.md — the same flow in Portuguese: install Ollama, pull the model, run openbrevya; Brevya handles the rest (exports are optional).

Beginner-friendly guides:

Advanced and source-build guides:

Supported Providers

| Provider | Setup Path | Notes |
| --- | --- | --- |
| OpenAI-compatible | /provider or env vars | Works with OpenAI, OpenRouter, DeepSeek, Groq, Mistral, LM Studio, and other compatible /v1 servers |
| Gemini | /provider or env vars | Supports API key, access token, or local ADC workflow on current main |
| GitHub Models | /onboard-github | Interactive onboarding with saved credentials |
| Codex | /provider | Uses existing Codex credentials when available |
| Ollama | /provider or env vars | Local inference with no API key |
| Atomic Chat | advanced setup | Local Apple Silicon backend |
| Bedrock / Vertex / Foundry | env vars | Additional provider integrations for supported environments |

What Works

  • Tool-driven coding workflows: Bash, file read/write/edit, grep, glob, agents, tasks, MCP, and slash commands
  • Streaming responses: Real-time token output and tool progress
  • Tool calling: Multi-step tool loops with model calls, tool execution, and follow-up responses
  • Images: URL and base64 image inputs for providers that support vision
  • Provider profiles: Guided setup plus saved .OpenBrevya-profile.json support (filename unchanged for compatibility)
  • Local and remote model backends: Cloud APIs, local servers, and Apple Silicon local inference

Provider Notes

Open Brevya supports multiple providers, but behavior is not identical across all of them.

  • Anthropic-specific features may not exist on other providers
  • Tool quality depends heavily on the selected model
  • Smaller local models can struggle with long multi-step tool flows
  • Some providers impose lower output caps than the CLI defaults, and Open Brevya adapts where possible

For best results, use models with strong tool/function calling support.

Agent Routing

Open Brevya can route different agents to different models through settings-based routing. This is useful for cost optimization or splitting work by model strength.

Add to ~/.claude/settings.json:

{
  "agentModels": {
    "deepseek-chat": {
      "base_url": "https://api.deepseek.com/v1",
      "api_key": "sk-your-key"
    },
    "gpt-4o": {
      "base_url": "https://api.openai.com/v1",
      "api_key": "sk-your-key"
    }
  },
  "agentRouting": {
    "Explore": "deepseek-chat",
    "Plan": "gpt-4o",
    "general-purpose": "gpt-4o",
    "frontend-dev": "deepseek-chat",
    "default": "gpt-4o"
  }
}

When no routing match is found, the global provider remains the fallback.

Note: api_key values in settings.json are stored in plaintext. Keep this file private and do not commit it to version control.
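One way to reduce exposure is to tighten file permissions (a common Unix convention, not something Open Brevya enforces):

```shell
# Make settings.json readable and writable only by your user.
# Guarded so this is a no-op if the file does not exist yet.
if [ -f "$HOME/.claude/settings.json" ]; then
  chmod 600 "$HOME/.claude/settings.json"
fi
```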

Web Search and Fetch

By default, WebSearch works on non-Anthropic models using DuckDuckGo. This gives GPT-4o, DeepSeek, Gemini, Ollama, and other OpenAI-compatible providers a free web search path out of the box.

Note: DuckDuckGo fallback works by scraping search results and may be rate-limited, blocked, or subject to DuckDuckGo's Terms of Service. If you want a more reliable supported option, configure Firecrawl.

For Anthropic-native backends and Codex responses, Open Brevya keeps the native provider web search behavior.

WebFetch works, but its basic HTTP plus HTML-to-markdown path can still fail on JavaScript-rendered sites or sites that block plain HTTP requests.

Set a Firecrawl API key if you want Firecrawl-powered search/fetch behavior:

export FIRECRAWL_API_KEY=your-key-here

With Firecrawl enabled:

  • WebSearch can use Firecrawl's search API while DuckDuckGo remains the default free path for non-Claude models
  • WebFetch uses Firecrawl's scrape endpoint instead of raw HTTP, handling JS-rendered pages correctly

Free tier at firecrawl.dev includes 500 credits. The key is optional.

Source Build And Local Development

bun install
bun run build
openbrevya

Alternatively, run the bundle directly without linking the global command:

node dist/cli.mjs

Helpful commands:

  • bun run dev
  • bun run smoke
  • bun run doctor:runtime
  • bun run verify:privacy
  • focused bun test ... runs for the areas you touch

Repository Structure

  • src/ - core CLI/runtime
  • env/ - optional shell snippets (e.g. pinned Ollama + OpenAI-compat env vars)
  • scripts/ - build, verification, and maintenance scripts
  • docs/ - setup, contributor, and project documentation
  • python/ - standalone Python helpers and their tests
  • vscode-extension/OpenBrevya-vscode/ - VS Code extension
  • .github/ - repo automation, templates, and CI configuration
  • bin/ - CLI launcher entrypoints (openbrevya)

VS Code Extension

The repo includes a VS Code extension in vscode-extension/OpenBrevya-vscode for launch integration, provider-aware control-center UI, and theme support. In this fork, the default integrated-terminal command is openbrevya.

Security

If you believe you found a security issue, see SECURITY.md.

Community

Contributing

Contributions are welcome.

For larger changes, open an issue first so the scope is clear before implementation. Helpful validation commands include:

  • bun run build
  • bun run smoke
  • focused bun test ... runs for touched areas

Disclaimer

Open Brevya is a community fork and builds on OpenBrevya. It is not affiliated with, endorsed by, or sponsored by Anthropic.

The project originated from the Claude Code codebase and has been substantially modified upstream to support multiple providers and open use. "Claude" and "Claude Code" are trademarks of Anthropic PBC. See LICENSE for details.

License

See LICENSE.