
@fireplume/cli

v2.2.1

Published

Fireplume CLI — local-network coding agent powered by any LLM

Readme

OpenClaude

OpenClaude is an open-source coding-agent CLI that supports multiple model providers.

Use OpenAI-compatible APIs, Gemini, GitHub Models, Codex, Ollama, Atomic Chat, and other supported backends while keeping the same terminal-first workflow: prompts, tools, agents, MCP, slash commands, and streaming output.

Why OpenClaude

  • Use one CLI across cloud and local model providers
  • Save provider profiles inside the app with /provider
  • Run locally with Ollama or Atomic Chat
  • Keep core coding-agent workflows: bash, file tools, grep, glob, agents, tasks, MCP, and web tools

Quick Start

Install

npm install -g @gitlawb/openclaude

If the npm install later reports that ripgrep was not found, install ripgrep system-wide and confirm that rg --version works in the same terminal before starting OpenClaude.
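As a quick sanity check (a generic POSIX-shell sketch, not an OpenClaude command), you can confirm ripgrep is on your PATH before launching:

```shell
# check whether ripgrep (rg) is on PATH; print its version or an install hint
if command -v rg >/dev/null 2>&1; then
  rg --version | head -n 1
else
  echo "ripgrep not found: install it with your package manager (e.g. apt, brew, choco)"
fi
```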

Start

openclaude

Inside OpenClaude:

  • run /provider for guided setup of OpenAI-compatible, Gemini, Ollama, or Codex profiles
  • run /onboard-github for GitHub Models setup

Fastest OpenAI setup

macOS / Linux:

export CLAUDE_CODE_USE_OPENAI=1
export OPENAI_API_KEY=sk-your-key-here
export OPENAI_MODEL=gpt-4o

openclaude

Windows PowerShell:

$env:CLAUDE_CODE_USE_OPENAI="1"
$env:OPENAI_API_KEY="sk-your-key-here"
$env:OPENAI_MODEL="gpt-4o"

openclaude
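To confirm the variables are actually visible to the shell that will launch openclaude (a generic sanity check, not an OpenClaude command), print them back before starting:

```shell
# print the three variables the OpenAI-compatible path reads;
# a missing value here means the export did not take effect in this shell
printenv CLAUDE_CODE_USE_OPENAI OPENAI_API_KEY OPENAI_MODEL
```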

Fastest local Ollama setup

macOS / Linux:

export CLAUDE_CODE_USE_OPENAI=1
export OPENAI_BASE_URL=http://localhost:11434/v1
export OPENAI_MODEL=qwen2.5-coder:7b

openclaude

Windows PowerShell:

$env:CLAUDE_CODE_USE_OPENAI="1"
$env:OPENAI_BASE_URL="http://localhost:11434/v1"
$env:OPENAI_MODEL="qwen2.5-coder:7b"

openclaude
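Before starting, you can verify that Ollama's OpenAI-compatible endpoint is reachable on the default port (this curl call is a generic check against Ollama's API, not part of OpenClaude):

```shell
# list available models via Ollama's OpenAI-compatible /v1/models endpoint;
# prints a hint instead if the server is not reachable
curl -s --max-time 3 http://localhost:11434/v1/models || echo "Ollama does not appear to be running on port 11434"
```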

Setup Guides

Beginner-friendly guides:

Advanced and source-build guides:


Supported Providers

| Provider | Setup Path | Notes |
| --- | --- | --- |
| OpenAI-compatible | /provider or env vars | Works with OpenAI, OpenRouter, DeepSeek, Groq, Mistral, LM Studio, and compatible local /v1 servers |
| Gemini | /provider or env vars | Google Gemini support through the runtime provider layer |
| GitHub Models | /onboard-github | Interactive onboarding with saved credentials |
| Codex | /provider | Uses existing Codex credentials when available |
| Ollama | /provider or env vars | Local inference with no API key |
| Atomic Chat | advanced setup | Local Apple Silicon backend |
| Bedrock / Vertex / Foundry | env vars | Additional provider integrations for supported environments |


What Works

  • Tool-driven coding workflows: Bash, file read/write/edit, grep, glob, agents, tasks, MCP, and slash commands
  • Streaming responses: real-time token output and tool progress
  • Tool calling: multi-step tool loops with model calls, tool execution, and follow-up responses
  • Images: URL and base64 image inputs for providers that support vision
  • Provider profiles: guided setup plus saved .openclaude-profile.json support
  • Local and remote model backends: cloud APIs, local servers, and Apple Silicon local inference

Provider Notes

OpenClaude supports multiple providers, but behavior is not identical across all of them.

  • Anthropic-specific features may not exist on other providers
  • Tool quality depends heavily on the selected model
  • Smaller local models can struggle with long multi-step tool flows
  • Some providers impose lower output caps than the CLI defaults, and OpenClaude adapts where possible

For best results, use models with strong tool/function calling support.


Agent Routing

Route different agents to different AI providers within the same session. This is useful for cost optimization (a cheap model for code review, a more powerful model for complex coding) or for playing to individual model strengths.

Configuration

Add to ~/.claude/settings.json:

{
  "agentModels": {
    "deepseek-chat": {
      "base_url": "https://api.deepseek.com/v1",
      "api_key": "sk-your-key"
    },
    "gpt-4o": {
      "base_url": "https://api.openai.com/v1",
      "api_key": "sk-your-key"
    }
  },
  "agentRouting": {
    "Explore": "deepseek-chat",
    "Plan": "gpt-4o",
    "general-purpose": "gpt-4o",
    "frontend-dev": "deepseek-chat",
    "default": "gpt-4o"
  }
}

How It Works

  • agentModels: Maps model names to OpenAI-compatible API endpoints
  • agentRouting: Maps agent types or team member names to model names
  • Priority: name > subagent_type > "default" > global provider
  • Matching: Case-insensitive, hyphen/underscore equivalent (general-purpose = general_purpose)
  • Teams: Team members are routed by their name — no extra config needed

When no routing match is found, the global provider (env vars) is used as fallback.
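The case-insensitive, hyphen/underscore-equivalent matching described above can be sketched as follows (this normalize helper is illustrative, not OpenClaude's actual implementation):

```shell
# normalize an agent name the way routing matching treats it:
# lowercase, with underscores folded to hyphens, so
# "General_Purpose" and "general-purpose" resolve to the same key
normalize() { printf '%s' "$1" | tr 'A-Z_' 'a-z-'; }

normalize "General_Purpose"; echo
```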

Note: api_key values in settings.json are stored in plaintext. Keep this file private and do not commit it to version control.


Web Search and Fetch

WebFetch works out of the box.

WebSearch and richer JS-aware fetching work best with a Firecrawl API key:

export FIRECRAWL_API_KEY=your-key-here

With Firecrawl enabled:

  • WebSearch is available across more provider setups
  • WebFetch can handle JavaScript-rendered pages more reliably

Firecrawl is optional. Without it, OpenClaude falls back to the built-in behavior.


Source Build

bun install
bun run build
node dist/cli.mjs

Helpful commands:

  • bun run dev
  • bun run smoke
  • bun run doctor:runtime

VS Code Extension

The repo includes a VS Code extension in vscode-extension/openclaude-vscode for OpenClaude launch integration and theme support.


Security

If you believe you found a security issue, see SECURITY.md.


Contributing

Contributions are welcome.

For larger changes, open an issue first so the scope is clear before implementation. Helpful validation commands include:

  • bun run build
  • bun run smoke
  • focused bun test ... runs for touched areas

Disclaimer

OpenClaude is an independent community project and is not affiliated with, endorsed by, or sponsored by Anthropic.

"Claude" and "Claude Code" are trademarks of Anthropic.


License

MIT